Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a record by comment ID, or inspect one of the random samples below.

Random samples:
- "Yuval is right: Human Alignment is necessary for AI Alignment. So far in huma…" (ytc_UgzL9HzXD…)
- "we have to re define social contracts altogether from what's money, to jobs, to …" (ytc_UgxZbjtN2…)
- "Fail safes need to be built into AI to allow us to always control, guardrails et…" (ytc_UgxSTW5_4…)
- "They are wiring our brains ai knows our thoughts when they are impure we get the…" (ytc_UgwZ-KnVD…)
- "I appreciate AICarma's automated insights; they keep my brand relevant in a rapi…" (ytc_UgzFXwmn5…)
- "those same people argue that it makes those who got deepfaked money which i dont…" (ytc_UgwR5zSkQ…)
- "He should NEITHER give his name, NOR cooperate in ANY way whatsoever! We did NO…" (ytc_Ugxwdm0WC…)
- "I practice Buddhist teachings and they are full of love and compassion, I have l…" (ytc_Ugwt8_Glr…)
Comment
I'm learning AI to work, cos it's what many clients ask. I don't like it, it's even a problem when you really think about copyright, but at the same time I got to eat.
For images and videos (my field), I don't even think is sustainable at the moment or in the near future. The truth is that this is not the problem: look at every company Elon Musk own, they don't make real profit, some are virtually failed like solar city, but as long as the Musk brand stands, every company is fine. We need something big to really understand the cost of AI and the real impact, until than it's not going to be a problem if these companies can make deals to build a nuclear power plant just for their AI centers.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-03-31T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
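A coded row like the one above can be checked against the coding schema before it is stored. This is a minimal sketch; the allowed values are inferred only from the codes visible in this batch, so the full codebook may define additional categories.

```python
# Allowed values per dimension, inferred from this batch only —
# the actual codebook may include more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear", "industry_self"},
    "emotion": {"fear", "resignation", "indifference", "approval", "outrage"},
}

def validate(code: dict) -> list:
    """Return the dimension names whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if code.get(dim) not in allowed]

# The coded row shown in the table above:
row = {"responsibility": "company", "reasoning": "mixed",
       "policy": "unclear", "emotion": "resignation"}
print(validate(row))  # -> [] (every dimension is a known value)
```

An empty list means every dimension carries a recognized value; any name returned points at a field the model filled with an out-of-schema label.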
Raw LLM Response
```json
[
  {"id":"ytc_UgxaMYjbBI56yQuB1ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWYST3p5pGC900we54AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyKm48S2vKwYwmdhNd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw3Fkl6QLm7NpoNvqZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyTF3BwGPK8nl5ew0x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwrxiYfzbhvzDxflft4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwNIYAi30h7mj8oQml4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgywY4zKh96j-VhEMKp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwzfjMX2ySPWEp2x0d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy7EvCzfNwUp_NgIo14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
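The raw response is a JSON array with one object per comment ID, so looking up a single coded comment reduces to parsing the array and indexing it by `id`. A minimal sketch, using two records copied from the batch above:

```python
import json

# Raw model output as shown in the panel above, abbreviated to two
# records for the example; the real batch carries ten.
raw_response = """
[
  {"id":"ytc_UgxaMYjbBI56yQuB1ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWYST3p5pGC900we54AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
"""

# Index the batch by comment ID for constant-time lookup.
codes = {record["id"]: record for record in json.loads(raw_response)}

# The record behind the "Coding Result" table above:
row = codes["ytc_UgwWYST3p5pGC900we54AaABAg"]
print(row["emotion"])  # -> resignation
```

Building the dict once per batch keeps repeated lookups cheap; the same index also makes it easy to spot IDs the model dropped or duplicated by comparing `codes.keys()` against the submitted comment IDs.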