Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with their comment IDs):

- "Has to be AI. Stat is wrong. The bits per second for a human is 5 to 8.…" (ytc_Ugy7jKuOH…)
- "All of this sounds like high school shenanigans. The petty lawsuit, using AI to …" (ytc_Ugw0Soahz…)
- "This is playing with biological programming or reprogramming you are doing to yo…" (ytc_UgwzzWe5j…)
- "Most missed the entire point. AI is capable of lying without understanding how a…" (ytc_UgyKjLDdG…)
- "AGI means an AI that is as intelligent and as capable as one human being. ASI (a…" (ytr_UgychApcF…)
- "Don't forget about the psychosis of even more rational let's say student, who be…" (ytc_Ugxdm6_8D…)
- "Its very relatable, Becuase it shows how AI art just programs the illustraition,…" (ytc_UgwUDNA5Z…)
- "As long as there is no consciousness in Ai we can do anything with ai like ANYTH…" (ytc_Ugw8KdDy8…)
Comment

> Who can afford to use AI for their own desires? As long as the rich keep their AI to themselves, then the negative effects of AI may be contained and be of little deterioration to society. Just to check, the rich and their minions are not imposing or coercing the poor and middle class to accept AI are they?

youtube · AI Responsibility · 2026-01-17T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx2MWVJgiLmu3TbsIF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwWMSlcdWpK8H8ndp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy6P3FMsCkkkNQXBuF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz7e5eK_nUj4xVDrBJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy0xXqLTVuv0R-fxiR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeICJabngzF8RCF214AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzc6Hv9a4yY6pzafJd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwgxTuxr5GshzUOltp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyZqbe2lHcEq8QElIZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy0425joqmE211nPSZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
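The raw response is a JSON array with one object per coded comment, each carrying the four dimensions plus the comment ID. A minimal sketch of how a client could parse one batch and support the look-up-by-ID view shown above (the function and variable names here are illustrative, not part of the tool; the record data is taken from the sample response):

```python
import json

# One raw batch response as returned by the model (two records from the sample above).
raw = """
[
 {"id":"ytc_UgwgxTuxr5GshzUOltp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugy6P3FMsCkkkNQXBuF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

# Every record must carry the comment ID plus the four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(text):
    """Parse one raw LLM response and index its records by comment ID.

    Raises ValueError if the payload is not a JSON array of complete records.
    """
    records = json.loads(text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    by_id = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        by_id[rec["id"]] = rec
    return by_id

coded = parse_batch(raw)
# Look up by comment ID, as the inspection panel does.
print(coded["ytc_UgwgxTuxr5GshzUOltp4AaABAg"]["policy"])  # liability
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access, and the key check surfaces truncated or malformed model output before it reaches the coding table.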