Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated excerpts, with comment IDs):

- "All I hear is blablabla........make billions of dollars....blablabla....investor…" (ytc_UgwIdCJUO…)
- "So, basically all those LLM models were trained giving them free access to the a…" (ytc_UgzUz6wVP…)
- "If an AI can develop software better than I can, then superintelligence will be …" (ytc_UgyxslCnA…)
- "The guy right about one thing, robot may be taking over where we human would los…" (ytc_Ugxp4oH-5…)
- "The Google scientist who developed facial recognition software was fired because…" (ytc_Ugw3RodQN…)
- "The difference between art and craft is imagination. Unlike art techniques, imag…" (ytc_UgxJHte41…)
- "Why are you tell people not to do it? What are they suppose to do? Spend thous…" (ytc_UgyLcAqB1…)
- "modern irl "ai" are not automonous. their functions are up in the billionaire co…" (ytc_Ugz4QudIt…)
Comment

> I don’t understand why people fear ai doing this stuff, people would do the same thing. If you ask a random person if they would kill someone in self defense they would probably say yes, I don’t understand people making a big deal out of ai doing it

Source: youtube · Incident: AI Harm Incident · 2025-07-27T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxK-Og8iqwEQtPgeV14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEaRSY94HaSkK6eK54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwu6FZy_O45jWfsaEJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6JpPgFPi-UHvXZ5R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwONVhRhsct5vYM9d94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgymCj6MYfjVS1aJzsR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxENcBYe7dDLB2Dyep4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx9TNZQmWrJhqYZujZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzUISC8cWQ65r5NALB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz70Z2DWK-Q0pvDWbx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
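A raw response like the one above is a JSON array of records, one per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response might be parsed and validated before it reaches the dashboard, assuming the value sets seen in these examples (the real codebook may allow additional values, and the function name here is hypothetical):

```python
import json

# Allowed values per dimension, inferred from the examples above.
# Assumption: the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "resignation", "fear", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset are prefixed "ytc_".
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = (
    '[{"id":"ytc_UgxK-Og8iqwEQtPgeV14AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
records = parse_coding_response(raw)
print(records[0]["emotion"])  # -> indifference
```

Rejecting out-of-vocabulary values at parse time keeps a single hallucinated label from silently entering the coded dataset.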