Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "If the point is to make the American state loose the AI race and the dream of bi…" (rdc_m9gwwc9)
- "This world has been dualistic in it's nature, who knows there could be world of …" (ytc_UgwSpyikw…)
- "Robots demanding rights? Unplug them. A robot couldn't actually feel, it's just …" (ytc_Ugw7dI6Vi…)
- "Google has had a quantum computer which has been leaked to show self awareness, …" (ytc_UgzGeqJtN…)
- "I can't wait to see the AI bro video complaining about how "poisoning art to mes…" (ytc_UgzOkmL5k…)
- "Gates failed at justifying human jobs lost around the 3 min mark. You could tell…" (ytc_UgwLZeXDx…)
- "ChatGPT is a program that says things that sound like things said by humans. I d…" (ytc_UgxiAF66E…)
- "After all a machine learning algorithm is just a big mathematical functions so i…" (ytc_UgzRs2CZ8…)
Comment (youtube · AI Harm Incident · 2025-08-15T21:2…)

> Even when you have an autopilot that is safer than humans, that actually makes it WORSE. The human would gain too much trust for the automation, allow themself to become disengaged, and be unprepared to take over once it does fail. No advertising needed.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwLkAbxotBibd7cxp54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRHf6ENyLqEP_dRet4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzy4ocmWdhh9fFKdxR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxBnLjIlH9s9FPB78t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx-LI2OIE09ehGIb3Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyd9Glq6Bt3SkTR8e94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbQJn5bPEhDv9gFzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzIodfh6WzQyQkUJ-h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyuGm2txsj8064zUPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzNKyX1kYaY-Epe-FB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
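The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before loading it into the coding table — the allowed value sets here are inferred only from the examples shown on this page, not from an actual codebook, and the function name is hypothetical:

```python
import json

# Allowed codes per dimension, inferred from the examples above.
# The real codebook (assumed, not shown here) may define more categories.
SCHEMA = {
    "responsibility": {"company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID and a
        # recognized code for every dimension in the schema.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"virtue",'
    '"policy":"none","emotion":"mixed"},'
    '{"id":"ytc_b","responsibility":"nobody","reasoning":"virtue",'
    '"policy":"none","emotion":"mixed"}]'
)
coded = parse_coding_response(raw)
# the second record is dropped: "nobody" is not a recognized responsibility code
```

Validating against a fixed schema guards against the model inventing codes outside the codebook, which would otherwise corrupt downstream tallies.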