Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "So in the end we become to dumb with no education. Ai takes over our planned acc…" (ytc_UgyX6pYaw…)
- "What should concern us is the asymmetry of progress. Human experts lose ground e…" (ytc_UgzKM2ZKh…)
- "Who killed Open AI’s young genius employee? 🙄 Watch Tucker Carlson’s interview w…" (ytc_Ugyj7aiPs…)
- "For the first problem, it probably won't be for the foreseeable future, as A.I. …" (ytc_UgxeSDDTM…)
- "Ai can only be AI if created by AI, If initially created by human coders then th…" (ytc_Ugz110C_Q…)
- "He cited two concerns: That a bad actor would use AI to destroy us or that AI wo…" (ytc_Ugw0TvGXu…)
- "The fact that I’m dogshit at art and my “art” is better than AI is very funny…" (ytc_UgyvfHBVj…)
- "As much as I don't like AI art, the art community came together and did somethin…" (ytc_Ugylq6hx0…)
Comment

> At least this time the AI wasn't directly telling him to do it. Unlike some of the cases that lead to families or individuals being wiped out.

youtube · AI Harm Incident · 2026-04-04T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw94st37z9u5eKoSGd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyPf8Y29nf3fcgFDrl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGTj0TuvB0PEI3pGB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzgjyg4b5klMLmPv194AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzekvc7UN9OPZcNA6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyJFcYQHpdYH20IrDF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJ1WcQ8YQdi-hkgZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUXTu8j6i2NshJli14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzGWlhS1HipDO4DsgF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
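The raw response is a JSON array with one coding object per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming this schema (the two-entry sample and the `lookup` helper below are illustrative, not the tool's actual code):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Field names follow the schema shown above; the two rows are taken
# from the example response.
raw_response = """
[
  {"id": "ytc_Ugzekvc7UN9OPZcNA6J4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index codings by comment ID so each inspection is a dict lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment ID (KeyError if uncoded)."""
    return codings[comment_id]

print(lookup("ytc_Ugzekvc7UN9OPZcNA6J4AaABAg")["policy"])  # liability
```

Indexing once and looking up by ID keeps each inspection O(1), which matters if the sample browser pulls random comments repeatedly.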