Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "There should also be considerations for humans who become close friends or life …" (ytc_UgwNnDK-y…)
- "So I get that there's people who think self-driving cars are safer (which they c…" (ytc_Ugx5ztOKh…)
- "Kid said he doesnt blame businesses for cutting costs and not hiring him to use …" (ytc_UgwG9vYOp…)
- "Its funny but even hesring the way they talk we dont even have to see the visual…" (ytc_Ugw6PPE75…)
- "i'm gonna bet all generative ai aren't trained on Ojibwemowin, so here's my word…" (ytc_UgxtTT-XX…)
- "Human beings across the world must stop this nonsense and stupidity of designing…" (ytc_Ugxt6Tf1O…)
- "Your not an artist and do not claim yourself as an artist if you use AI and prof…" (ytc_UgyH0sZsD…)
- "How are the wealthy invest ,IF GREEN ENERGY WILL BE ABSENT. AI ,will need every …" (ytc_UgwmEFhmw…)
Comment (youtube · AI Harm Incident · 2025-09-10T15:4…)

> i think we fucked up the most when we thought it was a good idea to give these AI's "personalities".
>
> I always felt that different PC's had personalities based on some working better than others, even with the same builds, so for this to be put into AI is dangerous because now we can only predict what it can do with the personality vs what it shouldn't or can't and thats such an experience i have no words for
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx1QC5Iu-IqctHoMBR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2RLbdwnyVAjh7q0t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzBGa1-oJIs0akrkNx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzhuswUmvTw__WrTAJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwP-AwEeKE4snSaE2R4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoE8kBMSwq-hNgatJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwzmMLthmaY9a_xigJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzuQdZ03O1npCFsMyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyRJ33P7KYLNmNAu2h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwKLCaNovFohGmeeYt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
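A batch response like the one above is a JSON array of per-comment records, each keyed by comment `id` with one value per coding dimension. The sketch below shows one way such a response could be parsed and checked before ingestion; the category vocabularies (`ALLOWED`) are inferred only from the values visible in this dump, and the full codebook may define additional values.

```python
import json

# Category vocabularies observed in this dump (assumption: the real
# codebook may allow more values per dimension).
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself", "user", "company"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "resignation", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and reject out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, vocab in ALLOWED.items():
            if rec.get(dim) not in vocab:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# Minimal usage example with one record from the dump above.
raw = ('[{"id":"ytc_UgzBGa1-oJIs0akrkNx4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded[0]["responsibility"])  # developer
```

Validating before ingestion matters here because an LLM coder can drift outside the codebook (e.g. invent a new emotion label), and silently storing such values would corrupt downstream tallies.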