Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgxI4atIp…: "I have real problems when people say we need to keep innovating for the sake of …"
- ytc_Ugx2hL0Jn…: "AI Agents must provide correct , complete answers and tasks. Wrong answers point…"
- ytr_UgyTzwcjR…: "🤣🤣 Oh really? So you think innovation is just gonna stop because of AI? A decade…"
- ytc_UgxjpPgtn…: "Something that can be used for good within medicine, teaching, etc. can’t be tru…"
- ytc_UgyEv3xqf…: "I'll elaborate in a reply to the comment if anyone cares to read it from the per…"
- rdc_oi3big9: "I'm not the black or white man. I use AI for art, This Z logo is made by Chatgpt…"
- ytr_UgzayP7TE…: "Go anyway, the demand for artists will still be high, especially actual artists …"
- ytc_Ugxnk77FS…: "We should draft AI to the war in Iran. Maybe they will get destroyed first…"
Comment
> Literally the first time I've seen someone not passing on blame to...well, a bunch of electricity. Like, LLMs do absolutely have a measurable effect on the social fabric and talking to an AI is both quantitatively and qualitatively not the same as talking to like Siri or a typewriter, but when it comes to things specifically instances like this? Don't anthropomorphize the LLM by blaming it for making decisions for other people. Emotional issues or mental health is an entirely different ball game. Situations like this? He was gonna go though with it regardless.

Platform: youtube · Incident: AI Harm Incident · Posted: 2025-12-29T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
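The "Look up by comment ID" feature above amounts to finding one coded row inside a parsed batch. A minimal sketch, assuming the batch is a JSON array of objects keyed by `id` as in the raw response shown on this page (the function name and sample data are illustrative):

```python
import json

def lookup_coding(raw_response: str, comment_id: str):
    """Return the coded dimensions for one comment, or None if absent.

    Assumes raw_response is a JSON array of rows shaped like
    {"id": ..., "responsibility": ..., "reasoning": ..., "policy": ..., "emotion": ...}.
    """
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

# Illustrative batch (not real data from the dashboard):
batch = '[{"id":"ytc_abc","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}]'
row = lookup_coding(batch, "ytc_abc")
print(row["emotion"])  # approval
```

A linear scan is fine at batch sizes of ten; a pipeline indexing many batches would build a dict keyed by `id` instead.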
Raw LLM Response
```json
[
  {"id":"ytc_UgySlL8roSxtlzPIYi14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwk6wkmzZEX9mPZU7B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyuLDuv2-Hnd3VNZRh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwGXZhWZheukNyZAE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyXKPKWS4mvMM7Jxt54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxTKXSrNkMYiM9-qUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbMVil7EYaxL7HoS54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzCgv_dnrk1akStjbR4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyjllpAF_SKvephINN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgymXVxRYrNOEa1mciR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
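Before accepting a raw batch like the one above, it is worth checking that every row carries a label from the coding scheme. A minimal validation sketch; the allowed values below are only those observed on this page, not necessarily the project's full codebook:

```python
import json

# Hypothetical codebook inferred from the values visible in this dashboard;
# the project's actual label sets may be larger.
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation",
                "indifference", "mixed", "unclear"},
}

def validate_batch(raw_response: str):
    """Parse a raw LLM response and keep only rows with an id and in-codebook labels."""
    valid = []
    for row in json.loads(raw_response):
        if "id" not in row:
            continue  # unidentifiable row: cannot be joined back to a comment
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

# Illustrative input: one valid row, one with an out-of-codebook label.
raw = ('[{"id":"ytc_a","responsibility":"user","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"},'
       '{"id":"ytc_b","responsibility":"robot","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation could instead be queued for re-coding rather than dropped; the filter above only shows the check itself.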