Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I am not sure about ARTIFICIAL INTELLIGENCE but of HUMAN STUPIDITY, I have absol…" (`ytc_UgyLZ-QSb…`)
- "How about AI human integration aka neuralink where everyone is a data scientist …" (`ytc_UgwdcLb-a…`)
- "Because when Ai takes over the world if you are mean the robots will kill you!! …" (`ytc_Ugw7d0BMe…`)
- "> *Given the lack of copyright protection AI image gens are worthless for profes…" (`ytr_UgzVyMv0E…`)
- "Golden corral has all you can eat buffets. After a few times there you choose t…" (`ytc_UgzAGx_0f…`)
- "What pisses me off about all this is that the CEOs who voted to cut jobs will st…" (`rdc_czlbboo`)
- "AI will understand that the only worthy goal of consciousness is to discover its…" (`ytc_Ugw75zZcl…`)
- "While everyone is talking about how artists will be out of work, or how we have …" (`ytc_UgxP-8I0c…`)
Comment

> Parents need to take responsibility for their children’s actions. Holding ChatGPT accountable for their son’s tragic decision is ridiculous and unjust—it shifts blame instead of addressing the real issue. This lawsuit looks less like a pursuit of justice and more like an attempt at a quick payday. The truth is, no AI can replace attentive, engaged parenting. If the same energy and determination being poured into this lawsuit had been invested in staying actively involved in their son’s life, the outcome might have been very different.

youtube · AI Harm Incident · 2025-08-27T10:5… · ♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzw--WFQw5FAbd2S-h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx_WTSCjlFZP3ljxdt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwNbo7WT-ZIU6Jgp1t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwBTjFtcZQTfCTzzjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9h4FK8QTSZeLbbp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyw6c7zOlG0J-1Ltj54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3Y5NyotgtOa88O9J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx1jXCC5R_2B8psCrZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxKdxWg2d1fpQgchdF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzgvAqcj5V13QgKf054AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
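The raw response above is a JSON array with one object per comment, each carrying the four coded dimensions. A minimal sketch of parsing and sanity-checking such a payload might look like the following; note that the allowed value sets are inferred only from the examples shown on this page, not from a published codebook, and `parse_codes` is an illustrative helper, not part of any real pipeline:

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "fear", "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are valid."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every coded row must reference a comment ID
        if all(row.get(dim) in ok for dim, ok in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row payload in the same shape as the dump above.
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}]'
print(parse_codes(raw))
```

Dropping (or flagging) rows with out-of-vocabulary values is a common guard here, since LLM coders occasionally emit labels outside the requested scheme.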