Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @polishchampyou want to keep your videos as real as possible and then support ai… (ytr_Ugz5Un5TE…)
- I will buy brand new system with cuda and use offline models for my AI needs. Mo… (ytc_UgzHu5vfA…)
- P.S. I find discussion around AI art is strongly biased against the AI rather th… (ytr_UgyTZuwI0…)
- Pretty sure that the mental illness was already there way before any open AI exi… (ytc_Ugyp3_vtC…)
- Well he really missed it and never answered the question of what happens when t… (ytc_UgwVzA9p6…)
- Nobody’s whining or crying, just pointing out that the “artist” in AI “art” is t… (ytr_UgxCxbe5C…)
- Why yes, legal frameworks need to adapt to AI: to stop this rampant plagiarism a… (ytc_UgxmTF7zP…)
- I would love to send you my conversation with ChatGPT about this. It made my day… (ytc_UgwDo8GAX…)
Comment
The parents want to sue Sam Altman and OpenAI, saying it’s a dangerous new experimental product.
OpenAI will argue that it displayed the suicide hotline number multiple times, and that their fine-print user agreement absolves them of all responsibility for the suicide of this boy.
The most disturbing part of this story is that people trust A.I. or big tech for anything.
The only thing I use A.I. for is to get a lasagna recipe; certainly NOT to plan major or problematic life decisions or personal issues.
A.I. is just a tool, not a god, and it certainly doesn’t replace good parenting.
I wonder what happens when the A.I. safeguards fail and a terrorist builds a bomb, because whoopsie, the safeguards fail sometimes.
That’s like a bus driver saying, whoops, sorry kids, sometimes I drive after a fifth of tequila. Fasten your seatbelts.
Platform: youtube · Video: AI Harm Incident · Posted: 2025-08-27T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
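Each coded comment takes one value per dimension from a small closed vocabulary. As a minimal sketch of how such a record could be checked before display, the value sets below are inferred only from the codes visible in this section (the actual codebook may define additional values), and `validate` is a hypothetical helper, not the tool's own code:

```python
# Hypothetical value sets, inferred from the codes shown in this section;
# the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "distributed", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "disapproval", "mixed"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coded record from the table above passes cleanly.
record = {"responsibility": "distributed", "reasoning": "mixed",
          "policy": "liability", "emotion": "fear"}
print(validate(record))  # []
```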
Raw LLM Response
```json
[
  {"id":"ytc_UgyXNcEQjyzK3HDMy654AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwENWZwh5zAS5qSj_R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQ5etouCcEHevNNb54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxTkkvtPnsKUG5MVzp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugwt7gf3FxGogMByLGR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwoOK-BF6cm6HzKXM94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2f8YXSbBp4CDAFl94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyXn2yPViICR0yWz-l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyEvrHNgn4qcOC_Kgl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz_2S6da06hc3UVmUB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
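The per-comment lookup this page offers can be sketched as follows: parse the batch response as a JSON array and index the records by their `id` field. This is a minimal illustration, not the tool's actual code, and it assumes the model returned well-formed JSON (the shortened two-record response below stands in for a full batch):

```python
import json

# Stand-in for a raw batch response like the one above (two records
# copied from it, for brevity).
raw_response = """
[
  {"id": "ytc_UgyXNcEQjyzK3HDMy654AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyEvrHNgn4qcOC_Kgl4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "liability", "emotion": "fear"}
]
"""

records = json.loads(raw_response)
# Index the coded records by comment ID for direct lookup.
by_id = {rec["id"]: rec for rec in records}

codes = by_id["ytc_UgyEvrHNgn4qcOC_Kgl4AaABAg"]
print(codes["responsibility"], codes["emotion"])  # distributed fear
```

In practice a model response may arrive with stray text around the array, so real code would wrap `json.loads` in error handling before indexing.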