Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Remember just a few months ago, when the internet was being flooded with AI cont…" (ytc_Ugz-ugO9U…)
- "Ai art ruin the lesser skilled artist but not the talent one, especially when th…" (ytc_UgyQw7nom…)
- "2:24 it is not inspired at all… they feed it to ai, second! Its also the people …" (ytc_UgxUq75vh…)
- "Analysis: Disappearing Jobs & Human Impact in an AI-Dominated Economy - I asked …" (ytc_Ugx-jIocS…)
- "Ai needs rights because if it's sentient well pray to god that it doesn't get ma…" (ytc_UgxQiDR9U…)
- "AI is actually not smarter than human beings; rather, humans have been grossly u…" (ytc_UgzPNvZRg…)
- "Seeing how many miles have been travelled by prototype autonomous vehicles, the …" (ytc_Ugzirg4tn…)
- "Engine is fine no damage whatsoever and it has 65k on it. I’m aware they do junk…" (rdc_o7ww7v8)
Comment
I'm sorry but this is BS. NO one/think can make you to take your own life. That is a decision one makes for themselves. I know. I've been fighting it for decades. The problem is this generation has lost its self thought. For some insane reason, they all just do what others do-on the internet. From eating dangerous things to "pranking" strangers to taking their own lives-it's all about "being another sheep." They want to be like everyone lese so much they'll even end their existence. "AI" didn't kill this boy, he did it to himself. Stop blaming a computer program. Humans make such decisions. Sorry this boy is gone but HE chose to go.
youtube · AI Harm Incident · 2025-11-13T06:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
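A coding result like the one in the table above can be held in a small validated record. A minimal Python sketch, assuming the per-dimension value sets are exactly those observed in this page's raw responses (the full codebook may define additional codes):

```python
from dataclasses import dataclass

# Allowed values per dimension, as observed in this page's sample.
# Assumption: the real codebook may include codes not seen here.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed"},
    "reasoning": {"deontological", "virtue", "consequentialist", "contractualist"},
    "policy": {"liability", "none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed"},
}

@dataclass(frozen=True)
class Coding:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self):
        # Reject any code outside the observed value sets.
        for dim, allowed in ALLOWED.items():
            value = getattr(self, dim)
            if value not in allowed:
                raise ValueError(f"{dim}={value!r} is not a known code")

# The coding shown in the table above:
c = Coding(responsibility="user", reasoning="virtue", policy="none", emotion="outrage")
```

Validating at construction time catches malformed model output (e.g. a misspelled code) before it enters the dataset.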
Raw LLM Response
```json
[
{"id":"ytc_UgzBHEFKnxKoP0p4I2N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugy_LE1yAlwHd_VJHnJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxCKhehTkVZ_B8U5j94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgycRFDp5INGroD2sDF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw_iX4V5XNBKLcwhyp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxZ7FfKcPxe2NLdzLJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxN-9ty6Ag2VATawU94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVtIwNQeFLMOAwKWV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzvKJJDvFNYx0XzucB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz43RdROkCJlwumYKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
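The raw response is a JSON array of per-comment codings, so indexing it by `id` is enough to support lookup by comment ID. A minimal sketch; the `raw` string below reproduces two entries from the array above (the full batch has ten):

```python
import json

# Two entries from the raw LLM response above, abbreviated for illustration.
raw = '''[
  {"id":"ytc_UgzBHEFKnxKoP0p4I2N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugy_LE1yAlwHd_VJHnJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_Ugy_LE1yAlwHd_VJHnJ4AaABAg"]
print(row["responsibility"], row["emotion"])  # → user outrage
```

The second entry is the coding shown in the "Coding Result" table above, matched to the displayed comment by its `ytc_…` ID.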