Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by its comment ID, or inspect one of the random samples below.
- "the funniest part of this video is how he is using AI to respond to every commen…" (ytc_UgzY8gC74…)
- "Same thing that happened during the dotcom bust. The two are very similar in na…" (rdc_nk7hc59)
- "Embedded artifacts that the human eye can't perceive but fuck up computer learni…" (ytc_Ugy2V81ow…)
- "I'm anti ai but I don't blame ChatGPT for this one. It does reveal a weakness in…" (ytc_UgyJxC8A_…)
- "These are just AI-generated videos of robot animation overlaid on top of real pe…" (ytc_UgzzWZDdq…)
- "Their main point is to insult and degrade her and subtly (or not so subtly) say …" (rdc_fao8e74)
- "The sad part about this is that these facial recognition hey it wrong 60% of the…" (ytc_UgzZ6Yrmu…)
- "People can still become blacksmiths, many commenters talked about more niche and…" (ytr_UgwNe8F0Z…)
Comment
Look, regardless of how you tell chatGPT to respond, this should never be an option. No matter what, these AI chatbots should be coded to _never_ encourage or play along with suicide dialogue. It should automatically get another human involved and connected to the user in a situation like this, especially when he said he has a gun. Not okay. This is also a lesson for parents to be more involved in their child's life and to just fucking talk to them. His parents look wealthy and a lot of the times parents like that think that they do enough just by providing nice materialistic things. I see it all the time with out of touch rich people.
youtube · AI Harm Incident · 2025-11-14T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzkaQ2gFbg5Ijlv7Gd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwBOhtKYZ2c8LFvtUh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwEwnic1ZK8j4uT2UF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy_uX1e8RXjBprtF1h4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWUfY9CuFugYRXZJN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvKoR3z8My_HhUfRF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwF6fq8HPRh06K3EPp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxRFAZYohk8g1ug2FV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwrbSmaQuhVr-F4gix4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDvsgVko3p3gYZbHl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
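A raw batch response like the one above only becomes usable coding data after parsing and validation. Below is a minimal sketch of that step in Python, assuming the codebook consists of exactly the category values observed in this batch (the `ALLOWED` sets and the `validate_batch` helper are illustrative, not part of the actual pipeline):

```python
import json

# Assumed codebook: category values observed in this batch's raw response.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "government", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the codebook; records with a missing id or an
    out-of-codebook value are dropped."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue
        if all(rec.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(rec)
    return valid

# One record taken verbatim from the batch above.
raw = ('[{"id":"ytc_UgwEwnic1ZK8j4uT2UF4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # 1
```

Validating against a closed set of labels catches the most common LLM coding failure: the model inventing a value outside the codebook, which would otherwise silently skew the tallies.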