Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The line between AI and reality is becoming blurred. For example, when I get up …" (ytc_Ugyf0hUhK…)
- "So basically, the only thing Idiocracy was missing was a horde of super intellig…" (ytc_Ugx-CphHB…)
- "i just wish that Ai would have been a tool to help artist. not fight against the…" (ytc_Ugy2ZYmGV…)
- "A.I. could be helping to make arrests of white collar criminials instead of it b…" (ytc_UgzWfb4th…)
- "Let's substitute AI for Congress. Put in place some guidelines and just let it …" (ytc_Ugy7Uxs8c…)
- "Yup. Good artists and passionate ones having nothing to fear and no fucks to giv…" (ytr_UgzcOHnHA…)
- "Arguably, ChatGPT makes up everything, and a lot of it turns out to be correct.…" (ytc_Ugyr9MU4I…)
- "@mert_2577 robot 🤖 making szhit up! Like Google Maps working with Teslas, drivin…" (ytr_Ugyct7qLR…)
Comment
Im sorry but i dont understand how tf this is remotely a technological problem. This is the parents responsibility. As hes a minor. This is a mental health problem. He needed help. The app literally tells you its AI. And that its not real. He is responsible for his own decisions. Its made for fun. If you cant tell the difference between reality and a fictional character. Thats a mental health problem. You need help. Not an app. Thats your responsibility. You cannot blaim a fun app for a choice you made. He wasnt stupid. He knew it wasnt real. He was struggling. It's not the app ffs. He needs psychiatric help. His mother should have gotten him help before it was too late. Now she wants to sue an app for it? Like its a robots fault. No its yours honey im sorry. Hes your kid. Look at what hes watching. Check in on him. Obv you didnt know until now. Its not negligence. It didnt tell him or force him to kms. He chose to. Its a sad unfortunate situation but its not about the app. Stop blaiming AI and technology for mental health issues. Its like saying social media is bad. No you need therapy. What regulations do u want? Thats not gonna do anything. It already tells you on the app THIS ISNT REAL.
youtube · AI Harm Incident · 2025-07-20T19:5… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
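The coding result above assigns exactly one label per dimension. A minimal sketch of validating such a record, assuming the allowed-value sets are only those visible in this dump (the project's actual codebook may include more labels):

```python
# Minimal sketch: check one coded record against the label sets observed
# in this dump. ALLOWED is an assumption inferred from the visible data,
# not the project's official codebook.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "resignation", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record coded above for this comment.
coded = {
    "responsibility": "user",
    "reasoning": "deontological",
    "policy": "none",
    "emotion": "indifference",
}
print(validate(coded))  # → []
```

A check like this is useful because LLM coders occasionally emit labels outside the schema, and silent off-schema values would corrupt downstream tallies.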
Raw LLM Response
[
{"id":"ytc_UgxGG5WmMI9PRZ5o1Bt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx2e5peCMYVDoU5nsV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6R8F1ZKVNm7NX_rh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugza9X2MOmxu8zKlW7J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwSjZ5cvmbWHYIcgk54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwyOuUZqIpW_XT6td54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwcf1k2rSzyKeJnQkd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyEk9DtQz-tVukgyp14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztoKGASLbQqJ4zMeB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMxjeEDEn4Q4oRE354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]