Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_lgmsi66`: "According to the article, the AI images were based off a 12 year old version of …"
- `ytc_Ugx6yBrbK…`: "If we need to stop AI, then Elon better go bankrupt soon. That's how you stop em…"
- `rdc_icgpw4j`: "I think the question is wrongly asked. You're asking for where the sharp bounda…"
- `ytr_UgyikVNDR…`: "@samankucher5117 Bud, I’ve been on YouTube longer than you, I know how to preven…"
- `ytc_UgwiJKjJI…`: "I agree that artists should need to opt in. The software companies shouldn't be…"
- `ytc_UgyytvRqc…`: "AI is nothing new😊. For Flintstones, a motorized car is AI 🚗. For people who use…"
- `ytc_UgxfLwY9Y…`: "Using AI to produce art is like throwing a bad dish into a microwave and bringin…"
- `ytc_UgzrLHd5j…`: "Jazza just put out a video clarifying his change to an anti-AI stance. If Shad …"
Comment

> I know this feels strange and AI bots shouldn't replace human companionship. I've tried them, they are still not as good as chatting with a real friend, even when your friend isn't agreeable. Honestly, I'd rather people talk to AI chatbots in a time of need to improve their mood than allow their mental health to deteriorate and lead them to dark places. AI chatbots can significantly uplift mood in a short period and help people sooth themselves.

youtube · AI Harm Incident · 2025-08-08T14:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyGiuYa0XSkEpuN-kd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy5RQExd5EeTzVrsj54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx7UFOEyzsaTkdUbMx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgxutUEftfLYHHJYKRl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyZVYSs7cBQUDUnkIh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxhNbzB4Fnuu9piDBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwDG6K8nhx0ms-qfV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx2dsGd_2OOlVRastB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6NCi1sUa8hfXOTbR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwXcS8Zr4PCt-azVIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```