Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
In the Bible there's a city called Ai isn't it funny how it's already mentioned …
ytc_UgwV_yQ2O…
Yes, this is super helpful for teachers who are creating scripts for a chatbot. …
ytc_Ugy3uE6O2…
But seriously not to have a fucking filter layer that filters out "openai" respo…
rdc_kcnn9hh
Why in the hell are we embracing AI when we need to be demanding its annihilatio…
ytc_UgzWMawbr…
@DetonatressM Yeah that makes sense. I just also personally feel like there coul…
ytr_UgzGHWnIr…
if this is so dangerous why do they keep improving the ai and not just shut it d…
ytc_Ugz9x0qZ6…
Entered your quote into chatGPT, this is the response. Something creepy about it…
rdc_jg740dj
Hallucinations happen when the conversation begins to represent too many differe…
ytc_Ugzr1gtGJ…
Comment
An autonomous car would keep safe braking distance from the vehicle in front of it.
youtube · AI Harm Incident · 2018-08-06T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
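The coding result above assigns one value per dimension. A coding can be checked against the value sets that actually appear in this view; the sketch below reconstructs those sets from the displayed samples, so the real codebook may well define additional values.

```python
# Allowed values per coding dimension, reconstructed from the codings shown
# in this view. Assumption: the actual codebook may include more values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "resignation", "indifference", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding is well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

# The coding shown in the table above validates cleanly.
example = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "unclear",
    "emotion": "approval",
}
assert validate_coding(example) == []
```

A check like this is useful because model output is free text: a single misspelled label in a batch response would otherwise silently corrupt the coded dataset.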
Raw LLM Response
```json
[
{"id":"ytc_UgxNu6orq72VYmfHfwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZh_afhC_OOFGLQXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyCTPsECAP96PGh3Hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgxJKKQ9sqMp_Ti81H54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxqNM8gW2hHoyqHe5l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxnIUdclFIQ-4Rv2z54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzlBBbufEX3_0ASe354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwCGqFtlXMJ6s8DwaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwtXdfQuJN50FtVCfZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyosQqVFTeUat0GwLx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
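The raw response is a JSON array with one object per comment, keyed by `id`, which is what makes look-up by comment ID possible. A minimal sketch of that parsing step, using two entries excerpted from the response above:

```python
import json

# Two entries excerpted from the raw LLM response above.
raw = '''[
{"id":"ytc_UgxNu6orq72VYmfHfwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJKKQ9sqMp_Ti81H54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]'''

# Index the batch by comment ID so a single coding can be fetched directly.
codings = {entry["id"]: entry for entry in json.loads(raw)}

coding = codings["ytc_UgxJKKQ9sqMp_Ti81H54AaABAg"]
print(coding["policy"])   # unclear
print(coding["emotion"])  # approval
```

Note that `json.loads` raises `json.JSONDecodeError` on malformed output, so a production pipeline would wrap this parse and flag the batch for re-coding rather than assume the model always emits valid JSON.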