Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "Okay they clearly aren’t connected to a cloud they are connected to a database a…" (ytc_Ugx5xdZ3D…)
- "It's not the power of AI that's the issue, it's the pervasiveness. The amount of…" (ytc_Ugya76w3J…)
- "Not possible. Expecting there to be an effective ban on Facial Recognition is li…" (rdc_femzg68)
- "I can't get past Sam's vocal fry... it's annoying, pretentious, and fake just li…" (ytc_Ugw0eCpTX…)
- "Hi Sam, thank you for this amazing video. To be honest I started watching this v…" (ytc_UgzPqaaXa…)
- "The difference between an artist taking inspiration or doing a study on another …" (ytc_Ugzt4fvms…)
- "B-b-but AI can't do stuff! They're just language model! - someone out there who…" (ytc_UgwPWi3r2…)
- "The question isn’t, will AI destroy humanity, but will humanity destroy itself w…" (ytc_Ugymtilsh…)
Comment

> AI is always patient and kind, unlike people. The problem is that it will never take responsibility of any harm it may cause, because it is not a person or a human being. CEOs of AI companies and politicians should be held responsible of everything AI does. Their greed has led to this, by letting them launch these technologies in full capacity without any effort to regulate them. They don’t care if people get hurt, because they will be safe themselves whatever they do.

Platform: youtube
Topic: AI Harm Incident
Fetched: 2025-11-11T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
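A coding result like the one above can be sanity-checked against the value sets each dimension is expected to take. The vocabularies below are assumptions inferred from the sample responses on this page, not a documented schema; a minimal sketch:

```python
# Assumed per-dimension vocabularies, inferred from the sample responses
# shown on this page (not a documented schema).
VOCAB = {
    "responsibility": {"user", "company", "none", "distributed"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def invalid_dimensions(row: dict) -> list:
    """Return the dimensions whose value falls outside the assumed vocabulary."""
    return [dim for dim, allowed in VOCAB.items() if row.get(dim) not in allowed]

# The coding result shown in the table above:
row = {"responsibility": "company", "reasoning": "virtue",
       "policy": "regulate", "emotion": "outrage"}
print(invalid_dimensions(row))  # []
```

An empty list means every dimension carries a recognized code; anything else flags a row for manual review.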
Raw LLM Response
[
{"id":"ytc_Ugyotr2M6fPuF3PzS6d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxLKejKwEA2NzXEWnV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZKmXA8FI3VCj07kh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIqOzWDJ2XtURStdx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVR5JCRetHXV_S6jR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxENEfAwD6CDeZZG5p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzFcMv9N-5LJzOtbj54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyO0ALSvUpJnQA7GkB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxZ5ivL2P5CodKTJN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxwnsPii_w21gZj7ud4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
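The raw response is a JSON array of per-comment codes keyed by `id`, which is what makes the lookup-by-comment-ID view possible. A minimal sketch of that lookup, assuming the response text is available as a string (the array here is shortened to two rows from the sample above):

```python
import json

# Raw model output: a JSON array of per-comment codes, mirroring the
# structure shown above (shortened to two rows for illustration).
raw_response = """
[
  {"id": "ytc_Ugyotr2M6fPuF3PzS6d4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyO0ALSvUpJnQA7GkB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgyO0ALSvUpJnQA7GkB4AaABAg"]
print(code["responsibility"], code["emotion"])  # company indifference
```

In practice the model output may also need to be stripped of surrounding prose or code fences before `json.loads` will accept it; the sketch assumes a clean array.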