Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples — click to inspect:

- "Im sorry but I don’t think the ai is wrong homie. I think you are just too afrai…" (ytc_UgzA0Gg2e…)
- "most intelligent cogent cohesive look at what AI is why its dumb and who is re…" (ytc_UgxHFB2BF…)
- "People seem to forget in the end it's still a computer program that someone CONT…" (ytc_UgwYInf0a…)
- "Once a technology has been introduced, the only way to break people from using …" (ytc_Ugxx8OVMt…)
- "How ironic can it be when a person who nobody has ever heard of claims to be a p…" (ytc_UgyCYwOTi…)
- "I'm not scared of AI or robots. I carry around a ten pound sledgehammer, that I …" (ytc_UgzlMuSwI…)
- "I don't think there is any good use for generative ai and I'm fucking tired of h…" (ytc_UgwlBOPb-…)
- "Allen's Copyright was rejected because Midjourney owns the Copyright. If you wan…" (ytc_UgwQtwwKG…)
Comment (youtube · AI Harm Incident · 2026-01-05T04:2… · ♥ 1)

> There are fatalities due to human errors everyday. Let the data lead us to decision making - If self driving cars prove themselves to be safer than humans, we ought to ban humans from driving, because, in aggregate, that would save lives!
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgzHDJI2U3iIk2sOd8V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyJnSGtqHrLj2cO4WF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzfaqVQa66IxCt07Yp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzUw8MKUjOByWRUVDl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzQXlJ90OvoBFImBUd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzJpDsZ8p2B2ZMwaVt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwwJNyG4HoqyYM-3DZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx5TKcAehXZQ9n-Xrh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx0zB5KA2bF_IxGJRZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgwBSLTej1ErkXkKwl54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
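A batch response like the one above can be parsed and sanity-checked before its rows are displayed as coding results. The sketch below is a minimal example, assuming the allowed category values are exactly those visible in this page's responses; the real codebook may define additional categories, and `parse_batch` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID.

    Raises ValueError if a row lacks an ID, is missing a dimension,
    or uses a category value outside the allowed set.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row without id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage: one row from the batch above.
raw = ('[{"id":"ytc_Ugx0zB5KA2bF_IxGJRZ4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugx0zB5KA2bF_IxGJRZ4AaABAg"]["policy"])  # regulate
```

Indexing by comment ID mirrors the lookup the page offers: given an ID, the exact coded record can be retrieved without rescanning the raw response.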