Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Thank you mentioning Andrew Yang. Dude been predicting automation and AI before…
ytc_Ugxd71LKd…
They should demand a watermark symbol on ai content. So everyone can tell easily…
ytc_UgwXOBJBG…
It’s unavoidable, our responsibility now is to avoid the world from the movie El…
ytc_UgzBglAxw…
Yeah, I agree, it's over. Now can all the grifters switch to something else plea…
ytc_Ugy2HFb91…
I'd say it's as good as guaranteed in 5 years time. If it somehow didn't pan out…
rdc_m9o3qia
People would do well to listen to people like Roman and Eli. All the people push…
ytc_Ugwx2Zz34…
I'd laugh if someone recorded themselves actually replicating that AI masterpiec…
ytc_UgyT_8Hl2…
Self-driving cars don't have to be perfect. They just have to be better than hu…
ytc_Ugj2jxTP4…
Comment
This is a tragic situation but it's not AI's fault. It's just a machine. Parents raise their kids on phones and tablets and then they blame technology when something as tragic as this happens. Why weren't they watching what their child was doing on their phone? Now they are using their sons death to cash in. Also why did the parent have a gun available so easily tht their own child could use it? People would blame anything than themselves. Whatever lets them sleep at night i guess. Horrible horrible tragedy. The only victim here is the child.
youtube
AI Harm Incident
2025-12-08T00:5…
♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
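A coding result like the table above can be sanity-checked before it is stored. The sketch below validates a coded row against per-dimension value sets; these sets are inferred only from the codings visible on this page and may not be the full codebook, so treat `ALLOWED` as an assumption to be replaced with the real schema.

```python
# Allowed values per dimension, inferred from the codings shown on this
# page (not necessarily the complete codebook).
ALLOWED = {
    "responsibility": {"company", "user", "none", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "mixed",
                "resignation", "indifference"},
}

def validate(coding: dict) -> list:
    """Return a list of problems; an empty list means the row is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"{dim}={value!r} not in allowed set")
    return problems

row = {"responsibility": "user", "reasoning": "deontological",
       "policy": "none", "emotion": "outrage"}
print(validate(row))  # → []
```

Running `validate` on every row of a raw response catches schema drift (a new label, a missing dimension) before it silently enters the coded dataset.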
Raw LLM Response
[
{"id":"ytc_UgzUFY1PxiXJubtkKDR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyzi5d0R-YSh5h09E94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy_tHYl40BkqZzrA1J4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxyR1ZXkZoGbtLXqXB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw_YMTlGJ7dJg_83op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwj-2jtxtrz-e_FpJh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVRcl-vNfZNXzFDqZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwV_NAFKx0bh5_0Fv94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx1bJ5M2b-dGSbLtIl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwS6C381QrYH-7BMVF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
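The raw response above is a JSON array of per-comment codings, which makes the "look up by comment ID" workflow straightforward: parse the array and index it by `id`. A minimal sketch (function name and the shortened IDs in the sample payload are illustrative, not part of the tool):

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codings
# (IDs shortened here; the schema matches the dimensions table above).
raw_response = """
[
  {"id": "ytc_Ugw_YMTl", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzUFY1P", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw response and index the codings by comment ID."""
    codings = {}
    for entry in json.loads(raw):
        # Keep only the expected dimensions, defaulting anything the
        # model omitted to "unclear".
        codings[entry["id"]] = {dim: entry.get(dim, "unclear")
                                for dim in DIMENSIONS}
    return codings

coded = index_codings(raw_response)
print(coded["ytc_Ugw_YMTl"]["responsibility"])  # → user
```

With the index built, retrieving the exact coding for any sampled comment is a single dictionary lookup on its ID.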