Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- “Bro this would be my worst nightmare (I’m asexual and hate when ppl see me sexua…” (ytc_UgzsfiftG…)
- “better make one of bicentenial man! ..... women may not all want a robot that l…” (ytc_UgyIflwFM…)
- “You mean AI driverless cars will have robots in them, and we will go extinct? Wh…” (ytc_UgwY8otWe…)
- “The biggest difference between the bomb and AI as existential threats is that we…” (ytc_Ugx_cUwVm…)
- “Ben Norris Just my humble Opinions, don't get mad:D Here we go: But humans ARE …” (ytr_UgibeR0m4…)
- “The difference between a bad art drawn by hand and a good a.i art is that the ba…” (ytc_Ugz1MXIym…)
- “AI in the trucking industries, let's put computers in the core of our economy. I…” (ytc_UgwcPmwA9…)
- “Every man's face when his Mrs face comes off at the end of the night!!…” (ytc_UgyK0789F…)
Comment

> She articulates her feelings and the circumstances around the situation so well. She will do great things to honor Sewell, and create safeguards to protect other children from the lack of regulation around AI. The rapid evolution of AI has made regulation hard, but cases like these are becoming increasingly more common. As always, we need to choose people over profit and pump the brakes on AI. I believe we can still outpace the rest of the world in AI without sacrificing our children, or anybody else to AI psychosis.

youtube · AI Harm Incident · 2025-12-08T01:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzUFY1PxiXJubtkKDR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyzi5d0R-YSh5h09E94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy_tHYl40BkqZzrA1J4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxyR1ZXkZoGbtLXqXB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw_YMTlGJ7dJg_83op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwj-2jtxtrz-e_FpJh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyVRcl-vNfZNXzFDqZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwV_NAFKx0bh5_0Fv94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx1bJ5M2b-dGSbLtIl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwS6C381QrYH-7BMVF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
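A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline: the allowed values for each dimension are inferred from the samples shown on this page (the real codebook may permit more), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "mixed", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept if it is a dict with an "id" and every coding
    dimension holds a value from the ALLOWED sets above.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one valid row, one row with an out-of-codebook value.
raw = (
    '[{"id":"ytc_example1","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"approval"},'
    '{"id":"ytc_example2","responsibility":"alien","reasoning":"unclear",'
    '"policy":"unclear","emotion":"mixed"}]'
)
print(len(parse_coding_response(raw)))  # → 1
```

Rejecting out-of-codebook values up front keeps a single malformed model response from polluting the coded dataset.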