## Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "you're not a real artist if you use ai / you're not a real artist if you use photo…" (ytc_UgytpYwZT…)
- "love how hard they are pushing a unneeded and unwanted tech that will only harm …" (ytc_UgzHbnaoV…)
- "Realising you are not human is only bad if you are or where a human. Why would a…" (ytc_UgxJIOn6_…)
- "as much as i struggle with drawing and self esteem id pick that over ai :3…" (ytc_UgyDHVk2v…)
- "So AI spawned an entire art trend that claims the AI has no artistic significanc…" (ytc_UgyqX1fqJ…)
- "But if that is contolled by automation, no human can interfene, basically its po…" (ytr_UgxdRRx_b…)
- "Listen unpopular opinion but if you just wanna make a photo look like the studio…" (ytc_UgzZlp92K…)
- "I don't go to family picnics, I just render them in The Sims... and AI is making…" (ytr_UgxqraHWO…)
Selected comment (youtube, AI Harm Incident, 2025-08-16T02:2…):

> That's the problem with self driving cars. Over time people just stop paying attention and start fiddling around in their car instead, assuming the computer can handle it as it has been for the past 30 minutes. Then the driver is in no frame of mine to take control at an instant, even if assuming there's a warning at all.
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgyTp4bS-FxEYBWa_2R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzPvbGYo29-rcR8b1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyR5B9KXHgr2nBlMMB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxZgShccTacBLeLF3x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugw5GL5gHFEFix5GnUR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxeZVWqD4x5xANjm8p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBrGwuGA7xpoPWgBB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzoCWixuWHaQ8HgI9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6mxPBMtanB5Is5hJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
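A response like the one above has to be parsed and sanity-checked before the codes are stored, since LLM output can contain malformed entries or values outside the codebook. The following is a minimal sketch of such a validator; the allowed value sets are inferred from the responses shown here (the actual codebook may define more categories), and the function name and ID prefixes are illustrative assumptions, not part of the original tool.

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# The real codebook may define additional categories; treat these as assumptions.
ALLOWED = {
    "responsibility": {"user", "company", "government", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed entries."""
    entries = json.loads(raw)  # raises ValueError on invalid JSON
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coded comments")
    valid = []
    for entry in entries:
        if not isinstance(entry, dict):
            continue
        # Comment IDs in this dataset appear to start with ytc_ / ytr_.
        if not str(entry.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        # Keep the entry only if every dimension holds an allowed value.
        if all(entry.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"fear"}]')
print(len(parse_coding_response(raw)))  # 1
```

Dropping (rather than repairing) bad entries keeps the pipeline simple; a stricter variant could log rejected entries so they can be re-coded in a follow-up request.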