Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "I suppose that it's not robot against human but an edited version of a human aga…" (ytc_UgxTQqfMt…)
- "But AI could manipulate people to do it's bidding before anyone even realizes an…" (ytr_UgxCOML_y…)
- "Wow, so may unasked questions. Jake, you should have asked him what he thinks of…" (ytc_UgwRqSyWE…)
- "Their just salty someone screwed up their get rich quick sceme, hey ai bros we'r…" (ytc_UgybAunsq…)
- "HELL NO. ai is still bad environmentally, do the research yourself, find inspi…" (ytr_UgxtP4mSg…)
- "Has anyone in the smart zone way up above the normal people of the world, ever s…" (ytc_Ugx1lfDZH…)
- "AI adoption in consulting requires proactive leadership that supports a people-f…" (rdc_nmoeavg)
- "I can't help but wonder if the cost savings of ai will quickly be flipped into h…" (ytc_UgzTELtWB…)
Comment
Sorry Dr. Emu, but getting started with your video, to be clear, it is NOT asking AI a question that cooked brain cells. That may not be clear. Oh no, AJ. Need to study a LOT more than that. If only you'd had a TV... Thanks again for using no AI.
youtube · AI Harm Incident · 2026-04-04T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw94st37z9u5eKoSGd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPf8Y29nf3fcgFDrl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGTj0TuvB0PEI3pGB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzgjyg4b5klMLmPv194AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzekvc7UN9OPZcNA6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyJFcYQHpdYH20IrDF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzJ1WcQ8YQdi-hkgZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUXTu8j6i2NshJli14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGWlhS1HipDO4DsgF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
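The raw response above is a JSON array of coded comments, one object per comment ID, with one value per coding dimension. A minimal sketch of how such a batch could be parsed into a lookup table keyed by comment ID, with a sanity check on each dimension's value: the `ALLOWED` vocabularies below are inferred only from the values visible in this sample and may be incomplete, and `parse_llm_batch` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Dimension vocabularies inferred from the sample response above.
# The full codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "resignation", "fear", "mixed"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting any value outside
    the expected vocabulary."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# First entry of the raw response shown above.
raw = ('[{"id":"ytc_Ugw94st37z9u5eKoSGd4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
result = parse_llm_batch(raw)
print(result["ytc_Ugw94st37z9u5eKoSGd4AaABAg"]["emotion"])  # indifference
```

Failing loudly on an out-of-vocabulary value is a deliberate choice here: it surfaces LLM drift from the codebook at ingestion time rather than letting malformed codes reach the coding-result table.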