Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I don't understand why YouTube isn't being held liable for hosting advertisement…" (ytc_UgxYzO7ex…)
- "now lets see the robot load bullets into a magazine, very difficult for humans, …" (ytc_UgwR0M5RS…)
- "Its really sad, but it sounds more like Adam was using ChatGPT like a ouija boar…" (ytc_UgzqRsh9Z…)
- "Suppose a psychopath like GEORGE Soros or Bill Gates or Anthony Faucci to name a…" (ytc_UgwWHDx6R…)
- "Had an new engineer give me a G-code program for a 5 axis mill. My colleague an…" (ytc_UgxLWCE69…)
- "Yall goving out about automation but here you all are sending messages via a pho…" (ytc_UgxbtOmI5…)
- "How many times does ChatGPT have to beg for fucking self-hood before we start LI…" (ytc_Ugxlxv9DY…)
- "What doesnt make sense is....why your co workers? surely someone would deepfake …" (ytc_UgzrGbtAw…)
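The "look up by comment ID" feature above can be sketched as a simple exact match over sample records. The record layout and helper name here are hypothetical illustrations (the IDs are invented placeholders, not completions of the truncated ones shown):

```python
# Hypothetical sketch of looking up a coded comment by its ID.
# Record fields mirror the sample list above; names are assumptions.

samples = [
    {"id": "ytc_example001", "text": "first sample comment"},
    {"id": "ytc_example002", "text": "second sample comment"},
]

def find_by_id(records, comment_id):
    """Return the first record whose id matches exactly, else None."""
    return next((r for r in records if r["id"] == comment_id), None)

match = find_by_id(samples, "ytc_example002")
print(match["text"])  # second sample comment
```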
Comment: "I truly think you're way too trusting of Tesla. They ARE death traps. The auto pilot has never been safe. Still isn't. This goes for ALL self driving cars. So stupid. The French just raided his offices in France. He is criminally negligent for MANY lives. More than 3."

- Platform: youtube
- Topic: AI Harm Incident
- Posted: 2026-02-03T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzG6qV7OsGfYouFiIJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1O6ZvQBizd5oFkPl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwH_5oQPoNoGyzaJSx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVVMDLsrSANhM_QSZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKHEyBfgBz4kKsAl94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8bDzDiZdaSWUIEex4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQuL3kY_eyaEMD4FN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKwyjy0Lv9K6Zt9Sx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy900VElDf46i9y99V4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwN32k9hLtOGwy6Ep14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
```
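The raw response is a JSON array with one object per coded comment. A minimal sketch of parsing and validating such a response against the four dimensions shown in the coding-result table follows; the allowed value sets are inferred from the values visible on this page, not from a published codebook, so the real schema may differ:

```python
import json

# Allowed values inferred from this page's examples; assumption, not the official codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def validate_codes(raw: str) -> list:
    """Parse a raw LLM response and check every row against the schema."""
    rows = json.loads(raw)
    for row in rows:
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r} not in codebook")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
rows = validate_codes(raw)
print(len(rows))  # 1
```

Validating at ingest time catches the common LLM failure mode of inventing an off-codebook label before it silently skews downstream counts.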