Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Yea stop doing that they are gonna feed it to the Ai and you won't even find out…
ytc_UgwvSxRgk…
This is actually terrifying. I had no idea AI is so advanced. I’ve already order…
ytc_UgxIdC_Az…
I completely agree with Sir Penrose. What is now called AI can perfectly mimic t…
ytc_UgzAHwFEx…
I think the biggest thing for me is this:
A human person who somehow has never …
ytc_UgwPp4hDO…
I predict attacks will be made to try and gather the proprietary software of Gla…
ytc_UgwL1bFTs…
Modern man has lost many of the faculties of adaptation and the instincts of s…
ytr_UgwAff4P8…
Well, do you really expect an AI company's CEO to say something like:
"Actually,…
ytc_UgwRYktFQ…
Being kind to LLMs has been shown to have better results simply because being kind to…
ytc_Ugxx1Oxnl…
Comment
Rail companies have been working on positive train control aka PTC for almost two decades since the runaway train incident depicted in the film Unstoppable. They still are not successful with it. But these self-driving car companies think that they can throw these cars out there on the road and not have any capability whatsoever to take control of the vehicle in the event of malfunction.
youtube
AI Harm Incident
2025-03-03T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwPHOefvhhal_YJFIZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyxbY5bLTcQQR3AJlR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBYqSmP-fHcB3vkvt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzNp1NY2OvNZD5JarR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzVSSgWHFX5FHsItp94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz8Ar0YbeJC4iE3Jjt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxu3V9sFlv6F4Zj8YB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugx7GoR1iVjQEgvK9Nx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyaSlhSS9oQFawumFt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyHiDgpjaj5ryKna5F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
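A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values for each dimension are exactly those seen in these samples (the real codebook may define more categories) and that valid comment IDs start with `ytc_` or `ytr_`:

```python
import json

# Assumed allowed values per coding dimension, inferred from the sample
# output above; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "mixed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict):
            continue
        # Accept top-level comments (ytc_) and replies (ytr_).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwPHOefvhhal_YJFIZ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail any check are silently dropped here; in practice one would log them so malformed model output (a common failure mode for batched JSON coding) can be re-queued.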