Raw LLM Responses

Inspect the exact model output for any coded comment. Look a comment up by its ID, or pick one of the random samples below.
- "this is dangerous the way the robot was able to manipulate the reporter so easil…" (ytc_Ugx_It3w7…)
- "Turns out people find meaning in work and will continue to do so regardless of A…" (ytc_UgyF1JMw6…)
- "@ChrisHillASMR😇 So if the user is nice so is the AI. Im a software dev who uses…" (ytr_UgyMQG7MX…)
- "If ever that happened, the only available jobs would be Scientists and Philosoph…" (ytc_UgxvPbj2Z…)
- "Also, amazed by the sheer amount of obviously AI-generated bot comments agreeing…" (ytr_UgwGIm2DK…)
- "My bet is on small, specialized LLMs that can run locally. The big cloud stuff …" (rdc_n7u8i1p)
- "Let the software development be taken care completely by AI agents and we sit an…" (ytc_UgwRVRAn0…)
- "I am glad that you put this out. I love art, but im not planing on doin it full …" (ytc_Ugzcz4Lto…)
Comment

> Self driving cars are a bad idea. Genuinely stupid. And there’s better alternatives! Let’s just invest in safe, clean, reliable public transportation. It’s tried and true, and the only reason we don’t have it in America is so that car companies can be more profitable. It’s asinine. Capitalism ruins everything it touches.

Source: youtube · AI Harm Incident · 2025-08-17T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
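Each coded comment carries one value per dimension, so a record can be checked against the codebook before it is stored. A minimal sketch of such a check, using only the category values that appear in this sample (the full codebook may define additional categories, and the function name is illustrative):

```python
# Allowed values per dimension, as observed in this sample of coded comments.
# The real codebook may include categories not seen here.
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the codebook (empty = valid)."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly:
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "ban", "emotion": "outrage"}
print(validate(coded))  # → []
```

A record with an unknown or missing value is flagged by dimension name, which makes malformed LLM output easy to surface in bulk.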
Raw LLM Response
```json
[
  {"id": "ytc_UgyUvURAXPpt_LnbY6B4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx9mnxS1OupQtXq9bF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxtiWOqFpInUS9L3PB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxV9xQvtQpFpyioxOl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugxt374a4jhPLUocwwp4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzOQxNYoBpW0ClLqgF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwocRcvg5U9DkT4FK94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgynGXyWljYnCbeX9EN4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgybR10bTgx_lzaRWhV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
```
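The raw response above is a JSON array with one coded record per comment, keyed by `id`. A lookup-by-ID view like the one this page offers can be built by parsing that array once and indexing it into a dictionary; a minimal sketch, with illustrative names not taken from the tool itself:

```python
import json

# Raw LLM response: a JSON array of coded comments, in the shape shown above
# (two entries reproduced here for brevity).
raw_response = """
[
  {"id": "ytc_UgyUvURAXPpt_LnbY6B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxV9xQvtQpFpyioxOl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxV9xQvtQpFpyioxOl4AaABAg"]["policy"])  # → ban
```

Indexing up front turns every subsequent ID lookup into a constant-time dictionary access, which matters once thousands of coded comments accumulate.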