Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @partlyawesome Robotaxi has been operating with no real issues. FSD is not ready… (ytr_UgyiLqvzU…)
- Anything a CEO can do an AI could do better! And with no end of year Bonus!! Te… (ytc_UgzFU7CEv…)
- Honestly I think that getting all the stupid people to kill themselves is an awe… (ytc_Ugw5lXMZn…)
- For sure the world knows SOMETHING is happening. I keep hearing about AI advance… (rdc_jhc9tf7)
- “to be able to show the world that I can also do this activity” But, that’s not … (ytc_Ugx49OtKm…)
- True, AI is supposed to be a companion not a replacement. Writers need to exist … (ytr_UgypJi_QM…)
- Governments.... AI should not be allowed to manage governments at any level, a… (ytc_Ugz16OmLh…)
- If AI is trained on human modeling by utilizing human-generated data.. then aren… (ytc_UgzJaWLQ1…)
Comment
@TheGuardDuck Jealous implies wanting something you can't have. There are a billion free AI services and you can run things locally too. If anyone can access it with zero effort and still decides not to then maybe they actually have a principled stance against the thing instead of whatever fiction you'd prefer
youtube · AI Harm Incident · 2025-08-03T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugxsu1BlJRRxPCydPwB4AaABAg.ALLl_oCSIcoALLvcLf0d_d","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugxsu1BlJRRxPCydPwB4AaABAg.ALLl_oCSIcoALM57N3P_aG","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxg20T0lmqrOajf4RJ4AaABAg.ALL6AzsDk1dALLSWPB0_ww","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALQCO91G7Jl","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALSWJC4Unat","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALTbRClH9T-","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzt4ODNqMeKXUatvzB4AaABAg.ALKoUp0Qg86ALXTcIpXaO3","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzwKorKamWWinF6BZF4AaABAg.ALKef_FaMsvALLRHU1jLI5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzwKorKamWWinF6BZF4AaABAg.ALKef_FaMsvALN8EulT1Tt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxXl1haNQG9hiElcgp4AaABAg.ALKS3lw9mBzALLuJGnMDMu","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
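A raw response like the one above can be checked before the coded values are stored. The sketch below parses the JSON array and drops any row whose `id` is missing or whose dimensions fall outside the coding vocabulary. The allowed value sets are inferred from the samples shown here (the actual codebook may include additional categories), so treat them as an assumption.

```python
import json

# Allowed values per dimension, inferred from the coded samples above
# (assumption: the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "fear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it has a string id and an in-vocabulary value
    on every coding dimension; everything else is silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row.get("id"), str):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping malformed rows rather than raising keeps a long coding run alive when the model occasionally emits an off-vocabulary label; the discarded ids can then be re-queued for another pass.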