Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Can't escape Ai not even in dreams. Nothing is real. Everything just looks fake.…
ytc_UgzcZWkVu…
Regarding the objection to the Chinese room that the whole room knows Chinese, o…
ytc_UgiX77e4A…
@ I didn’t say you shouldn’t be kind. But Ai shouldn’t get to the point where it…
ytr_UgyomAFow…
Then don't let it take information from the internet? I don't understand how AI …
ytc_UgzbDyStG…
ai is not a new technology. its a new species. now ask the neanderthals how it w…
ytr_UgxdnEJU7…
I'll say what I always say about this subject: why would it even be possible for…
ytc_UgywBImac…
I think your robot drones could benefit from a bit of highlighting, or alternati…
ytc_Ugw_-vCd2…
Asking honestly: Since when does AI want? Did we program it to have needs and de…
ytc_Ugw7YZZ2H…
Comment
This is a mockery to human intelligence and self-responsibility.
Having the manual online is the most convenient way. Go print that out if you want a physical copy.
Autopilot sounds to much like the car can drive itself? I don’t know how else to put it. Should Panda Express be sued for not actually serving panda meat?
The car literally warns that the driver is responsible each time autopilot or self driving is activated. What else do idiots want? If you buy a chain saw that auto-rotates the chain and cut yourself because you weren’t paying attention, that’s your fault.
youtube · AI Harm Incident · 2025-08-30T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyR7AQsfm0_yzCWICh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgycmFkCpFxnjxpa4Gx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKsyVmvwJuMm5kfhh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwi2wZc-dHjDtHkOMx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzu2bYDTz12t0VVsdZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8IeoF6VbsPz9SPx94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTNUh0h4VDZ_RNatl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwU77doYoH3xpgKlV54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyoDOs6mu7Vai5IIw54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzWqxoaJ2uDmDYUxWx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
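The look-up this page performs — mapping a comment ID to its coded dimensions in the raw model output — can be sketched in a few lines. This is a minimal sketch, assuming the response is a JSON array of objects with exactly the five keys seen above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two rows used here are copied from that output, and the schema check is an illustrative assumption, not part of the tool:

```python
import json

# Two rows copied from the raw LLM response above; the real response
# contains one object per coded comment.
raw_response = '''
[
  {"id":"ytc_UgyR7AQsfm0_yzCWICh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyoDOs6mu7Vai5IIw54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
'''

# Assumed schema: every row carries exactly these five fields.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and index codings by comment ID,
    rejecting any row that does not match the expected fields."""
    codings = {}
    for row in json.loads(raw):
        if set(row) != EXPECTED_KEYS:
            raise ValueError(f"malformed row: {row}")
        codings[row["id"]] = row
    return codings

codings = index_codings(raw_response)
print(codings["ytc_UgyoDOs6mu7Vai5IIw54AaABAg"]["emotion"])  # prints "approval"
```

Note that the second row's coding (user / virtue / none / approval) matches the "Coding Result" table shown for the comment above, which is how a coded record can be traced back to the exact model output that produced it.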