Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Pentagon Defends Child-Abusers While Attacking Some Vague Attempt At Morality I… (rdc_o78ugi6)
- @bandito5434 Thank you for your "interesting" comment! Believe it or not, it's j… (ytr_UgwPTiiXp…)
- as a colourblind artist with a movement disorder, ai "art" could never benefit m… (ytc_Ugw6Q_JLW…)
- That's just the next logical step and one of the motives behind wars. For if you… (rdc_nxpso1j)
- Look AI has a place stop pretending it doesn’t if you don’t put it in its place … (ytc_Ugw5k33bN…)
- Ai steals art, artists trace ai art best solution in my opinion, ai isnt real an… (ytc_UgzPYimLh…)
- Also, AI is the cure to poverty. I think Stephen Hawking mentioned that one time… (ytc_UgzHqaHlg…)
- Ai should be banned we've got by for so long with out it it's going to cause mor… (ytc_UgypLYjZF…)
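Each sample ID carries a short prefix (rdc_, ytc_, ytr_ in the list above). A minimal sketch of prefix-based source routing, assuming rdc_ marks Reddit comments and ytc_/ytr_ mark YouTube comments and replies; this mapping is an inference from the IDs shown here, not a documented convention of the tool:

```python
# Hypothetical prefix-to-source mapping inferred from the sample IDs above;
# the actual convention used by the coding tool may differ.
PREFIX_SOURCE = {
    "rdc_": "reddit_comment",
    "ytc_": "youtube_comment",
    "ytr_": "youtube_reply",
}

def source_of(comment_id: str) -> str:
    """Return the inferred source platform for a comment ID."""
    for prefix, source in PREFIX_SOURCE.items():
        if comment_id.startswith(prefix):
            return source
    return "unknown"

print(source_of("rdc_o78ugi6"))  # prints: reddit_comment
```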
Comment
A robot may not injure a human or, through inaction, allow a human to come to harm.
A robot must obey orders given to it by a human, except when such orders would conflict with the First Law.
A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
Should have payed attention, AI is way too far along with no real safety implemented.
youtube
AI Harm Incident
2025-07-23T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxRZEd2vSbZHqDLz2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZu4CZr84MUZLCG5V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_jHCUbAYBOEzzASp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7zl-aUAs_FfPyf1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2j41VU2PcxMXA3il4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxfMUxc0xd00HoltX14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr-rECvpPZNHCtQod4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgypWaUkG2CxVC7dib54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyOWvLo54pXmK2bWL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7Ker4IpncoiFIc7F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
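The raw response above is a JSON array with one coded row per comment. A minimal sketch of parsing and validating such a response, using the field names from the JSON above and allowed values inferred from the rows shown (the full codebook may permit additional values):

```python
import json

# Allowed values per coding dimension, inferred from the rows above;
# the actual codebook may define more values than appear in this sample.
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself", "user", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "unclear", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def parse_codes(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed, valid rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows with no comment ID
        # Keep the row only if every dimension has an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage with a hypothetical single-row response:
raw = '[{"id":"ytc_example","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
print(parse_codes(raw))
```

Rows with an unknown dimension value are dropped rather than corrected, so a malformed model response surfaces as a shorter result list instead of bad codes in the table.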