Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click one to inspect:

- The biggest concern for me is that AI art as a concept destroys the purpose of a… (ytc_UgxlN9XZC…)
- The AI lambda sounds like the computer HAL on 2001 telling the humans what to do… (ytc_Ugy_gmAmo…)
- "I strongly feel that this is an insult to life itself." - Hayao Miyazaki on AI… (ytc_Ugz2c6hWe…)
- Don't worry - robots will never become conscious, nor will they ever be able to… (ytc_UghnKF6Fp…)
- ChatGPT/AI does not analyze ideas. It analyzes You! It is tuned to your emotiona… (ytc_Ugy_lijPJ…)
- Yeah I am not sure about that. So far there is no tool out there that is anyhow… (ytc_UgyEfS-q8…)
- Its gonna be hysterical if they put all thier chips into this AI path, and the m… (ytc_UgzFX3WO9…)
- That one robot when you were asking are theese robot or real people she just giv… (ytc_Ugw9nwTd4…)
Comment

> Civilians will get killed during war.. BUT it use to be Rarely and we USE to show some responsibility and a Little anguish over such an event. Now our starting position is to Deny it even occurred.
> Whenever you can make killing Less personal and more automated, you will have a Lot of casualties with No One taking the 'Blame".

Source: youtube · AI Harm Incident · 2013-10-23T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz85BpX2EYn9y3bp4l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYEGYbjbduORTzs7N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwy2UZeeDiMHx4pzct4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw5N--2I-v4VFaQA0d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzaIoJ0TSX4svUxww94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyJVzA5ZkVo87YRTdl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw6ziQfX0O9J7nH1cp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwJrp9vSCBhEyrM90R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_UgziWXgOOJ4i3oR_moJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTRm-Ntzcaah3xHMl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
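The lookup this view performs — parsing a raw LLM response and pulling out the coding for one comment ID — can be sketched in a few lines of Python. This is an illustrative helper, not the tool's actual API; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response shown above, and the two sample entries are copied from it.

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
raw_response = '''[
  {"id":"ytc_Ugwy2UZeeDiMHx4pzct4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzaIoJ0TSX4svUxww94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]'''

def lookup_coding(response_text: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of per-comment codings)
    and return the coding dict for the given comment ID, or None."""
    codings = json.loads(response_text)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugwy2UZeeDiMHx4pzct4AaABAg")
print(coding["responsibility"], coding["emotion"])  # distributed outrage
```

In practice the model's output may carry extra text around the JSON array, so a production version would want to extract the bracketed span and validate each entry against the allowed dimension values before indexing.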