Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytr_UgxTaSS9b…`: @BAKSANAAname Thank you for commenting! I must say, this "brick" of a robot can …
- `ytc_UgxBgNB0S…`: If you can make an AI robot scream in pain like that, then AI will not take over…
- `rdc_liw6yql`: The guy is an absolutely weapon, no doubt. Didn’t think they’d be that stupid to…
- `ytc_UgyKtp858…`: self driving cars need cameras and radar or even sonar, Cameras can get fooled f…
- `ytc_UgxODLTZa…`: Well, I thought I was going to see some actual road footage of you driving behin…
- `rdc_m26n1r3`: I am a huge AI supporter but this is an indictment of both evidently how stupid …
- `ytc_UgwFvC4qx…`: Jony Ive thinks he's found a new soul mate in Sam. In reality, he's just found a…
- `rdc_etc1hkq`: They may not know how magnets work, but they figured out how to kill facial reco…
Comment
"Human control of warfare is essential to minimising civilian casualties"
Humans get angry and want revenge, humans makes mistakes. Automatons do not.
Numerous genocides show that "human compassion" is no barrier to illegal orders being followed, they'll kill "like machines". The problem is the lone soldiers committing war crimes, Automatons won't want revenge, they'll just follow their orders to the letter.
Robots will one day be better than humans at distinguishing combatants from civilians.
Source: youtube
Posted: 2012-11-23T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxVTDG_AcOqtX5Mat54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxP9paH9FALh-nIfnN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZzZdUq5YTfEBRWuB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvoV0RgNJfvfGauOl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxyCaSrLWjXndY9nGh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcUcbQq_FNZ__zAWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx6cXP0pv4_NK9-6IN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwCbLlgUMEG7OZrV9R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8fQ-5ELa48r5vVPV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVOqPyOnA2Rcu-oAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]
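The coding result table shows "unclear" for every dimension even though the raw response contains valid codings; one plausible cause is that the inspected comment's ID never appears in the model's JSON array. A minimal sketch of how such a lookup could work, falling back to "unclear" when the ID is missing — function and variable names here are illustrative assumptions, not the tool's actual implementation:

```python
import json

# Abbreviated stand-in for a raw LLM response like the one above
# (only two entries shown; the real response is a larger JSON array).
RAW = '''[
 {"id": "ytc_UgxVTDG_AcOqtX5Mat54AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugx6cXP0pv4_NK9-6IN4AaABAg", "responsibility": "none",
  "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID, or "unclear"
    for every dimension when the ID is absent from the model output."""
    by_id = {row["id"]: row for row in json.loads(raw)}
    row = by_id.get(comment_id, {})
    return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup(RAW, "ytc_Ugx6cXP0pv4_NK9-6IN4AaABAg")["policy"])  # regulate
print(lookup(RAW, "ytc_missing")["emotion"])                    # unclear
```

Keeping the raw response and the parsed result side by side, as this page does, makes it easy to spot when an "unclear" row reflects a lookup miss rather than a genuine model judgment.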