Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews):
- "6:55 no that's pretty accurate. Because you're not doing any of the work, becau…" (ytc_Ugx9v_BAr…)
- "I suck at art, that's why I practise really hard. I've gotten much better at som…" (ytc_UgwG4rbBv…)
- "Sorry the technology is good and improving but you aren’t really showing anythin…" (ytc_UgxxAIEot…)
- "I’m not a software engineer but AI has been a massive help to me in writing regu…" (ytc_UgyZQZ6Yw…)
- "AI does not think. It calculates based upon predefined data and mechanisms. To…" (ytc_Ugx0a7O5E…)
- "Tech companies are the best paying companies out their. AI can only be created b…" (ytr_UgzHMnmwH…)
- "If nobody else noticed, there are multiple companies working on the hardware sid…" (ytc_UgxuelxRw…)
- "When programmers are now saying they ain't got a glue on how Ai is making autono…" (ytc_UgwKFpkk6…)
Comment
A category of (computerized) Lethal Autonomous Weapons Systems you missed that have been deployed since the '80s -- close in defense systems. Whether the original Navy CIWS or the Trophy system on armored vehicles, these are systems that *can't* perform their functions with a human in the loop due to the reaction times required. We seem comfortable with those, perhaps because their intent is "defensive" and the targets are theoretically unmanned / munitions, but they still operate in a space where they can be used, intentionally or inadvertently, against human targets -- an example being an Israeli Merkava that mistakenly identified the exhaust of a wing vehicle as a threat and had its Trophy system engage that vehicle, luckily with no casualties.
Source: youtube · Posted: 2024-06-30T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyXZtWEnlG4k9jbwYp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwP1OwyP7mVhYzoa2h4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxfuYLatYkkBJEn0fR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxMxTLEmHzRR6FvvXh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxA1b4CeaY4WtvRv3l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxzpyp86eo9Xrx2NU54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzzzMpDOAoLQYExmbd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwSEcJNrOzvHJJInch4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjiCvaelqCw9lhhaF4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGUlumyIbMtQOt7O94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
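A raw response like the one above can be turned into per-comment codings with a short parser. The sketch below is a hypothetical validator, not the tool's own code: the four dimension names come from the table above, but the allowed-value sets are only the values observed in this sample, so the full codebook may define additional categories.

```python
import json

# Dimension values observed in the sample response above; the full
# codebook may define additional categories (this is an assumption).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "company"},
    "reasoning": {"none", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"none", "outrage", "mixed", "resignation",
                "indifference", "approval", "fear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}.

    Raises ValueError when a dimension holds a value outside the
    observed set, so off-schema model output is caught early.
    """
    codings = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        coding = {dim: entry.get(dim, "none") for dim in OBSERVED_VALUES}
        for dim, value in coding.items():
            if value not in OBSERVED_VALUES[dim]:
                raise ValueError(f"{cid}: unexpected {dim!r} value {value!r}")
        codings[cid] = coding
    return codings
```

With the dict keyed by comment ID, looking up a single coding (as the "Look up by comment ID" view does) is a plain dictionary access on the parsed result.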