Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytr_Ugypq7NaE…: "it does not realy meen that because if it knows its a robot it knows that it ca…"
- ytc_UgyJzx3KP…: "Only comforting part is, if AI turns on humanity, techbros will be first to be k…"
- ytr_UgyQJD9Dd…: "but ai "art" can be used to trick and deceive people that thought it was origina…"
- ytc_UgxNQQH7J…: "Living beings start learning experientially even prior to being born to some ext…"
- ytc_Ugyqd3oDs…: "Robots are doing really good with their AI Sophia has citizenship in Saudi Arabi…"
- ytc_UgwVIE2_o…: "ai art should be a bit unreliable it is good for poor people who cant afford it …"
- ytr_Ugy1I2Joo…: "That is precisely the problem. They should build the laws for AI advancements, e…"
- ytc_UgzlJ32_Y…: "we should feed a robot with predominantly spiritual content and use it at a sour…"
Comment

> i'm open to the ethical argument that if you're taking someones life, thats a serious choice that should be made by another human.
> But the slippery slope thinking about terminators just randomly killing humans, thats silly. There will be a lot of parameters and a big checklist required before that robot just opens fire probably. Is the person wearing a tag, markings or transponder to mark them as friendly, is it a child or woman, is he holding an object and what % likelyhood that it's a weapon

Source: youtube | Posted: 2012-11-23T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzRRl0GxDNGJww5RqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-Mt71qjARW8fpoNl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy5WByzl6rwng3nyKB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7KdxpbK9GMQZKYLt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzsJrrZbcR5sjLxwNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx3m5XSTfHbqw650rp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgywNQqgJ5GRTgvE3dt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwBf_2wiXuE-k9hY1p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwb9T7WVRGkknGKLOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyDm4-UKNOkd87RibR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
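The lookup described above ("Look up by comment ID") can be sketched in a few lines: the raw LLM response is a JSON array of per-comment codings, so indexing it by the `id` field gives direct access to any coded comment. A minimal sketch, assuming the response arrives as a JSON string in the shape shown above (`raw_response` and `index_by_id` are illustrative names, not part of any tool shown here):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, abridged from the
# "Raw LLM Response" section above (full comment IDs retained for these two).
raw_response = """
[
  {"id": "ytc_Ugy7KdxpbK9GMQZKYLt4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzRRl0GxDNGJww5RqB4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coding = codings["ytc_Ugy7KdxpbK9GMQZKYLt4AaABAg"]
print(coding["responsibility"], coding["policy"])  # -> user liability
```

The printed values match the "Coding Result" table above for the same comment (responsibility `user`, policy `liability`).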