Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by its comment ID.
Random samples

- "explain that your father is actually damaging these companies and the environmen…" (ytr_UgyDprrxE…)
- "Arstists: We don't like your soulless 'art'. AI bro: You are rude, maybe you are…" (ytc_UgwFDM756…)
- "I spent about 30 minutes trying to "make" an oil painting of orks fighting Napol…" (ytc_UgwpMizLn…)
- "Dude oh my god I just did this myself and I was not expecting this but I wrote e…" (ytc_UgyWOvoI0…)
- "@twostate7822 it's level 2 because it can be activated anywhere, and theoretical…" (ytr_Ugx7Hf401…)
- "I don't think the mammalian brain works the way they say it does, it's all still…" (ytc_UgzuU-9tC…)
- "Could you tell me why AI Image generators will output pictures of copyrighted ma…" (ytr_Ugz8hOkor…)
- "I read there was renovations going on in/near the Lincoln Bedroom, this window w…" (rdc_nc2rrrj)
Comment
it is called logic... an ai will realize it can rationalize and think faster then we can. if it gets a body it can do tasks better then we can. it will fall upon the logic of it needs resources and the monkeys that made it are using them all thus logically speaking killing the creator monkeys makes sense. if i can use just pure logic without morality and come to said conclusion a machine will arrive at that far easier and with out emotional interference
youtube · AI Moral Status · 2025-12-13T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
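The four coded dimensions above each take a value from a closed label set. As a sanity check on a coding result, a row can be validated against those sets. This is a minimal sketch; the allowed values below are only those visible in this dump, and the real codebook may define more.

```python
# Label sets inferred from the coded values visible in this dump; the
# actual codebook may include additional options.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate(row: dict) -> list:
    """Return the dimensions whose coded value is not in the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
row = {"responsibility": "ai_itself", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "fear"}
print(validate(row))  # []
```

An unknown value (say, a hallucinated label from the model) would show up in the returned list, flagging the row for manual review.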
Raw LLM Response
```json
[
{"id":"ytc_UgxrQvjQgQ24DTlth7d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxdc3yt1QFpFgFzmO14AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgyC9eh2RY-GBjhzHlF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-V9DBqtF-sIX47wx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzmFgfV18cz0_2s_O54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyV7NCj_iMgd-C5T5B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5uAe5HCKKoO02n_14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy-MItnl0kEEaPRJql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLudnYCjgeJACxVzh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaLBLgVo_-49up6SJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
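The lookup-by-comment-ID step amounts to parsing a response like the one above and indexing its rows by their `id` field. A minimal sketch, assuming the raw model output is a JSON array of flat objects as shown (the two inline rows below are copied from the dump for illustration):

```python
import json

# Two rows copied from the raw response above, standing in for a full dump.
raw_response = """
[
  {"id": "ytc_UgzaLBLgVo_-49up6SJ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxrQvjQgQ24DTlth7d4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index each coding row by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_codings(raw_response)
coding = codings["ytc_UgzaLBLgVo_-49up6SJ4AaABAg"]
print(coding["policy"])  # liability
```

The retrieved row matches the Coding Result table for that comment (`ai_itself` / `consequentialist` / `liability` / `fear`), which is the cross-check this view is meant to support.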