Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
50:15 Chuck's question about humanizing AI reminded me of this tragic story in the game Nier: Automata. A dystopian post-apocalyptic world where humans are extinct, and all that's left are robots and androids. There's a robot named Pascal; basically he read tons of math, philosophy, poetry, etc., and got really interested in understanding human nature, despite being a robot and not feeling. He creates a small village of peaceful robots, raises them as human children, and tries to teach them the concepts of love, kindness, relationships, etc. One day the village is under attack, everything is on fire, and Pascal rushes to hide the small robots inside some kind of bunker. A huge fight goes on, and Pascal, along with allies, manages to fend off the threat. When Pascal comes back to the bunker to check everyone is safe, he finds all the robot children dead. Everything is sealed, no sign of attack. Pascal immediately falls into horror and despair. Years of studying and teaching got him to this drastic ending: the robot children, who had learned from Pascal the concept of "fear", all committed suicide because they were terrified of the war going on outside. After this, Pascal asks you to destroy it and end its suffering and guilt. If you ignore it, you later find him roaming around the forest, a complete stranger who doesn't even remember you; he reset his own system and deleted all memory so he wouldn't carry the burden of the "death" of his kind, and with this he lost all humanity and all knowledge he had studied so far.
youtube AI Moral Status 2026-04-18T13:2… ♥ 2
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugwby2-7331tmB_hncR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAE6iX2q9ipgBBNOZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyzZ_HkmdMH5DjnF694AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVfIDs6HwWgTr-3SJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwy6iJmq1U3lPe8igN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz3IW0bKYzDKxx7KoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyTsPB6Pkw_eDroKvV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQaKgCqa0GUALJlj54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugye0frxSKuK4qhygnR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwO4Z0hHn3XRHaNfxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
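The raw response above is a flat JSON array of per-comment codes keyed by comment id. A minimal sketch of how such an output can be parsed and indexed for lookup, using only Python's standard `json` module (the two records below are copied from the response above; the remaining eight are omitted for brevity):

```python
import json

# Two records copied from the raw LLM response above (abridged array).
raw = """[
  {"id": "ytc_Ugwby2-7331tmB_hncR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3IW0bKYzDKxx7KoR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

codes = json.loads(raw)

# Index the codes by comment id so any single comment's coding
# can be retrieved without scanning the whole array.
by_id = {c["id"]: c for c in codes}

print(by_id["ytc_Ugz3IW0bKYzDKxx7KoR4AaABAg"]["emotion"])  # fear
```

The same pattern scales to the full ten-element response; indexing by id is what lets a per-comment page like this one pull out exactly the coding row that matches the displayed comment.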