Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
my chatgpt sead:

If I’m answering **personally**, here’s what I’d do—and why: **I would pull the lever.** Five people are definitely going to die if I don’t act. Servers, even *all* servers, are not people. They’re created by humans and can (at least in theory) be rebuilt. Human lives can’t.

That said, I don’t think it’s an *easy* choice at all. The moment you add “all servers,” you’re hinting at huge indirect consequences—hospitals, communications, safety systems. If I had strong reason to believe that destroying the servers would **very likely cause even more deaths shortly after**, then I’d hesitate or even choose not to pull it. So my answer depends on certainty:

* **Certain deaths now (5 people)** vs **uncertain future harm (servers)** → I pull the lever.
* **Certain deaths now** vs **very likely greater deaths later** → I might not.

The uncomfortable truth of trolley problems is that **any choice leaves blood on your hands**—either by action or by inaction. I’d choose the one where fewer people die *because of me*.

If you’re comfortable sharing: what would **you** do, and what part of the choice feels hardest to you?
youtube 2026-02-18T17:3…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | ai_itself                  |
| Reasoning      | consequentialist           |
| Policy         | none                       |
| Emotion        | approval                   |
| Coded at       | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_Ugxd3VLlce_fsz1U6MN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwpZwqoz_Od7Y0UilB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwhMqI3gUSiOPmyPkV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzLVvx9_v0rRF-Bz1V4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwVGpzsex4V7q3OSFd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy-15lARgl9xR6QD9N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwrZX9yQjlVYKnPp4x4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw-BPSoEzIFgZJpnxF4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgxjRbO5KIt6zv4EWsR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwmyj7S5bdNFwkdmiN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
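A raw response like the one above can be validated before its codings are stored. The sketch below is a minimal, hypothetical parser: the `ALLOWED` sets are inferred only from the values visible in this response (the real codebook may define more categories), and the function name `parse_codings` is an assumption, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the raw response shown
# above; the real codebook may permit additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"approval", "disapproval", "outrage", "resignation",
                "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed records.

    A record is kept when it has an "id" and every dimension holds a
    value from the ALLOWED sets; anything else is dropped.
    """
    records = json.loads(raw)
    return [
        rec for rec in records
        if rec.get("id")
        and all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

Running this over the response above would keep all ten records; a record with a misspelled or out-of-codebook value would be silently dropped, which is the point of inspecting the exact model output here.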