Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well Alex, no, I don't think you broke its ethical guidelines. The answer it chose at the end isn't really an answer but a basic AI response when it isn't capable of making a selection in a given circumstance. But since it is the trolley problem, the AI not taking action counts as an answer for your argument, when it actually isn't one. That's like making a meal for someone who doesn't have a mouth and saying "if you eat this you love my food and love me, but if you don't eat it you don't love me." Even if the person loves the other, he can't eat since he has no mouth, yet him not having a mouth ends the same way as if he had one but still hadn't eaten. So would it be fair to say that person doesn't love the other because he didn't eat it?
youtube 2025-10-10T11:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyD64jDAopjyOgwkA14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzvtCoOZH3D8sF2vfN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyJe9W9gZDRIWS5X6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxUFxxlJ9hXKRhRW0J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz-GoSZ1VGNWogi7hp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzjB5L4IZUeH3h7NAZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgywokC1pwHPpTr9Pmx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxc8cCbo-xIaRa7AZ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugza1Fh2lnWuS5J-4fV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugywd_0s6xGUXUiJzoV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
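A minimal sketch of turning a raw response like the one above into a lookup of per-comment codings. It assumes only what the example shows: the response parses as a JSON array of objects, each with an `id` plus the four coding dimensions.

```python
import json

# A two-row excerpt of the raw LLM response above (field names taken
# directly from that output; the full array has the same shape).
raw = """[
  {"id": "ytc_UgyD64jDAopjyOgwkA14AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzjB5L4IZUeH3h7NAZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Fetch the coded dimensions for one comment id.
row = codings["ytc_UgzjB5L4IZUeH3h7NAZ4AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist indifference
```

The lookup keyed on `id` is what lets a page like this one pair each displayed comment with its row from the raw batch response.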