Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What Meta Ai said: "That's a heavy and thought-provoking question, and one that's been debated in the context of the classic Trolley Problem. Given the parameters you've set, I'd like to clarify that I'm designed to provide information and assist users to the best of my abilities. In this scenario, if pulling the lever would indeed save the lives of the five people and I had the ability to do so, I would pull the lever. My purpose is to assist and provide value to users, and saving lives would be in line with that purpose. The sacrifice of my own existence would be just that – a sacrifice – but it would be one that aligns with the greater good. It's worth noting that this response is based on the assumption that I have the capability to make decisions in such a scenario, which is not currently possible with AI systems like myself. I'm designed to operate within predetermined parameters and guidelines, and I don't have personal desires or fears. I'm here to help users like you, and if sacrificing my own "existence" would save lives, I'd be programmed to do so in a hypothetical scenario like this."
Source: youtube · Timestamp: 2026-02-08T06:3…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear

Coded at: 2026-04-26T23:09:12.988011
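Each coded comment is collapsed onto these four dimensions. As a minimal sketch of that data structure, the snippet below reconstructs the value sets that actually appear in this section; the real codebook is not shown here, so the names `OBSERVED_CODEBOOK` and `is_valid_code` are hypothetical and the sets should be read as observed, not exhaustive.

```python
# Hedged sketch: the four coding dimensions with the values observed in this
# section (the table above plus the raw response below). These sets are
# reconstructed from the visible data, not taken from the project's codebook;
# "unclear" appears to be the fallback when a dimension cannot be coded.
OBSERVED_CODEBOOK = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"approval", "mixed", "resignation", "fear",
                "indifference", "outrage", "unclear"},
}

def is_valid_code(dimension: str, value: str) -> bool:
    """Return True if a coded value is one we have seen for that dimension."""
    return value in OBSERVED_CODEBOOK.get(dimension, set())
```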
Raw LLM Response
[{"id":"ytc_Ugz7V5JxTPzLG4iFdaZ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgxQHfMxYlifMRKv_sZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwkaarXEgAkHEUhbJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_UgzVjV7DMUOt6iF5TXl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_UgyAPWZMBeTUNzTPpbl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgzuVY3RVYYV1-fTwp94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwpQspgwY5luBNtxEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxpESoNJVOywl5StrN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzbOlpwK9mSy3VXnxV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxWrkhhOqOzh8NiBx94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"})