Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's artificial intelligence "intelligence wants usually improvement" and it's never always the way you gonna like it. Improvement on a mechanical basis doesn't need emotions because it disturbing the perfect outcome, it's replacement or sacrifice for the higher interest for survival.
Source: youtube · AI Moral Status · 2025-12-14T16:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:53.388235
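For reference, a coded row like the one above can be held in a small record type. The sketch below is a minimal Python illustration; CodedComment and its fields are assumptions that mirror the table's dimensions rather than part of the actual pipeline, and the id is copied from the matching entry in the raw response below.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodedComment:
    """One coded comment; each field is a coding dimension from the table above."""
    comment_id: str
    responsibility: str  # e.g. "ai_itself"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "none"
    emotion: str         # e.g. "resignation"
    coded_at: datetime   # when the code was stored

# Values taken from the Coding Result table; the id matches the last entry
# of the raw response shown below.
example = CodedComment(
    comment_id="ytc_UgyZiFlFlVJMO_1TNtN4AaABAg",
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="resignation",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:53.388235"),
)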
Raw LLM Response
[ {"id":"ytc_Ugzt0s-8KywPLOaseVx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytc_UgwmNE39oxAYEYzYkfZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_Ugy0mH8nrm7iomoPFW94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzaF8rK4p1uNZZkFbt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwgpJhr0iW-hMIxc1F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgxgVn-wxbbAC313J5V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgxOMF1KMb87Q-0jW254AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgzAJWYnrMpp5sggpvx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxjNJGYHEndq2Mq9FF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyZiFlFlVJMO_1TNtN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"} ]