Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That AI didn’t do that out of its own volition. It was explicitly told to solve an impossible problem and to do it “at all costs”. Fcking researches clickbaiting
youtube 2024-12-15T20:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgymSciS9-4kOGe8DB94AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxvgirs5dDdgts0iKB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwH8vxiYbI7QZM52Nh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxDF3aqTeiSw2_j33l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxe_EuwLuNMIEukIB94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw6YTviQ9iI91qN4s54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2Fb4PoVBLBju5ufJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwn95Sno3IE0-HJI354AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxG8e8KV2CqJHIHpDR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzs6NGPlFBW8eGDs5p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
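The single coding result shown above is one entry pulled out of this batch response by its comment id. A minimal sketch of that lookup, assuming the response parses as a JSON array of objects (the `coding_for` helper is hypothetical; the ids and field names are taken verbatim from the response above, truncated here to two entries):

```python
import json

# Two entries copied from the raw LLM response above; field names match the source.
raw = """[
  {"id": "ytc_Ugw2Fb4PoVBLBju5ufJ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwn95Sno3IE0-HJI354AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

def coding_for(batch_json: str, comment_id: str):
    """Return the coding dict for one comment id, or None if it is absent."""
    for entry in json.loads(batch_json):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = coding_for(raw, "ytc_Ugw2Fb4PoVBLBju5ufJ4AaABAg")
print(coding["emotion"])  # -> outrage
```

Matching on the id rather than on list position keeps the lookup correct even if the model reorders or drops entries in the batch.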