Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
He asked the question than told it how to answer. That’s bogus ah. These things want to destroy humans. Dude begged the robot to say no.
youtube AI Moral Status 2024-01-23T16:3…
Coding Result
Responsibility: developer
Reasoning: deontological
Policy: unclear
Emotion: outrage
Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
 {"id":"ytc_UgzbQSW-h2kiPA92q4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugz8ZTiJIUrKKiZYXLl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxWfc9AscpK6WWCar94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxXr5r2ogWYtVioOZd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxaTJLTA1d-LUjMds54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
 {"id":"ytc_UgwzPbe0c370QYn1z3R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwgrF5OQZ_BvAee4xd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"unclear"},
 {"id":"ytc_UgzDyiTPmBHtvWPzlsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugxu31VCNLjOmjBfM_Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugwcb9JuBJzTkRns21l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
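As a minimal sketch (not part of the original tool), the raw response above can be parsed and indexed by comment id to look up any coded comment. The dimension names are taken from the response itself; the `index_codings` helper and the two-record sample are illustrative assumptions, not the tool's actual code:

```python
import json

# Two records copied from the raw LLM response above, for brevity.
raw = '''[
 {"id":"ytc_UgzbQSW-h2kiPA92q4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgwzPbe0c370QYn1z3R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]'''

# The four coding dimensions that appear in every record of the response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Map each comment id to its coded dimensions, dropping the id key."""
    records = json.loads(raw_json)
    return {r["id"]: {d: r[d] for d in DIMENSIONS} for r in records}

codings = index_codings(raw)
# Look up the comment whose coding result is shown in the table above.
coded = codings["ytc_UgwzPbe0c370QYn1z3R4AaABAg"]
print(coded)
```

Looking up `ytc_UgwzPbe0c370QYn1z3R4AaABAg` recovers the same values as the Coding Result table (developer / deontological / unclear / outrage), which is a quick way to confirm that the displayed result was taken from this raw response.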