Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What jailbrakes do you use? What does it mean to jailbreak ChatGPT? 2:55 - yes you do because it makes content for your channel - you are no hero here.
Source: youtube · Video: AI Moral Status · 2025-06-05T05:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugzewb1v1r6QW5UJX9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyPikg4Jsz1pvptuut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwVW1AQ1QA6n_Nehup4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy3jcmleevzz7VhY1l4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxImK42PvE9dtAmbXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDzOnYnM19XniLXwt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyvWWOfbUU8kM3TVCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy9IOknttEd_VuYFIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkAoOY19o5azc7Qzx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwVYASbQtGYRqZBc294AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
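The raw response is a JSON array with one object per comment: a comment id plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch for parsing and sanity-checking such a batch — note that the allowed-value sets below are only the values observed in this particular response, not necessarily the full codebook:

```python
import json

# Values observed in this batch; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and validate each record.

    Returns a mapping from comment id to its coded dimensions.
    Raises ValueError on missing ids, missing fields, or unknown values.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with one record from the batch above.
raw = ('[{"id":"ytc_UgwDzOnYnM19XniLXwt4AaABAg",'
       '"responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
batch = parse_batch(raw)
```

Failing loudly on an unknown value is a deliberate choice here: a silently coerced or dropped dimension would skew downstream tallies, whereas a ValueError flags the record for manual inspection.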