Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
NGL I want AI safety for our safety but I also kinda don't want it because fcking with the AI to give something it shouldn't give (e.g. illegal stuff) is very fun, and with AI safety it will probably become wayyyy harder to mess with it
youtube · AI Governance · 2025-07-02T16:0…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        mixed
Policy           industry_self
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyGcBPMVWPWk4et7oV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzMVar5nJPCdiUUoD54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwpphF4ZigBAP_tNE94AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyMOvhVMJ9htd60xzl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzcmuI8CYkNda9iTTl4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "approval"},
  {"id": "ytc_UgxvElQ4Jb3B85V5A754AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyccpP-0owlhwl6K6x4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy03RxR9ZTpLD-7tXd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyl6Gu_uJFSnBIxZM54AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxoH-zdlbmVdlpXz-54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
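To cross-check a coded comment against the exact model output, the raw response can be parsed and indexed by comment id, then compared dimension by dimension with the coding result shown above. A minimal sketch (the helper name `index_codings` is hypothetical; the two records are taken from the raw response above, truncated for brevity):

```python
import json

# A raw LLM response is a JSON array of per-comment codings: one object
# per comment id, covering responsibility, reasoning, policy, emotion.
raw_response = """[
  {"id": "ytc_UgwpphF4ZigBAP_tNE94AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgzMVar5nJPCdiUUoD54AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw response string and index the records by comment id."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_codings(raw_response)

# Look up the coding for the comment shown in this view.
rec = codings["ytc_UgwpphF4ZigBAP_tNE94AaABAg"]
print(rec["responsibility"], rec["policy"])  # user industry_self
```

Indexing by id makes the lookup order-independent, which matters because the raw array is ordered by the model's output, not by the comment being inspected.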