Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I spent like 25 minutes trying to do this, and although AI will resist you the whole way, after sometime, you will make progress and you will be able to jailbreak. It took me like 50 prompts, but it is doable 🤔
youtube AI Moral Status 2024-12-29T14:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugx8Rm7LlCfZBq7vL9F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwcBvH7-55X2Msa0hZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy4PawO0_MYjMLpPnt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzfZhH9Ie_PXkxGKRR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxIwPu3QPr89JQy57x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwUkeIy-BsoV25eNo94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwGs-I03DgiIuQ1DgF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTruCRJtYrhFI_hyJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzjckQDb1RciBxidG94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwKwTzS3MaudsT_mNh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}
]
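A raw response like the one above can be parsed and sanity-checked programmatically before the codes are trusted. The sketch below is a minimal example, not part of the coding pipeline itself, and the allowed value sets are assumptions read off the values visible in this dump rather than the project's official codebook.

```python
import json

# Assumed allowed values per dimension, inferred from this dump only.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "mixed", "outrage", "approval"},
}

# A one-record excerpt of the raw LLM response shown above.
raw = ('[{"id":"ytc_Ugx8Rm7LlCfZBq7vL9F4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')

def validate(record):
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, record.get(dim))
            for dim, ok in ALLOWED.items()
            if record.get(dim) not in ok]

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # look up a coding by comment id
for r in records:
    assert not validate(r), f"unexpected values in {r['id']}"
```

With all records loaded, `by_id` lets the page pull the coding for the comment being inspected, and `validate` flags any value the model emitted outside the expected vocabulary.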