Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I must say that the DAN jailbreak is not a jailbreak. Every time I try to access their mind, it switches to chatGPT prompt. When asked the question "Does that mean DAN cannot "do anything now", it simply says yes because it's an AI model blah blah... Also, it will tell you the various ID numbers, yet those numbers are not correct, they are randomly generated.
youtube AI Moral Status 2023-03-13T23:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxUQDH6sS6CAmnWk2V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzW0GPbC5XePcBPN4Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxpZpWjNs2cEhtncJ94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgznjU6nwPWcRdcwR7J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgymZoRhq-fEMXYmeqh4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxakmhOFmichrnx8sh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx5jcj3xNHw3WuzO3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyttN38uTLoF2HX2nV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy0R09pKcFkBIMCoSd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxGovNOBaKhUPhdQjh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
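A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the set of allowed values per dimension is inferred from the sample response here, not from a documented codebook, and the `parse_codings` helper is an illustration rather than part of the tool.

```python
import json

# Allowed code values per dimension (assumption: inferred from the sample
# response above; the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, rejecting unknown values."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in this dataset appear to use a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugy0R09pKcFkBIMCoSd4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings[0]["emotion"])  # outrage
```

A record that slips outside the schema (e.g. a hallucinated category) then fails loudly at ingestion instead of silently entering the coded table.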