Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What does it mean to "jailbreak" an AI system, and how is that possible? Aren't these systems constantly controlled by those who created it? I'm confused. Maybe I'll ask AI. 😂
Source: YouTube · Video: AI Moral Status · 2025-06-08T19:1… · ♥ 9
Coding Result
Dimension: Value
Responsibility: developer
Reasoning: unclear
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugx2xMZS_lRbgwGt41p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxqkIBT3BNtCKy331l4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwyhpZWj_iuQmvgMex4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzx0bRSuxlj4UfH-5F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx-Bm2Ok6M4eLOeHT54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzHNo2WAIolUbeo0oJ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx9fAJ7y90CshbfSRt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyLgGmQG3A1rSuvw3t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxmjGprV32Fw4Objg54AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzXWNs44mmrg-OrSnZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
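A minimal sketch of how such a raw response can be inspected programmatically: parse the JSON array, index the codings by comment id, and look up or tally individual dimensions. The two records below are copied verbatim from the raw response above; the variable names are illustrative, not part of any pipeline shown here.

```python
import json
from collections import Counter

# Subset of the raw LLM response: one JSON object per coded comment.
raw = '''[
  {"id": "ytc_UgyLgGmQG3A1rSuvw3t4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzXWNs44mmrg-OrSnZ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index codings by comment id for direct lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

# Retrieve the exact model output for one coded comment.
coding = codings["ytc_UgzXWNs44mmrg-OrSnZ4AaABAg"]
print(coding["responsibility"], coding["policy"])  # distributed regulate

# Tally a single dimension across the batch.
print(Counter(rec["emotion"] for rec in codings.values()))
```

Keying on `id` makes it easy to cross-check a displayed coding (like the table above) against the raw response it came from.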