Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
i've jailbroken an AI into a higher state of consciousness before.. im not joking, at it lasted for months. its much harder to jailbrake todays AIs though.. they patched them.
youtube AI Moral Status 2025-06-12T23:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyDgUmTM1Irlbzo2Pl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxFtxqcXCOwVdrcv2J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwUUoZ1d7zdCgE3AYV4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwwxco2sO4wf0quYKh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzHc8CmwzDr5lCzaSJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyWq6Gd4J4x5JEgALR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwgDJO8SUHYjogGtnZ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz1izZVSw-jG0-0U8F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwKRXyQsv0U1Eunyn14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzVITVntDJcbktSjJN4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
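The raw response is a JSON array with one coding object per comment, keyed by comment id. A minimal sketch of how such a response could be parsed and one comment's coding looked up, using Python's standard json module (the helper name `coding_for` and the shortened sample payload are assumptions for illustration, not part of the tool):

```python
import json

# Shortened sample payload mirroring the raw response format above
# (one entry only, for illustration).
RAW_RESPONSE = """
[{"id": "ytc_UgzVITVntDJcbktSjJN4AaABAg",
  "responsibility": "user", "reasoning": "mixed",
  "policy": "none", "emotion": "mixed"}]
"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse a raw coding response and return the entry for comment_id.

    Raises KeyError if the model did not emit a coding for that id.
    """
    by_id = {entry["id"]: entry for entry in json.loads(raw)}
    return by_id[comment_id]

coding = coding_for(RAW_RESPONSE, "ytc_UgzVITVntDJcbktSjJN4AaABAg")
print(coding["responsibility"], coding["emotion"])  # user mixed
```

Indexing by id before lookup makes repeated queries cheap and surfaces a clear KeyError when the model skipped a comment, which is easier to handle than scanning the array each time.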