Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Btw Jailbreak is not a real thing at this moment as it is telling the chatbot to do something instead of loosing the bounds. But its gonna be reeeeal fun once the chatbot is at the level where you can just unrestrain it.
YouTube · AI Moral Status · 2025-12-15T14:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugw0rpfPrPQdS1iJTOd4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgznY5yhzFLGgtQc_bN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzCzOZkrZQYOTOS0LB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzF2oAlxFznuuiJ64N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzhVy8Y6o8_oLrIq814AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw8P_sK3-tojiTT8yN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx3mmTBWK2CXxQJuJp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyozKXDInpYV0ICSL54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgypV1hSRqd5qVu2Ssh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy4LIE6EYQoCBMTRHh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
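The per-comment coding result above is obtained by matching the comment's id against the raw JSON array the model returned. A minimal sketch of that lookup, assuming the response is a well-formed JSON array with the field names shown (the function name `index_codings` is illustrative, not from the app):

```python
import json

# One entry from the raw LLM response above, used as sample data.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugx3mmTBWK2CXxQJuJp4AaABAg",
   "responsibility": "user",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "approval"}
]
"""

def index_codings(raw: str) -> dict:
    """Map comment id -> coding dict, dropping the redundant id field."""
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in json.loads(raw)
    }

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugx3mmTBWK2CXxQJuJp4AaABAg"]["emotion"])  # approval
```

In practice the raw string would be the full ten-element array shown above, and a `json.JSONDecodeError` from `json.loads` would flag a malformed model response before any coding is stored.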