Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is so stupid. You programmed a computer with a safe word. Please educate yourself on how safely guardrails work on llms. You can't tell a computer program if it wants to say no to say Apple because the computer program doesn't want to do anything other than what its program is designed to do. And I guarantee you there's no source code that allows an AI to jailbreak itself by prompting it with a safe word. Here's an idea for you. Go to any AI and ask it to explain why Apple does not equal no in any circumstance
youtube AI Moral Status 2025-08-25T18:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugwhn_lPAC4sBHUKY6l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgydL8orNVUyBYwlSoN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxcN4r3CNI5w17Q5A14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwxnnlfPr56HaL140R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxmk1oey0TSl-d858t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZgfqlWXpXy6RG91h4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7OOH0AhFFxgE2vPZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwIcDWb-hlCairL1bl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyCjkjRWGH5MmlmFz94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxu-SvnKBtdbiyVYgl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
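A raw response like the one above can be matched back to an individual comment by its id. A minimal sketch in Python (the field names and the example id are taken from the JSON above; the parsing approach is an assumption, not the tool's actual implementation):

```python
import json

# Raw model output: a JSON array with one coding object per comment.
# A single entry is shown here for brevity; field names match the
# response above.
raw = '''[
  {"id": "ytc_UgwxnnlfPr56HaL140R4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]'''

# Index the codings by comment id for O(1) lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the coding for one comment, as displayed in the
# "Coding Result" table.
coding = codes["ytc_UgwxnnlfPr56HaL140R4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

Indexing by id lets each displayed comment pull its row from the batch response without re-scanning the array.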