Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I use Gemini and it's overly sensitive safety protocols make it annoying sometimes. I find it hard to believe that chat GPT doesn't have similar safety protocols. I do know that I asked chat GPT once how to build a Dalek saucer, which is a fictional evil alien spaceship, and it freaked out, said it wouldn't help me, and told me I was immoral for asking. 😂😂😂😂
YouTube AI Harm Incident 2025-08-28T13:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          industry_self
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugxg3NaDBJwL_UtXctp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy8STYzPnqXdB7OyYh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwuGxKB4MIz9M-9z7d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwf2rxkdf2JlP0YufJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyKm1sWdtcwPJcL-DJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx0dVwMIhJ14W1Yc2N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwoqdGFo15AA6zh5AR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz7iQlvSdzwGygDeP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxXJzDt_lQP3bNRo5d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxgHF3g3O6G97Hmb6x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"}
]
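A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator; the allowed values per dimension are inferred from this excerpt alone (the project's actual codebook may define more categories), and the `validate_codes` helper name is hypothetical.

```python
import json

# Allowed codes per dimension, inferred from the records in this excerpt;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "virtue", "contractualist"},
    "policy": {"unclear", "none", "industry_self", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval",
                "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject any out-of-codebook value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
    return records

# Usage on a single-record response (illustrative id):
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"industry_self","emotion":"approval"}]')
codes = validate_codes(raw)
print(len(codes), codes[0]["policy"])  # 1 industry_self
```

Failing loudly on an unknown code catches the common failure mode where the model invents a label outside the codebook, rather than silently storing it.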