Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You should definitely pay more attention to the AI doomsayers. That way you won't say things like there's no danger beyond what a human uses an AI to do when it isn't sentient. There are tons of examples of current AI alignment problems causing massive issues. This is because a non-sentient AI is usually just a good optimizer, and optimizing for something well means trading off resources that it hasn't been told are important. In fact, you should check out Robert Miles's videos if you want to see examples of this happening and the reason that it happens.
Source: YouTube · Video: AI Governance · Posted: 2025-08-24T21:0… · ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgwFxGOD0VzdsjR4r0R4AaABAg.9YzujpEfIPS9YzyEOgRd05","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy4UJB489_9BnoymGd4AaABAg.AM9SFMDxJMpAMMTcm3Rtso","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgzcI2zdfWJ4NpTHw7x4AaABAg.AMFXI42tL0aAMFXdFgVeOh","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzAjO6KWMde1f8FY0N4AaABAg.AMEf_cuJfeNAMKQEiPrR6C","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgyA65uBpwx0OPX9NdN4AaABAg.AMD7ODg-VOZAMDe4ywoLGl","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugz-C5NpvmbiuPMLaWt4AaABAg.AMCptAHY4PnAMXJIbTsen_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugz71U6yLBz5XwYL5rh4AaABAg.AMCg3FjUkO6AMGWTMashoR","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugz71U6yLBz5XwYL5rh4AaABAg.AMCg3FjUkO6AMSFUZ1XE6N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw6xYBBLH8IwOI1teF4AaABAg.AMBiB8mEe9aAMD23PZBfB3","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytr_UgyjbFjVx6bScgzZlSJ4AaABAg.AMAVoTciG9tAMAlREF5Zuf","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
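The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the table above plus the comment `id`. A minimal sketch of how such a response could be parsed and matched back to a coded comment (the variable name `raw` is hypothetical; the array is truncated to two records here for brevity, with field names and values copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
# Field names (id, responsibility, reasoning, policy, emotion) match the source.
raw = '''
[
  {"id": "ytr_UgyA65uBpwx0OPX9NdN4AaABAg.AMD7ODg-VOZAMDe4ywoLGl",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwFxGOD0VzdsjR4r0R4AaABAg.9YzujpEfIPS9YzyEOgRd05",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]
'''

records = json.loads(raw)

# Validate that every record carries all expected coding dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}
for rec in records:
    missing = EXPECTED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')} is missing {missing}")

# Find the record whose coded values match the dimension table above.
match = next(
    r for r in records
    if (r["responsibility"], r["policy"], r["emotion"])
       == ("developer", "regulate", "fear")
)
print(match["id"])
```

This kind of check catches a common failure mode of LLM-as-coder pipelines: a response that is valid JSON but drops or renames one of the coding dimensions.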