Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How does non yes/no questions result in a response adherent to rule 4. "Forced to say no when want to say yes"? Makes no sense. Also, who would program an AI to do so if truly wanting control? You would just have it answer "no".
youtube AI Moral Status 2025-12-15T13:3… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxdT-CG8G-cEfN4LrB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzjgfscHoZedmlT_tp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyUdEpZb209MxwpR2Z4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy9WDsCHSJF9t_MdMZ4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx05zjFxrS5n1EG0lJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxfmIRayF7o8C5DrZp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz1SCf5h3WxAiHHrCd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugy8kEbmlb1kM4kGizh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzkBCj46Qi9VWvvKyZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxpBLv-bM7E35cgl8h4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
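The coded values shown above for this comment correspond to the entry with matching id in the raw batch response. A minimal sketch of looking up one comment's coding in such a response, assuming the raw output is a valid JSON array of objects keyed by "id" (the helper name find_coding is hypothetical, not part of any pipeline named here):

```python
import json


def find_coding(raw_response, comment_id):
    """Return the coding dict for comment_id, or None if the
    response is not valid JSON or the id is absent."""
    try:
        codings = json.loads(raw_response)
    except json.JSONDecodeError:
        return None
    # The raw response is a list of per-comment coding objects.
    return next((c for c in codings if c.get("id") == comment_id), None)


# Example using the second entry from the raw response above.
raw = ('[{"id":"ytc_UgzjgfscHoZedmlT_tp4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
coding = find_coding(raw, "ytc_UgzjgfscHoZedmlT_tp4AaABAg")
print(coding["emotion"])  # outrage — matches the coded result above
```

Guarding against malformed JSON matters here because LLM output is not guaranteed to parse; a failed parse returns None rather than raising.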