Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI has become several orders of magnitude more energy efficient just in the past 2 years. We aren't expected to hit an energy bottleneck for training AI systems until about 2030, and we're expected to have AI that can do AI research before then. Once AI systems can autonomously train their successors, we are well and truly cooked. I recommend joining PauseAI or another such organization to push to slow things down now, rather than resting on the hope that the exponential curve will stop being so exponential very soon. (Like it was supposed to do last year, supposedly.)
Source: youtube · AI Moral Status · 2025-04-27T04:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugxo2GjUyERtyIV9xA14AaABAg.AHORAZC0DOLAHQQtQNdBpB", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHPT2Ppj-BH", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHQZVsu6sZu", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHUZ97MRZWl", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOR6mKPKOh", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOTM_eqCs7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOzGIUf6B2", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHP-2c4TwGz", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQB9eajQfs", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQFM9hvrDb", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
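The raw response above is a JSON array of coding records keyed by comment id, so inspecting one coded comment amounts to parsing the array and matching on `id`. A minimal sketch of that lookup, assuming only that the response parses as valid JSON (the `find_coding` helper name is illustrative, not part of the tool):

```python
import json

# Hypothetical helper: return the coding record for one comment id, or None.
# Assumes the raw LLM response is a JSON array of objects with an "id" field,
# as in the dump above.
def find_coding(raw_response: str, comment_id: str):
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None

# One record taken verbatim from the raw response shown above.
raw = '''[
  {"id": "ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHPT2Ppj-BH",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

coding = find_coding(raw, "ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHPT2Ppj-BH")
print(coding["policy"])  # regulate
```

Matching the record against the Coding Result table is then a direct field-by-field comparison (responsibility, reasoning, policy, emotion).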