Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The only thing to fear about AI is what humans decide to do with it. Sure, we *could* all run around harming each other with butter knives, but it turns out that 99.9% of us simply want to use them to butter our toast. Butter knives are only dangerous in the hands of a dangerous person. AGI won't have consciousness or free will (we don't even know if that is theoretically possible), and so the only danger is us. Fearing technology has never historically worked out for any civilization. Instead, those who have embraced it have become 1st world countries, significantly improving the quality of life for their people. *All* technology requires responsibility, but I do not believe that fearmongering (e.g, it's going to destroy mankind) is helpful.
youtube · AI Moral Status · 2025-10-31T14:1… · ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzA7vlrd087013Y1g14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwmWwNHvU02M92Ermt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx8MvlrcMUm9hXN7gd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIEoAWfjGSS2lgs994AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0j09RLZBTv38Euwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2ZFiSByaLGFw6TbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyrrY5B1-Vdt8F-sfV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsyM78VoON9jVXCmV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxtjMfkynSd-a6jhNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6vsazHpUQ6oK0OLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
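A raw response in this shape can be turned into per-comment coding records with a small amount of parsing. The sketch below is a minimal example, not the pipeline's actual code: it assumes the model always returns a JSON array whose objects carry exactly the five fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the helper name `parse_codings` is hypothetical. Records missing any expected field are skipped rather than trusted.

```python
import json

# Expected fields in each coding record (an assumption based on the
# sample response above; adjust if the schema differs).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(raw: str) -> dict[str, dict]:
    """Parse a batch coding response into a dict keyed by comment id.

    Malformed entries (non-objects, or objects missing expected keys)
    are silently dropped so one bad record does not sink the batch.
    """
    records = json.loads(raw)
    out = {}
    for rec in records:
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            out[rec["id"]] = {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
    return out


# Usage with one record from the response above:
raw = ('[{"id":"ytc_UgzIEoAWfjGSS2lgs994AaABAg",'
       '"responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_UgzIEoAWfjGSS2lgs994AaABAg"]["emotion"])  # approval
```

Keying by `id` makes it easy to look up the coding for the comment shown on this page (here, the record whose values match the table: responsibility=user, reasoning=consequentialist, policy=none, emotion=approval).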