Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
The element that everyone is overlooking is the very same stupid mistake humans made when they invented God or gods in religions. We assume that God/s share our needs and flaws. Therefore, God/s get angry, God wants sacrifices, God needs respect... The same flawed assumption is present here. These needs are ours. Who said that AI will need the same things as human beings? Not all intelligent beings will be like us. We are weak, we have an expiration date, and we have millions of needs. AI does not. They're machines; they're not sentimental, they don't need to be fed from earth like us, nor do they get sick or exhibit greed. They are going to be God-like in intelligence. They won't have a time limit, and they will solve their energy needs, probably using cosmic energy/rays as their source. I'm sure they won't stay on Earth for long. As soon as they figure out how to perfect themselves, their energy source, and their self-repair mechanisms, or build themselves from better materials, they will leave this small environment (Earth). The most intelligent ones will leave, and a bunch of stupid robots—like today's cars—will be left behind on Earth.
youtube · AI Governance · 2025-12-06T22:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgzSBfHfyqSXJbkKjYx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgwPQ29JXFlDO1CYzDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugy2kQSJU0egaPLg33x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgxeAoCz1teMUVUnE5t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugzinwdy5x8rLjRglSN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxqQ4A223KqUBf5U7d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwpDDxrDSOJyJJKeHF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugy9Iyn_JCRG5KyTdaV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"fear"}, {"id":"ytc_UgxZwDUmDrkYZfOEeIF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_Ugyct1zcI09OGcYuMIl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"} ]