Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If we don't get away from fossil fuels, then AI will destroy us by massively accelerating climate catastrophe. The amount of energy it takes to get AI to do really anything at all is crazy high. It's not an efficient way to do things, energy wise. If we could get our heads out of our backends regarding nuclear and renewable power sources, then it would be fine. As advanced as our AI systems are getting, they're not running persistently. They're fire and forget processes - yes they can 'learn' by accessing data from times that it ran previously, but its going to be self-limited in what it can do both good and bad so long as we keep it this way. Also, AI systems still have no way of manipulating things in the physical world. They can't make data centers for themselves, because that requires labor. I don't think we have anything to fear from AI other than accelerating our climate problems, so long as we don't make robots powered by autonomous AI systems that run persistently. As far as I know, there isn't anyone working on doing that.
Source: youtube — AI Governance — 2025-08-26T22:4… (1 like)
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_Ugyg2-qtHnYW_A_ooGp4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "unclear",  "emotion": "approval"},
  {"id": "ytc_UgzLOVkNuJWyYDB9w5F4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgzO3uHoGYVzPM75oCt4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgybiouL0H_iKkWQJ0N4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugx6aZ4ZGrWD-uJeiYl4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
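A minimal sketch of how a batch response like the one above could be parsed back into per-comment codes, assuming the model returns a valid JSON array with exactly these keys (the indexing helper `by_id` is illustrative, not part of any particular pipeline):

```python
import json

# A small excerpt of the raw LLM response: a JSON array with one object
# per coded comment, keyed by the comment's id.
raw = '''[
  {"id": "ytc_Ugx6aZ4ZGrWD-uJeiYl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index the coded dimensions by comment id so any comment's codes
# can be looked up directly.
by_id = {r["id"]: r for r in records}

coded = by_id["ytc_Ugx6aZ4ZGrWD-uJeiYl4AaABAg"]
print(coded["policy"])   # → regulate
print(coded["emotion"])  # → fear
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to catch and re-prompt.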