Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’ve done a bit of research into the topic of AI superintelligence thanks to doomscrolling. I’m no expert by any means, but I just…can’t fully agree with Yudkowsky here. He’s right to be concerned: the tech companies making this stuff have shown time and time again that they don’t give a rat’s ass how much harm their tech causes, especially Sam Altman and Elon Musk. But I think Yudkowsky’s view is pretty flawed in its own ways too.

From the way he writes and talks about superintelligence, it sounds like he expects it to just pop up one day, suddenly and without any prior warning. That’s just not how it works. He said in a tweet that his arguments don’t rely on an intelligence explosion/FOOM, but it’s really hard to accept his arguments without it. If the development of superintelligence is slow, it means we have plenty of time to catch these AIs in the act and take precautions. Even in the takeover scenario they provide in their book (to their credit, they did say this was almost certainly not how things were gonna play out), it’s hard to see how an AI could do things like build bioweapons factories, hack crypto banks, and sabotage the competition without people noticing (the scenario says that the AI doing these things is not superintelligent and is trying to manipulate humans into making it superintelligent).

So while I do agree with the overall premise, that we shouldn’t try to make superintelligence so soon without proper regulation and safety research, I can’t see how superintelligence will suddenly spawn in overnight like they suggest it will.
Source: youtube · AI Moral Status · 2025-12-06T20:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgxQtfQccEd6wNZMJod4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgzC3hjBhUyU0PlGd2B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugxji58LJykrzd0KVip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgzzgAGML7mk2Tgao9R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwLp1OM9DGWXQvgxCR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwU9XaDkAC4DPouC4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugyg6EFuaZ7tjPIrg5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxmLpVDOqoFYB2V6h94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_Ugxz9pT9Iu8JZlGhd354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwTQ2SHbyUoWMyXRtl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]