Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Cruxes: p(doom) for Max Tegmark: 90% (loss of control), 50% (extinction); p(doom) for Dean Ball: 0.1%.

Though Max used the term Tool AI (or variations) many times, they didn't actually get into Tool/Narrow AI vs General AI, so it seemed that Dean thought the benefits of AI could only come if we continue the race to AGI/ASI. He is concerned that over-regulation of the technology would risk forfeiting the benefits, and so didn't think it was worth regulating now. Dean is clearly concerned about regulation being negative in the long term unless it is done after the technology is released, when we can see what actually needs to be regulated; this is fine in his model because he's not concerned about disempowerment, takeover, or extinction.

Dean has apparently written before about how recursive improvement has been happening throughout human history: iron is turned into iron tools for mining more iron, and computers are used to make better computers. Max then points out that he has written extensively about how these have all required a human in the loop, and once you take the human out of the loop, things move quite quickly.
youtube 2025-11-21T15:3… ♥ 10
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          industry_self
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwT7kNtEnbroo-TmBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyxY8jTl7gVshqg3hl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwZtlYjAycs5EqT-l94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDyyFdGwGKiYxLHth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyLpLxuZqp_nffct6J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx7TKkEn1s5CnUk4D94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwEi5TIp-1mMNTfe4l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxsN7wp7gCzc5-vked4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwV5WUhEIWthwExu8B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzRnrgd51O3nE2NBGx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
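The raw response above is a JSON array of per-comment codings, one object per comment, with four coded dimensions plus an id. A minimal sketch of how such a response could be parsed and validated, assuming the value sets are exactly those that appear in this export (the `parse_response` helper and the `ALLOWED` sets are illustrative, not part of any official tooling):

```python
import json

# Allowed values per dimension, inferred from the codings shown in this
# export; a real pipeline would load these from its codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "industry_self", "ban", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose coded values
    fall inside the allowed sets for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

if __name__ == "__main__":
    raw = ('[{"id":"ytc_example","responsibility":"none",'
           '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
    print(parse_response(raw))  # prints the single valid row
```

Filtering (rather than raising) on out-of-schema values mirrors how coding pipelines typically handle occasional malformed LLM output: drop or re-queue the bad rows instead of failing the whole batch.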