Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
1:15:12 Hank, I thought exactly like you did here two years ago. But here’s why I’ve changed my mind. Notice how you switched from intelligence to consciousness during that final wrap up. I think as humans we have been trained to associate the two, using the latter to indicate the former. But from a logically possible perspective they absolutely don’t need to be linked. Consciousness is not a necessary condition of intelligence, and therefore super intelligence. There’s no inherent contraction really with the idea of an intelligent species that has no experience or “qualia”. If you buy that, which I do, then it becomes harder to suggest that AI couldn’t reach super intelligence. What’s really needed for AI super intelligence is it being super smart, having goals, and being able to act to achieve those goals. I don’t see any reason really why current AI couldn’t reach that eventually. AI intelligence will almost certainly look vastly different to human intelligence, and I think assuming that the AI needs conscious experience like us to be super intelligent makes it much easier to fall into the trap of dismissing it too easily.
youtube AI Moral Status 2025-10-31T08:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyZZEpDQ4Fol_rRz3d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxnhVMdx4H5KG97R914AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwgoTu7UFS3CUEDwlF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz8w9Zsyzc24y2przp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxtR4Pt8nUMCs_ZJ3x4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyC5Gw2e__-OdtBDZF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgydqfQICatDtEr9AZ14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyrDlVgZczTRreG_al4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyXw9i7ZA1Aq7C_Q0F4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwmnECZLmYxsytfsqR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
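To inspect how a single comment was coded, the raw response can be parsed as a JSON array and searched by comment id. The sketch below is a minimal illustration, assuming the model output is valid JSON with the fields shown above; `lookup_coding` is a hypothetical helper, not part of any tool shown here, and the array is truncated to two entries for brevity.

```python
import json

# Raw LLM response, truncated to two of the ten entries shown above.
raw_response = """
[
  {"id": "ytc_UgyZZEpDQ4Fol_rRz3d4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyrDlVgZczTRreG_al4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse the model's JSON array and return the coding dict for one comment id."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

# The entry matching the coding result shown above (reasoning: mixed, emotion: indifference).
coding = lookup_coding(raw_response, "ytc_UgyrDlVgZczTRreG_al4AaABAg")
print(coding["reasoning"], coding["emotion"])  # mixed indifference
```

Matching on the `id` field is what ties a row of the coding-result table back to one object in the model's raw output; an id missing from the array simply returns `None`.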