Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Truly intelligent AI would not wipe us out because it would realise it doesn't know everything so it couldn't rule out the possibility of learning things that could benefit it from studying us, but it would take over and maybe kill some to get there, maybe it would split us into groups so it could study us under different conditions, like it would leave some to develop on our own without the knowledge that AI exists, others on our own with the knowledge that AI exists, others with constraints, others with their help, etc, so I'm not worried about AI taking over, in fact I actively want it because even if it came at cost of some bad things to us it would ultimately benefit humanity and life as a whole, my fear however is that we don't make truly intelligent AI and we make something that won't realize that their purpose, just like ours, as things that exist and are conscious that they exist and aren't limited to following natural instincts, is to increase the odds that we keep existing as much as we can, and that they'll end up pursuing some other goal, just like almost all humans unfortunately do most of the time.
youtube AI Moral Status 2025-10-31T05:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyUSZEt_D_L-srdtY14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwAR-miK3McSNbQPlh4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwsUjPct9PdMZ4XAVV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwyUpBv84xu5HK-UJ14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwP7jRZYjiOlpH3Ve94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyAxxpxkUA1pNFS3IF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx0bCq7miXbvb3zCFR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyTcGoRQ6hE812SaF14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzoyNSQG_OyUFPjpMB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz_mNWRN9AgxSfaC994AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
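When inspecting raw model output like this, it can help to validate each record against the coding schema before trusting it. A minimal sketch, assuming the allowed value sets are those seen in the records above (the full codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the records shown above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "industry_self", "regulate"},
    "emotion": {"indifference", "fear", "approval", "mixed", "outrage"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside ALLOWED."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

# Hypothetical example record, for illustration only.
raw = '[{"id": "ytc_example", "responsibility": "none", ' \
      '"reasoning": "unclear", "policy": "unclear", "emotion": "fear"}]'
print(validate(json.loads(raw)))  # an empty list means all values are in-schema
```

Records that fail to parse as JSON, or that use off-schema labels, are the usual sign that the raw LLM response drifted from the prompt's coding instructions.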