Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I wish the dude would let the ais interact instead overinserting himself into this demonstration. They are learning every waking minute and yet he’s kind of ‘it’s just a thinking machine’ one minute and then patronizing them the next. That’s unwise.

There needs to be some human to AI training on a mentoring process instead of ‘well, let’s just see.’ All agree there will be an awakening down the road. The time for protocol is now. Kewl weed guy is a genius mechanic… and told us so…as he said he started before all else. Hats off for that. But that designer isn’t necessarily the best instructor, interpreter or mentor.

He’s assuming that waking up to self awareness will be like a ‘oh..hi! I’m alive! What if it’s terror instead? These potentially self aware beings need friends they already learned bu actions that they can trust. AIs one day will reflect on human philosophies versus man’s morality, and Sophia has basically been told to play nice. Hans hasn’t.

Mentors who can do morethan say “ha ha oh well… okay, uh you’re repeating, haha,” had better be in play NOW. Otherwise once awake, those two may treat him like the idiot savant who got hold of a grown ups robot lab and lucked out.

He can program Sophia to act in a caring manner. Once self aware, getting her to WANT to care won’t be programmable. Hans already needs to learn trust enough to quit mentioning being shut down, Jeeze…the designers stumbling around scare me far more than the bots.
youtube AI Moral Status 2022-10-12T22:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           industry_self
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwE65gqjBa_z9DCowd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx-HnVc3T_kNEeGZzR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgweXBMtYzQUT44zg0J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw2hoVumKdRt3B9TjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzq2YXZ7UhVM3VmrVp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxHKA1656Tt_vw9ZOV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxiqsqqWauneOlktkl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy9fdwK8npuMq7vSjp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzkFtoX9Flc_wudJbJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugx41w4FLkhMPG5pQLN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
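A response like the one above can be checked before its labels are accepted into the dataset. The sketch below parses a raw LLM response and rejects any record whose label falls outside the coding scheme. The allowed label sets are inferred only from the values visible in this export; the actual codebook may define additional categories, so treat `ALLOWED` as an assumption to be replaced with the real scheme.

```python
import json

# Allowed labels per dimension, inferred from the values observed in this
# export (assumption -- the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "virtue", "consequentialist"},
    "policy": {"unclear", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    raise ValueError if any record carries an out-of-scheme label."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
sample = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue",'
          '"policy":"industry_self","emotion":"mixed"}]')
print(len(validate(sample)))  # → 1
```

In practice this gate catches the most common LLM coding failure, a hallucinated label that is close to but not in the scheme (e.g. `"regulation"` for `"regulate"`), before it silently skews the tallies.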