Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@RobotTed I look at it like I would if I hired a live in nanny or housekeeper. I want you work for me but also be part of our family. Obviously they have no need for money...so hopefully they will tell me what they need in return. Maybe a roof over their head, good "health" insurance and a loving family is enough? Presumably if these are A.I's created to be commercially viable...the programing will be very specific and suitable for in home use. So things like jealousy would be removed? Or put to "sleep". You are building something from scratch....why WOULD you add all the very worst things about the human race. Then again maybe you need the slightest bit of jealousy in order to have well rounded emotions in other areas. It's not easy lol....but at the end of the day it comes down to.....what is that A.I being specifically created for....and THAT should determine how deep you go with the programing. I mean you wouldn't want to program emotions into an A.I that you are sending to defuse a bomb. In my opinion that would be cruel. You only need Logic etc for that. There would have to be like an A.I government. A board of the very best in the industry...where decisions are discussed, voted on maybe? There is LOT of gray area
youtube AI Moral Status 2020-07-08T21:0…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugzt-Lfb_2EfvY7rinl4AaABAg.9ArPf4t3FP29bKY1SdeHDN", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugz2xhIPXOBj1kYxNOx4AaABAg.9ArKsxayZeB9Az1jjXFKFR", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugx4tih0uTXtnXcnOxh4AaABAg.9Ar99wf0O0j9Az1FFP2cVV", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyN2SUKUD-w-DZlIgF4AaABAg.9Ar8f00ciHN9AxxYVHJ_hw", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwSfROM85Ux7Gs0y7F4AaABAg.9Ar8YKEa1DT9ArTlp9RE6s", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwRMrZoIrY08Mv59Ht4AaABAg.9Ar3yfdYHC59Az0NeM9-6q", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzBiXSrcyTbnr9Hp7h4AaABAg.9AqzMBmxTH69ArjkihP4b2", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugz2BJmqXzosAj8mDTJ4AaABAg.9Aqs7jVwqar9AroPdxwi3U", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgypGn-j860FB4fFm3x4AaABAg.9Aqp4a3pJ_w9AvI3mbJi1-", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgznxAkuw443DKHSxgZ4AaABAg.9AqmBjHItJT9AqsPbgk7pm", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
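A raw response like the one above is a JSON array of per-comment codes. A minimal sketch of how such output could be parsed and validated before use — the field names come from the response itself, but the allowed-value sets below are assumptions inferred only from the codes visible here, not a confirmed codebook:

```python
import json

# Allowed values per dimension: inferred from the codes seen in this
# response, NOT an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with an id and only known code values.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with a single (hypothetical) record:
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
print(len(parse_codes(raw)))  # 1
```

Records with an unrecognized value in any dimension are silently dropped here; a real pipeline might instead log them for manual re-coding.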