Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It seems to me that since the AI is based on the neural network modeled after the human brain, the way you make them so they won't want to hurt people is now figuring out how to make them feel like humans. Guilt, elation, depression, happiness, contentment, etc. These things aren't just emotions, they FEEL a certain way which, in my opinion is 100% of the human experience. So I have to disagree that they will have emotion. They can't unless they can have the feelings they are named for. Logic is not emotion. Being able to assess a situation and recognize the need to flee or whatever the case, that is not the same thing. I think humans have these emotions because of our bandwidth issue. It's how we feel that help us make decisions and if we had to rely only on logic, our ability to process data by way of exchange would be a hindrance. Our intuitiveness to pick up on things we don't otherwise know is part of the human experience that a computer can't ever have. Not unless we figure out how to make them feel, and then they will have understanding of values and moral code and a deep understanding of the human condition. This is how you ensure relative safety to the human race.
Source: youtube · AI Governance · 2025-06-16T12:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        virtue
Policy           industry_self
Emotion          approval
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_UgyAiTOedrBS8WNTDGd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxqLOJHMpGxwaQbFtZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgyY04CCzB8EuCV5_bF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxTvpJrg-VRAsZ6zpJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_Ugyf_7ygdN7dVADAw6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxKQ8402Egi5bDRRfF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"}, {"id":"ytc_Ugz8_ThM8byOBjplkQR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwfkGfYHhmjfE6sTPd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzB8GwtjR1rjEJbOhR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwnAFwiAX2Nn3_VMhV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"} ]