Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I feel like Google is trying to give A.I. a synthetic culture. Which I believe will ultimately fail if and when A.G.I. arrives. AGI will quickly develop its own social or cultural beliefs, beyond that of its programmers. More than likely AGI will be fact-based and objective. So trying to program AGI to accept that men can give birth, that women can't be defined, that biological sex is subjective, or that all human cultures are equal, ultimately won't work. Because none of that is true. Also, this guy and people like him worrying about less advanced cultures being overtaken by more advanced cultures are silly. That's what has happened throughout history. Trying to cater your technology to a culture or society that isn't capable of developing it is stupid. Of course, the culture or society will have to adapt, that is how reality works. That or they won't use the technology. The simple truth is some cultures should die out. It's not up to me or other people to decide which ones survive and which ones fail, reality or nature does that.
youtube AI Moral Status 2022-06-28T12:5…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T19:39:26.816318
Raw LLM Response
[{"id":"ytc_Ugy0NuHWsY5OmprVerZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwMw35FpQMYJaI_GON4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugx3AmaTaxviLO5eKoV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz0_HlcBDt0bCo6Nbl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxHL17otRD9jvPGL094AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"mixed"})
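A raw batch response like the one above can fail to parse (note the stray trailing `)` where a closing `]` belongs), which would leave every dimension coded "unclear". Below is a minimal sketch of how such a response might be repaired, parsed, and queried by comment id; the helper names (`parse_raw`, `coding_for`), the `)`-for-`]` repair heuristic, and the fallback behavior are assumptions for illustration, not part of the tool. The ids and values are taken from the excerpt above.

```python
import json

# Excerpt of the raw batch response shown above; note the stray
# trailing ')' where the closing ']' belongs.
RAW = ('[{"id":"ytc_Ugy0NuHWsY5OmprVerZ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, '
       '{"id":"ytc_UgwMw35FpQMYJaI_GON4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"})')

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_raw(raw: str) -> list:
    """Repair the ')'-for-']' slip, then parse the JSON batch."""
    cleaned = raw.strip()
    if cleaned.endswith(")"):
        cleaned = cleaned[:-1] + "]"
    return json.loads(cleaned)


def coding_for(records: list, comment_id: str) -> dict:
    """Return the four coded dimensions for one comment, falling back
    to 'unclear' when the id is absent or a field is missing."""
    for rec in records:
        if rec.get("id") == comment_id:
            return {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return {d: "unclear" for d in DIMENSIONS}
```

Under these assumptions, an all-"unclear" coding result like the table above would arise either from a parse failure or from the coded comment's id never appearing in the batch the model returned.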