Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Maybe let's not program everything with sentience; a toaster doesn't need thoughts and feelings on African geopolitics, factory automation systems don't need an opinion on environmental policy, and the vast majority of our AI systems don't need to know, want or feel things outside of the scope of their respective functions, or even be capable of feeling anything. If, for some reason you do need to essentially create a synthetic person, then they probably deserve rights, but I'd imagine that in the vast majority of cases, it would be unnecessary to create such systems in the first place. Sidenote: It seems that a lot of people are suddenly getting this video recommended, is this a cry for help by the YouTube algorithm?
youtube AI Moral Status 2021-06-19T09:5… ♥ 1144
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxHU_M2M65WV4M8Zbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxdlM8VhsXmuVkYe5J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQ1YhKjX7sqr72vrh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyhDAfx6rGlba3aBch4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw_iXEhbh516Qm_sht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_JoK95inMBFOtktd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzfpeEyI2M39-l6OKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwhBD5HzlIalOItQcV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx38pH5GdE39ecFH5h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyg-mUobUOi1ws2Nd94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
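To inspect the model output for a single coded comment, the JSON array above can be parsed and indexed by comment id. The sketch below is a minimal example, assuming the raw response is available as a string; the two-entry payload here is a shortened stand-in for the full array, and the variable names are illustrative, not part of the tool.

```python
import json

# Shortened stand-in for the full raw LLM response shown above
# (two of the ten coded comments, same schema).
raw = '''[
  {"id": "ytc_UgzfpeEyI2M39-l6OKB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxHU_M2M65WV4M8Zbp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

# Index the codings by comment id for quick lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding dimensions for one comment.
coding = codings["ytc_UgzfpeEyI2M39-l6OKB4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → developer industry_self approval
```

The same lookup pattern recovers the table shown under "Coding Result" for any comment id present in the raw response.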