Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
For good reason. If you had an entirely non-western team of scientists making a powerful AI, things wouldn't turn out good. You're gonna want some euro/anglo white guys on that team that have seen the Terminator, or else the Chinese or Indian AI project will decide all of nature is irrelevant and pave over the entire earth to make bullet trains. Not that I think they couldn't prevent that from happening, but I think they wouldn't care. I don't think the failsafes would be implemented, the idea of short term gain is more appealing to non-western countries to "catch up" as it were. Our cultures are so much more different than most people would care to admit. The separation of thousands of years between society has genuinely altered the psychology of different nationalities. He's absolutely right about all of the cultural aspects of technology. Some cultures are more used to these things than others and some cultures will just take longer to adapt to all of these new technologies. We already had movies about robots destroying the earth in the 1920s. We've known about this for a while. Has the average person in the developing world even considered that? No, probably not. And they shouldn't have to, it's not really relevant to their lives. They have more important things to worry about, and that's fine. And if they were against your cultures, they wouldn't be hiring Indians like mad. Most tech and IT companies in the west now are filled with Indian immigrants who get high ranking positions and start doling out jobs to unqualified cousins, so don't even start.
youtube AI Moral Status 2022-08-07T14:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzyUC9vDRwoGUF8PK94AaABAg.9elTPegi30J9euprDXMrKp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzyUC9vDRwoGUF8PK94AaABAg.9elTPegi30J9eusz6CA5ux","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzyUC9vDRwoGUF8PK94AaABAg.9elTPegi30J9euzAlBcmzF","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxST4abrGwJbqD0KpV4AaABAg.9eRwXLFnbHn9eYDX0CVUGq","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxST4abrGwJbqD0KpV4AaABAg.9eRwXLFnbHn9ebyiWsz5Am","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxST4abrGwJbqD0KpV4AaABAg.9eRwXLFnbHn9fMscFfXm2o","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyKRQoBLt6iYuTASJV4AaABAg.9eRiwSZYYdx9eTU73rs871","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgwbwxlL2z8rLzmeS_V4AaABAg.9eR0Ak9_VIe9eR4hq1EPv1","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzIWI1KIzqTzQKayWV4AaABAg.9eQHRBDxYCV9eR2gb5nFPs","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxZdrKOZjkSgKUBbZB4AaABAg.9ePuIOTgsUA9eQ9uhXFtQN","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
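A raw response like the one above has to be parsed and checked before its codings are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the labels visible on this page (they may not cover the project's full codebook), and the `ytr_` ID prefix check mirrors the IDs shown above.

```python
import json

# Per-dimension vocabularies inferred from this page only;
# the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page all start with "ytr_".
        if not str(rec.get("id", "")).startswith("ytr_"):
            continue
        # Every dimension must be present with a known value.
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid
```

For example, feeding it a single well-formed record returns that record, while a record with an unknown ID prefix or an out-of-vocabulary value is dropped rather than stored.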