Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
On the subject of the social problems that would arise if AI replaced human intelligence in the workplace, it reminds me in a way of the collapse and necessary change of the economic system that occurred in the Star Trek universe, which culminated in a society where people didn't really need to work, but still did so because the focus had shifted from accumulating wealth to feeling fulfilled. But that was after the darkest moments of humanity, after resource scarcity, extreme inequality, etc., led to World War III, which according to Star Trek occurred in 2026! (Premonitory?) Curiously, AI itself did not play a role, because it had been limited decades earlier, but the franchise explored the problems of rogue AI on numerous occasions. Ultimately, we should ask ourselves whether the underlying problem is that AI will replace human intelligence in the workplace, or whether the problem is actually the system itself: a system where wealth and power are concentrated in the hands of a few individuals, and where the accumulation of wealth is valued above all else, even to the extent of not doing remotely enough to mitigate climate change. It is clear that under these conditions, if AI can replace workers, those workers will be laid off. But as was rightly pointed out in the video, who would buy whatever they are producing if people are left without jobs and therefore without income? The system would be unable to adapt to such extreme technological change, to the point of self-destruction. And of course, the pursuit of maximum profit is what rushes the adoption of AI, disregarding whatever dangers it could pose.
youtube AI Moral Status 2026-03-07T19:4… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxVQto9itb0fHqzt3p4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzC1AK-R61DHVPUsmB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxPxCZy6z1-kZxfPtZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugwa0DWhc7cHnT5T86N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxKIGurfO5bgJ8RSo14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgznMp6f1ZcHvmLSLYp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwnClS7V4gGOSop9zR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwmHrp1kyRIwtOp5hp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyzpMEl80ZR1qVEi7J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz184GiU3QDxlQuQMd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
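A raw response in this shape can be checked before it is stored. The sketch below is illustrative, not the pipeline's actual code: it parses a batch of per-comment codings, verifies that each record carries the five fields seen above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and indexes the records by comment id. The `parse_codings` helper and the two-record sample are assumptions for demonstration; only the field names and id format come from the log.

```python
import json

# Abbreviated sample in the same schema as the raw LLM response above
# (ids taken from the log; this is not the full batch).
raw = """[
  {"id": "ytc_UgxVQto9itb0fHqzt3p4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxKIGurfO5bgJ8RSo14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

# The five coding dimensions every record must carry.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a batch of codings and index them by comment id.

    Raises ValueError if any record is missing a required field,
    which catches truncated or malformed LLM output early.
    """
    by_id = {}
    for record in json.loads(text):
        missing = REQUIRED - record.keys()
        if missing:
            raise ValueError(
                f"coding {record.get('id', '?')} missing {sorted(missing)}"
            )
        by_id[record["id"]] = record
    return by_id

codings = parse_codings(raw)
print(codings["ytc_UgxKIGurfO5bgJ8RSo14AaABAg"]["emotion"])  # fear
```

A check like this would also have flagged the stray `)` that terminated the original response, since `json.loads` rejects it outright.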