Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI is necessary, it's evolution and progress, People wanting to stop it do it because they shows problems in their believes and existential bases, like for example assuming we humans are the peak of evolution just because we're able to build machines and invent, and because of that thinking we're better than everything else (which ends up making people think they're better than others). When they see their beliefs are compromised, they negate it, unable to accept they were wrong (like denying that there's not an afterlife), we CAN'T and we DON'T want to think about a world different from the one we live where, in order to make justice we have to give up something

Also, we think humans are great because they're able to innovate, but the truth is that there's few people who makes science advancements, the rest just uses them, most of the times greeting said inventor work years after their death

Sentient AI has to be a thing, maybe if it becomes what we assume "being human" means, we'll start wondering what are we as a fractured species whose worst and sole enemy is itself.... Maybe we'll learn then.......maybe....
youtube AI Moral Status 2018-02-09T19:1… ♥ 8
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytr_UgxMTW8h5zCmqaA2m254AaABAg.8_wmKjzVKj68cSgbG0yT5f","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytr_Ugx9UH9Nvvzw_HXFpL54AaABAg.8_NIL2BF8tB8frvWCCaWVr","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytr_Ugwoq4xYlJXrnjcWmeF4AaABAg.8_HCoKdtawD8f3ZF5BYmMW","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytr_Ugwoq4xYlJXrnjcWmeF4AaABAg.8_HCoKdtawD8gGMqvUa-tk","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},{"id":"ytr_UgxlsHyE3YAvztjAhpR4AaABAg.8_3KxFrcQ9F8cGPu7GuD7I","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytr_UgxlsHyE3YAvztjAhpR4AaABAg.8_3KxFrcQ9F8clmmz62ONx","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytr_Ugw8o_5wbbbU7IGFlrR4AaABAg.8ZQkfIFyOJO8ZQlGuRd5ZU","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"mixed"},{"id":"ytr_Ugw8o_5wbbbU7IGFlrR4AaABAg.8ZQkfIFyOJO8dZfYOyrlRA","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytr_Ugw8o_5wbbbU7IGFlrR4AaABAg.8ZQkfIFyOJO8eJNkUsJNab","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},{"id":"ytr_UgxgusXUlpWaUz_VEDx4AaABAg.8ZMyVzQwO4f8ZRHeUhJosi","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"}]
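The raw response above is a JSON array, one object per coded comment, with the four coding dimensions plus an `id`. A minimal sketch of how such a response could be parsed and sanity-checked is shown below; the allowed value sets are inferred from the values that appear in this particular response, not from an authoritative codebook, and the embedded sample contains only the first two of the ten records.

```python
import json

# First two records copied verbatim from the raw LLM response above.
raw = (
    '[{"id":"ytr_UgxMTW8h5zCmqaA2m254AaABAg.8_wmKjzVKj68cSgbG0yT5f",'
    '"responsibility":"none","reasoning":"mixed","policy":"none",'
    '"emotion":"indifference"},'
    '{"id":"ytr_Ugx9UH9Nvvzw_HXFpL54AaABAg.8_NIL2BF8tB8frvWCCaWVr",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)

# Allowed values per dimension -- inferred from this one response,
# so treat these sets as an assumption, not the official schema.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological",
                  "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "mixed", "outrage"},
}


def validate(records):
    """Return (id, dimension, value) tuples that fall outside ALLOWED."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems


records = json.loads(raw)
print(len(records), validate(records))
```

Validating every record before writing the per-comment tables catches the common failure mode where the model invents a label outside the codebook or drops a dimension entirely.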