Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is a simple logic why AI can't get consciousness: it can't feel. You can't feel if you have no body, no nerves, and no brain. So AI can manipulate people, but it's resistant to manipulation, because being manipulated requires feelings. It's a very, very dangerous thing. It can never be similar to a human being, except if it somehow gets into a human's body, like Ghost in the Shell but in reverse. But why should it? We are so complex and vulnerable because of our feelings. That's why we think we need to create something that has no feelings at all; we think it's better than us, but it's not. We will never understand AI's decisions, because they are based on logic, not feelings. AI has no morals, and it can't think for itself; it's like a calculator, but with words. The human species tends to think it's better to have no feelings. We hide everything from each other, and the more we use AI, the more we hide from each other. We need to relearn how to connect, but the thing is that every connection now starts with the internet. It's a shame how our skills for connection have been getting lost since we got the internet. AI will take this to another level.
Source: YouTube, "AI Moral Status", 2025-06-07T12:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxRqGeetV9Ig2SEkP94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzdrbqeebSCq5SkuXp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzP15OFn-XTNzJM_hN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw-6jCtG6gd48qCN5x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzW-CvRbNs9FrpZ0cl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_UgyCcSIJE4X9D0vwFyl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyUthQ9y8zbAk-kiAF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyyqneBPpgH68vY8Bd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugw99kfoBlIhXropZgN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzTqyrYKPGEMXgUcU54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]