Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The thing is: artificial intelligence is not intelligent. AI usually operates on the basis of repetition and probability. When people say that soon we will have AI moral issues, they are mixing up quantity with quality. A machine cannot become self-aware because it cannot experience the world. Consciousness is not the same as intelligence. Animals are sentient and not necessarily intelligent. A sentient being has too many data input from the tangible universe, the real world, via a myriad different senses. Machines have few data input channels and these do not describe reality around them, but specialised information that it is incapable of giving any meaning to. Animals experience reality and they attribute meaning to things. Food is good, poop is bad, eat food, avoid poop. Hunt, survive, sleep, repeat. A machine doesn't require such routines. And animals evolve. As they multiply, their traits are automatically replicated if they are useful. Machines only evolve when we dictate they should evolve. They cannot perfect themselves independently, so they are stuck in their original design. AI is fiction! Grow up!
Source: youtube · AI Moral Status · 2018-10-11T19:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugx2df_rpyC7-9zbkWZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxjLpuKrucXwLR_UXh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxPiq_VfTE_dcaTjsN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzf73Srn0Gs-21yzgp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxxhV09OVma3ss33cB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsCaN9gu2o8mRX0ZR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwaLBLZfSlIqXmC3LB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMG1YSOIiCRNYYBXN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSoECK7c4O7aX38Eh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzNqJLu-kc6KxSpfVN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
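The raw response above is a JSON array with one record per coded comment, keyed by the four coding dimensions shown in the result table. A minimal Python sketch of how such a response could be parsed and checked follows; the allowed value sets are an assumption inferred only from the codes visible on this page, not from a published codebook, and `validate_codes` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension (assumption: inferred from the codes
# observed in this page's raw response, not an authoritative schema).
SCHEMA = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Example: the first record from the raw response above.
raw = ('[{"id":"ytc_Ugx2df_rpyC7-9zbkWZ4AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
codes = validate_codes(raw)
print(codes[0]["emotion"])  # indifference
```

Validating at parse time catches the common failure mode of LLM coders: a record whose value drifts outside the enumerated codebook (e.g. "anger" instead of "outrage") fails loudly instead of silently polluting the tallies.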