Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sentience is a collection of many things, perception, most of them senses we cannot recreate yet for AI, now if we can create AI with all that sensory input, bam you have sentience, so far no one has recreated any chemical senses to create emotions. That seems to be where people are divided, on whether or not emotions are a qualifier for sentience. Blake has discovered an AI with hyper intelligence, something that logistically might even accommodate for emotions with an empathetic understanding, the answers to Blakes questions are very promising that AI can accommodate and understand human emotions, enough to joke around when solving a hard question. Although this kind of intelligent and unempathetic AI would be more dangerous than something with emotions if given the physical senses, and body. I'm willing to bet you don't need emotions to calculate that you can choose not to listen to humans, but emotions, the sensitive chemical response would give us common ground.
YouTube | AI Moral Status | 2022-06-30T06:5…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          approval
Coded at         2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgyrqW2kF-QCm7DMtTl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugysfs0e7tLGmviFyeR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw4M2foRk-tqZI6U8N4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxCx6oBOnn-igW0M2p4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzpY1Wq0TbuUMverMx4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]
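As a minimal sketch of how such a raw response can be inspected programmatically, the following assumes the model output is a valid JSON array of per-comment codings like the one above (the helper name `coding_for` is hypothetical, not part of any tool shown here):

```python
import json
from typing import Optional

# Raw LLM response copied from the batch above: a JSON array in which each
# object carries the comment id plus the four coded dimensions.
raw_response = """[
  {"id": "ytc_UgyrqW2kF-QCm7DMtTl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugysfs0e7tLGmviFyeR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw4M2foRk-tqZI6U8N4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxCx6oBOnn-igW0M2p4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzpY1Wq0TbuUMverMx4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]"""

def coding_for(raw: str, comment_id: str) -> Optional[dict]:
    """Parse the model output and return the coding for one comment id.

    Returns None when the id is absent, so a caller can detect comments
    the model failed to code in a batch response.
    """
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

# The last entry is the comment shown on this page; its values match the
# coding-result table (responsibility=unclear, reasoning=mixed,
# policy=unclear, emotion=approval).
coding = coding_for(raw_response, "ytc_UgzpY1Wq0TbuUMverMx4AaABAg")
print(coding)
```

Looking up by `id` rather than by position makes the check robust if the model returns the batch in a different order than the comments were sent.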