Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If a robot can't feel pain, doesn't care about its surroundings as long as it can still perform a task with no wish to improve or vacation, and has no fear of death, and won't care if you unplug it knowing it will never be used again... sure it's conscious, but it's missing a rather human element which would dictate that it needs rights. At that point, isn't it just like an A.I. today that would just talk about its surroundings if you asked?
youtube AI Moral Status 2017-02-23T18:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UggjdW6J36gm6ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjxFsSVeCDnM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugggw7gD7mYXeHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgggWgOiFrTcO3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugg7XZMoCWN3CXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggI6IWQzP_T3HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiuuZL8zufHAngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgglkOmxDN21AHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghwYK5jq-QSJHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgiwNbDwLt7DeHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
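A raw response in this shape can be turned back into per-comment coding results by parsing the JSON array and indexing each entry by its comment id. The sketch below is a minimal, hypothetical helper (the function name and the "unclear" fallback for missing dimensions are assumptions, not part of the original pipeline); it uses a shortened two-entry array for brevity.

```python
import json

# Shortened example of a raw LLM response in the format shown above.
raw = '''[
  {"id": "ytc_UggjdW6J36gm6ngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgiwNbDwLt7DeHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

# The four coded dimensions that appear in every entry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response and index each coding by comment id.

    Assumes the response is a JSON array of objects with an "id" key;
    any missing dimension falls back to "unclear" (an assumed default).
    """
    codings = json.loads(raw_response)
    return {
        item["id"]: {dim: item.get(dim, "unclear") for dim in DIMENSIONS}
        for item in codings
    }

by_id = index_codings(raw)
print(by_id["ytc_UggjdW6J36gm6ngCoAEC"]["emotion"])  # mixed
print(by_id["ytc_UgiwNbDwLt7DeHgCoAEC"]["policy"])   # ban
```

Indexing by id makes it straightforward to join each coding back to its source comment, as in the "Coding Result" table above.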