Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You say "program them to feel pain" but isn't programming in an aversion to danger essentially a pain response? If you program something to avoid things that are damaging it, what makes that different from 'pain'? Infact, what makes current computers not 'conscious' to a small degree? As for robot 'slaves'. Just program the AI to naturally *want* to do the tasks. If you program an AI that wants freedom, why are you using it for manual labour? You'll find all of nature programs us to want to do the things nature wants us to do. We love eating. We love reproducing and spreading out. We hate things that stop this. You can argue the ethics of conscious machines, but I don't think the 'slavery' aspect will ever need to come into it.
youtube AI Moral Status 2017-03-04T15:4…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UggFGTUIov_oOHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UghdOolC8joZ6ngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
 {"id":"ytc_UghAw59QZBitCngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgjT-wD9PuFMo3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UggRBlCDj7mB73gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgiLuFIX4HCn7HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgjcZKTKJEoieXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UggVd289Q9KLTngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"})
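Note that the raw response above ends with a stray ")" instead of the "]" that would close the JSON array, so a strict parser rejects the whole batch; that would be consistent with every dimension in the coding result falling back to "unclear". A minimal sketch of a tolerant parse is below. The function names, the one-character repair, and the "unclear" fallback are illustrative assumptions, not the pipeline's actual code; only two entries from the raw response are reproduced (verbatim, including the stray ")") to keep the example short.

```python
import json

# Two entries copied verbatim from the raw response above; the trailing ")"
# reproduces the malformation in the real output.
RAW = ('[{"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, '
       '{"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"})')

# Hypothetical fallback coding when parsing fails or an id is missing.
FALLBACK = {"responsibility": "unclear", "reasoning": "unclear",
            "policy": "unclear", "emotion": "unclear"}


def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array into an id -> coding dict,
    repairing a stray trailing ")" if strict parsing fails."""
    try:
        entries = json.loads(raw)
    except json.JSONDecodeError:
        repaired = raw.rstrip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        try:
            entries = json.loads(repaired)
        except json.JSONDecodeError:
            return {}  # unrepairable: caller falls back to "unclear"
    return {e["id"]: e for e in entries}


def coding_for(raw: str, comment_id: str) -> dict:
    """Look up one comment's coding, defaulting every dimension to 'unclear'."""
    return parse_codings(raw).get(comment_id, dict(FALLBACK, id=comment_id))
```

With this repair the two-entry sample parses, while an id absent from the batch still yields the all-"unclear" record shown in the coding result above.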