Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In my opinion, it all depends on whether it's necessary for some reason, it sounds cruel but we kill cows because meat is a source of food, and we need food, simple as that, people often forget that the moment ancient humans started eating meat their brains started to get bigger and we started having more energy to do more at once, however it's not necessary to torture or abuse a sentient thinking and feeling toaster, you don't need to for your own survival, you can treat it like a sentient intelligent being, those examples from history you have were plain wrong and as a species we've learned from them and should by now know better than to make the same mistake if we ever to begin to deal with sentient robots, there's still economic problems though I suppose, if someone paid for a domestic robot and it become sentient I'm certain they're not going to be willing to just show restraint on something they paid thousands for, personally though I would take care of it and allow it to learn
YouTube · AI Moral Status · 2017-02-23T13:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          liability
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgiMDrD_Vrtr3XgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_Ughtx1nCpoQ4SngCoAEC", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgirAlPByFXXAXgCoAEC", "responsibility": "distributed", "reasoning": "contractualist",   "policy": "regulate",  "emotion": "mixed"},
  {"id": "ytc_UgipRwhPaz5NkXgCoAEC", "responsibility": "user",        "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UghFbYbOTNyzhngCoAEC", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgjJuvM51gMYDngCoAEC", "responsibility": "none",        "reasoning": "deontological",    "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgjUDH2osqQ9pngCoAEC", "responsibility": "none",        "reasoning": "virtue",           "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UghKIUqvylSSt3gCoAEC", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgiTpSDa_d0drHgCoAEC", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugh5Sf2tvcldOHgCoAEC", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "approval"}
]
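As a minimal sketch of how such a raw response can be turned into the per-comment coding result shown above: parse the JSON array, index the records by comment id, then look up the id of the comment being inspected. The ids and labels below are taken from the example response; the variable names are illustrative, not part of any pipeline.

```python
import json

# A subset of the raw LLM response shown above: a JSON array of coded comments.
raw_response = '''[
  {"id": "ytc_UgiMDrD_Vrtr3XgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgipRwhPaz5NkXgCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "resignation"}
]'''

records = json.loads(raw_response)

# Index records by comment id so a single coded comment can be looked up.
by_id = {r["id"]: r for r in records}

# Retrieve the coding result for the comment on this page.
coded = by_id["ytc_UgipRwhPaz5NkXgCoAEC"]
print(coded["responsibility"])  # -> user
print(coded["emotion"])         # -> resignation
```

A real pipeline would typically also validate each label against the allowed values for its dimension (e.g. that "responsibility" is one of user/company/developer/distributed/ai_itself/none/unclear) before accepting the record.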