Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hm... I don't think some silicon and memory cells can become conscious, computers only know how to do one thing; follow instructions. While powered, they'll do that. They can slow down or speed up, but they can't stop following instructions from memory. Meanwhile, a human brain is so complicated we are yet to fully understand it! And remember, we created the computers. We can make algorithms that learn, make an artificial intelligence, but it wouldn't be conscious, it would be, after all, a pice of silicon and a bunch of memory cells just following instructions...
youtube AI Moral Status 2017-02-23T13:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgiMDrD_Vrtr3XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ughtx1nCpoQ4SngCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgirAlPByFXXAXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgipRwhPaz5NkXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UghFbYbOTNyzhngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjJuvM51gMYDngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjUDH2osqQ9pngCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghKIUqvylSSt3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgiTpSDa_d0drHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugh5Sf2tvcldOHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
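The coding result shown above is recovered by matching the comment's id against the records in the raw batch response. A minimal sketch of that lookup, assuming the raw response is a JSON array with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys as shown (the helper name `extract_coding` is illustrative, not part of the tool):

```python
import json
from typing import Optional

# A trimmed example of a raw batch response: a JSON array,
# one object per coded comment (structure as in the dump above).
raw = '''[
  {"id": "ytc_UgiMDrD_Vrtr3XgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjJuvM51gMYDngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

def extract_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM batch response and return the record for one comment,
    or None if the id is absent from the batch."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

coding = extract_coding(raw, "ytc_UgjJuvM51gMYDngCoAEC")
print(coding["reasoning"])  # deontological
```

Matching on id rather than position keeps the lookup robust when the model returns records in a different order than the comments were submitted.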