Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think that the fundamental difference between AI and a normal program is that whereas normal programs are sets of instructions an AI is a set of goals, simulated by a program which is still just a set of instructions. To get a non artificially intelligent robot to do something you have to give it specific instructions, to get an artificially intelligent robot to do something you somehow communicate to it the goal you want achieved and it figures out how to achieve it by itself. So teaching these robot ethics and treating them ethically would be simply practical, I mean you could torture an AI into doing what you want but wouldn't it be so much easier and safer to make it want to serve you?
YouTube · "AI Moral Status" · 2017-02-23T22:3…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
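A downstream step would typically parse such a raw response and validate each record. The sketch below is one way this might look in Python, assuming the four dimension names shown above; the allowed value sets are inferred only from the values visible in this response (not a full codebook), and the `ytc_example*` ids are hypothetical:

```python
import json

# Allowed values per dimension, inferred from the codings above.
# This is an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding objects),
    rejecting any record with an unexpected dimension value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Two records in the same shape as the raw response above (ids hypothetical)
raw = '''[
  {"id": "ytc_example1", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_example2", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''
codings = parse_codings(raw)
print(len(codings))  # 2
```

Validating against a closed value set at parse time catches the common failure mode where the model invents a label outside the coding scheme.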