Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why do we need robots? Are we just too lazy to do things for ourselves? Robots in factories are just programmed to do a particular job. They are bolted to the floor, so they don’t represent too much of a threat. The guy in the hat is almost the archetype of the crazy scientist. Perhaps this is deliberate, but I think it failed. Judging by the reaction to the top comment, most people are not at all keen on the idea of having lots of robots around. If we want to benefit from AI, why do we have to have them look like a human and have the ability to walk around? What is really terrifying is the prospect of military robots. Apparently these have already been developed. If they are given autonomy to decide who to kill, we (humanity) are in big trouble. I hope humanity will reject this technology.
youtube AI Moral Status 2023-02-27T22:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzFWqB4f5gRaYAPm3p4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx5DrV3IGcOq6uE0wh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxflvUW9y-vTCjv0xx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwgY9WsEXpK8k2bceF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyAv7YKaDo-Xv8bqWh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzlMTNITbgksHdl_Xl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzQIeN_wQ1dr2YdHzp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzsLvLoV3SLtidN8_x4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxnJSbYJSstKLGn9Kd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxhvkLhvqCdM1JM0v14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
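A raw response like the one above is a JSON array of per-comment coding records. A minimal sketch of how such a response could be parsed and validated, assuming the allowed values per dimension are exactly those seen in this response (the real codebook may define more categories, and `parse_response` is a hypothetical helper, not part of any named library):

```python
import json

# Assumed dimension vocabularies, inferred from the values appearing in
# this raw response; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment id.

    Raises ValueError if any record carries a value outside the
    assumed vocabulary for its dimension.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        # Keep only the coding dimensions, keyed by the comment id.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

Applied to the response above, `parse_response(raw)["ytc_UgxflvUW9y-vTCjv0xx4AaABAg"]` would return the record shown in the Coding Result table (`responsibility: developer`, `emotion: indifference`, and so on).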