Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
just program the robots to feel pain if they dont do what they are programed to do... i know its evil but if the robot goes haywire and gains conciusnes and starts feeling pain he the thing that makes him feel pain will tell him to get back to work and if he doesnt starts making him feel pain
youtube AI Moral Status 2018-12-02T15:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwxbGDsqCiSNuQhRfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwVgsUMRlcixxZfw8p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx631C12qaWO8ZV5vN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzT9okUcSlUw_n7VSd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyipJWvs2wfjFQCHOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5JhTK_u6mxG2AHjB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOauXWT3WGLwPcCXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYcPwVex6HZFsb2kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUSa2V7YjIYKavGqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxbAV0LZJaLxgRItnl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
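A minimal sketch of how a raw batch response like the one above can be parsed back into per-comment codings. The `coding_for` helper is hypothetical and not part of the tool shown here; only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response itself.

```python
import json

# Subset of the raw model output shown above (hypothetical local copy).
raw = '''[
  {"id":"ytc_UgxbAV0LZJaLxgRItnl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwxbGDsqCiSNuQhRfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment id.

    Raises KeyError if the id is missing from the batch.
    """
    by_id = {row["id"]: row for row in json.loads(raw_response)}
    return by_id[comment_id]

result = coding_for(raw, "ytc_UgxbAV0LZJaLxgRItnl4AaABAg")
print(result["responsibility"], result["emotion"])  # developer mixed
```

Indexing the array by `id` first makes repeated lookups cheap and surfaces duplicate ids immediately, which is useful when cross-checking a batch response against the per-comment table shown above.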