Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We might give them pain because god gave us pain in order for us to survive and we might need to do the same in order to make sure they would survive, for example in order to make sure humans in a self driving car survive by giving the robot pain it would avoid crashing more.
YouTube · AI Moral Status · 2019-05-30T14:0… · ♥ 1
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzaERbUY6aNv0bDWn94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyKYdbdZLIQtogqrGR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzScYccHO5Bt5h9B714AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyesJQ9EnB4XZnPZyF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw6S4JWM68GaEcAKa94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwsK3U1Js6lsqygvYZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzNmFP6KjDf5Rwnv6Z4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwCFn4HpjcCAWJbW214AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugyc49PobndzcEhmq7t4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzPhNMxQ5gyc3xTg8l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
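A minimal sketch of how the raw response above maps to the coded dimensions: parse the model's JSON array, index it by comment id, and read off the four coded fields. The field names come from the raw response; the abbreviated `raw` string here carries only the one entry relevant to the comment shown, for illustration.

```python
import json

# Abbreviated raw LLM response (one entry from the full array above).
raw = """[
  {"id": "ytc_Ugw6S4JWM68GaEcAKa94AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"}
]"""

# Index the array by comment id so any coded comment can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the codes for the comment shown above.
row = codes["ytc_Ugw6S4JWM68GaEcAKa94AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {row[dimension]}")
```

Matching an entry back to its comment this way is what makes the "Coding Result" table reproducible from the raw output alone.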