Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Situations like this could be easily avoidable, if programing is done correctly and tested sufficiently. A autonomous car  in theory should be the perfect driver, not tailgating or getting into dangerous or potentially dangerous stations. I understand the ethical implications, but the basis of these situations in my view are unrealistic.
youtube AI Harm Incident 2016-10-04T08:2… ♥ 3
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugg_qQYiL1e7ZngCoAEC.8KuOXvYh4_g8usdC03iylv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgjoShDj037kfXgCoAEC.8KR4axWrdhM8LkFC56rqis","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgibXv0DjUL4p3gCoAEC.8JlIj8ky6DL8svdbTQMNi0","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgjkE_sAeUxvX3gCoAEC.8JlIUnyTjbC8LngQgBWiIU","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghLzE_FkYBRVngCoAEC.8J_IfFC-jLV8Jb617iUdC2","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghLzE_FkYBRVngCoAEC.8J_IfFC-jLV8MdVxUdGhWn","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UghLzE_FkYBRVngCoAEC.8J_IfFC-jLV8NXNFEy7B_j","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UggUueruHXVu1ngCoAEC.8ITPi9Cd3jO8KK1ar9BS07","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UghvicV8X1yXv3gCoAEC.8Gms_hQ5am18IGKf6cpF1O","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
  {"id":"ytr_Uggfmpuz0HRxeHgCoAEC.8GmqqMWvOPf8LS9TqbsEnl","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
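The coding result shown above is one row extracted from this raw JSON array. A minimal sketch of how such a response might be parsed into per-comment codings, assuming the allowed value sets inferred from the labels visible on this page (not an official schema) and a hypothetical `parse_codings` helper:

```python
import json

# Allowed values per coding dimension, inferred from the labels on this page.
# This is an assumption, not the tool's actual schema.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "government", "ai_itself"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "fear", "mixed", "indifference", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping invalid rows."""
    rows = json.loads(raw)
    out = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue
        # Keep a row only if every dimension carries an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Example with a shortened, hypothetical id:
raw = ('[{"id":"ytr_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytr_x"]["responsibility"])  # developer
```

Dropping rather than repairing invalid rows keeps the stored codings trustworthy: a model that hallucinates an off-schema label simply leaves that comment uncoded for a retry.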