Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One big problem: a bridge does not make decisions. Or rather, it makes them in such a predictable manner that we can anticipate the results. A self-driving car, on the other hand, makes decisions comparable in complexity to a human's, and it succeeds or fails in a fashion that is equally unpredictable to a human. There is a reasonable threshold where the car is "good enough" that the manufacturer/designer is no longer responsible for failures, because failures are expected to be rare or unreasonably hard to avoid.

The purpose of punishment and reward (or the threat of them) is to shape the behaviour of living individuals as efficiently as possible. That's why we don't punish or reward adults, experts, children, or the disabled in the same way for the same action, even in the same context. In the case of machines we may or may not have the option to shape their behaviour directly, and punishment or reward may or may not have any effect. Terms like responsibility, right, obligation, and freedom take on a much broader meaning when you are dealing with agents that don't necessarily have an analogue to suffering or well-being, may lack (or have a superior) ability to predict those states in themselves and others, and may have alternative means of guiding behaviour.
Source: YouTube, "AI Moral Status", 2017-02-23T17:5…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           liability
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
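For downstream checks it can help to hold a coded record in a small typed structure. Below is a minimal Python sketch; the CodedComment class and the value sets are assumptions inferred from the labels visible on this page, not the tool's actual schema or codebook.

```python
from dataclasses import dataclass

# Value sets observed on this page; the tool's actual codebook may
# allow more labels (assumption, not the real schema).
RESPONSIBILITY = {"none", "unclear", "company", "manufacturer", "government", "ai_itself"}
REASONING = {"unclear", "consequentialist"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"indifference", "resignation", "fear"}

@dataclass
class CodedComment:
    """One coded comment: four categorical dimensions plus a timestamp."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str

    def invalid_dimensions(self) -> list:
        """Return names of dimensions whose value is outside the observed sets."""
        checks = [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]
        return [name for name, value, allowed in checks if value not in allowed]

record = CodedComment("unclear", "consequentialist", "liability",
                      "indifference", "2026-04-27T06:24:59.937377")
print(record.invalid_dimensions())  # [] -> every value is in an observed set
```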
Raw LLM Response
[ {"id":"ytr_UggYw13YsQ9UengCoAEC.8PKT6UCB8jL8PKZMQ9hbPu","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytr_UggYw13YsQ9UengCoAEC.8PKT6UCB8jL8PLWJ9pwiZJ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UggYw13YsQ9UengCoAEC.8PKT6UCB8jL8PLxw9x_HD-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugg3KxPrjlzt6HgCoAEC.8PKSG_JAaDb8PKcX1h2yQC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgjfZ7jGW7cLn3gCoAEC.8PKRmVy9pn28PKWiE7fH7q","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgiiAT_fZcx3cngCoAEC.8PKRTi4Lfho8PKS7Fs_o5L","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugh1j66C9k7XO3gCoAEC.8PKRKm0qICT8PKS8yAWZ4y","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgjulCwo6IWvM3gCoAEC.8PKRGTCuwb-8PKSaGgm_Ry","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugjjxo2tMS6OcXgCoAEC.8PKQncQUTNp8PKXPtUoP-m","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytr_UgggfKyYxs8w4HgCoAEC.8PKQjIdrxlC8PKkf_u6fjb","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"} ]