Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Your question reminds me a bit of the recent ethical dilemma faced by manufacturers of self-driving vehicles. In certain unavoidable accident scenarios, the vehicle may have to make the decision to potentially hurt people in order to avoid hurting a greater number of people. It may even have to sacrifice its passenger.
youtube · AI Moral Status · 2017-03-12T13:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgjLYJhHPMsUEHgCoAEC.8PtuUTIQEvX8PwSVSJcIhI", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgiVwjkV9NIQk3gCoAEC.8PrjymS4JXo8PsaiuqXSnb", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgiVwjkV9NIQk3gCoAEC.8PrjymS4JXo8PtiwpvP6H8", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugi318Nm44FAC3gCoAEC.8PqhuLdkTCr8QUrkX658J7", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UghZs-vx_DY4WngCoAEC.8Ppcp7Bn9RL8Pq9acEMmtW", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UghZs-vx_DY4WngCoAEC.8Ppcp7Bn9RL8PqvLGRI204", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugh4-V51At2SPngCoAEC.8PouRlAd_Sn8Q_7dKAoD2q", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgggUPfJ9n2pw3gCoAEC.8PnRiYml4Pl8Q03__JyheV", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytr_Uggczad5RakHtngCoAEC.8P_l9quOfj68PacjLfPG-g", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgirsstFTRgqcHgCoAEC.8PWp2MxMP0z8PWqqddcvWJ", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
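The raw response above is a JSON array with one record per coded comment. A minimal parsing-and-validation sketch in Python: it assumes the four dimensions shown in the output, and the allowed-value sets are inferred from this one response only (illustrative, not the full codebook). The `parse_codes` helper and the sample `raw` string are hypothetical.

```python
import json
from collections import Counter

# Allowed values inferred from the response above (illustrative, not the full codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records whose values are in the codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical one-record response for demonstration.
raw = ('[{"id":"ytr_x","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"indifference"}]')
codes = parse_codes(raw)
print(Counter(rec["emotion"] for rec in codes))  # → Counter({'indifference': 1})
```

Dropping out-of-codebook records (rather than coercing them to "unclear") makes parsing failures visible downstream, which is usually preferable when auditing model output.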