Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What if during the process of buying the car, you could pick what the car would do in advance? Some people would choose to sacrifice themselves, others to prioritize themselves, and some in the middle. You would get a mix of people so nobody could say a programmer had made it always happen one way due to an algorithm.
Source: YouTube, AI Harm Incident, 2022-06-02T15:5…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        contractualist
Policy           industry_self
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugz3NDLJm5vOL8_5Ki14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzzec3Twn63agGPyDB4AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugz8wJCpFoQ2L1TPwT94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxDg--Hfm2lG0jR6Ut4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxWoJcDFo_ekiyvEmt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx8hBTPSf8XBnRxR9t4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugw4jM93_9cAtGe9wgN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwCoTNgNzS8ucWLuet4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyKUDGVaTLJ7c09rdd4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxIAyCois5Y25HZHYZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
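A raw response like the one above can be parsed and checked before the codes are stored. The sketch below is a hypothetical helper, not part of the actual pipeline; the allowed category sets are only those observed in this response, and the real codebook may define more values.

```python
import json

# Category values observed in this raw response (assumption: the full
# codebook may contain additional values).
OBSERVED = {
    "responsibility": {"ai_itself", "user", "none", "unclear", "developer", "company"},
    "reasoning": {"consequentialist", "contractualist", "deontological", "unclear"},
    "policy": {"none", "industry_self", "ban", "liability", "regulate"},
    "emotion": {"indifference", "approval", "mixed", "outrage"},
}

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding record for one comment,
    raising if any dimension holds an unexpected value."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]
    for dim, allowed in OBSERVED.items():
        if record[dim] not in allowed:
            raise ValueError(f"unexpected {dim} value: {record[dim]!r}")
    return record

raw = ('[{"id":"ytc_Ugzzec3Twn63agGPyDB4AaABAg","responsibility":"user",'
       '"reasoning":"contractualist","policy":"industry_self","emotion":"approval"}]')
print(lookup(raw, "ytc_Ugzzec3Twn63agGPyDB4AaABAg")["responsibility"])  # prints: user
```

Looking the record up by `id` mirrors how the dashboard links each coded comment back to the exact line in the model output.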