Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Even though the question being asked is ridiculous, and the idea hinted at is moronic, I will play ball. First, let's note that the claim that somehow having a random reaction is better than having a premeditated action is really just a way of saying that people don't want to feel guilty for whatever decision is made. If the human reaction gets 30 people killed, I don't care about how you feel about whether it was ethical or not; I'll take the premeditated unethical action that will NOT kill 30 people. Regardless, let's assume that the argument being presented was valid (the situation would be extremely unlikely for self-driving cars for 2 main reasons: humans are the idiots that tailgate trucks, and self-driving cars would leave enough space, have a faster reaction time, and probably better brakes to deal with this sort of situation). I think the way to go is not to swerve anywhere: try to brake and take the hit. Remember, not only do you endanger the person you're hitting but everyone else in the area. Also, don't forget about the whole insurance side of things: if you get hit, it's the truck's fault; if you swerve, then YOU hit someone else and you will be held accountable for that even if it was in self-defense. If this scenario does happen, then I suppose having the car not prioritize your life is a risk you're going to have to live with.
YouTube · AI Harm Incident · 2016-11-26T16:4… · ♥ 4
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ught1n_JphkAIHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ughblm0pNV8VIHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugg6uIZz_KHiq3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghYyYLOatuEE3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggGzJ7uVI0X3ngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjiRGSJyVfKIHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgjZyeX-ZG4eSngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UghSFLJx8E5vKHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghHDZCR7OhxLXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiVWkZrEWtduXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
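Because the model codes a whole batch of comments in one JSON array, the per-comment record shown above has to be extracted by matching on the comment id. A minimal sketch of that lookup, assuming only that the raw response is a JSON array of objects keyed by `id` (the `codes_for` helper is hypothetical, not part of the actual coding pipeline; the sample uses the record for the comment displayed above):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment coding records.
raw = ('[{"id":"ytc_UgjZyeX-ZG4eSngCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')

def codes_for(raw_response, comment_id):
    """Return the coding record whose "id" matches comment_id, or None."""
    for record in json.loads(raw_response):
        if record["id"] == comment_id:
            return record
    return None

# Look up the codes for the displayed comment.
print(codes_for(raw, "ytc_UgjZyeX-ZG4eSngCoAEC")["emotion"])  # outrage
```

The coded dimensions in the table (responsibility, reasoning, policy, emotion) are exactly the fields of the matched record.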