Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here are my thoughts on the spot:

1. Don't most self-driving cars teach themselves how to drive better? So it may be no one's fault.
2. Isn't the whole idea to have basically every car be self-driving and communicate with each other? In that case, your car tells the car behind you that you are going to brake hard so that car in turn brakes (although this could be a problem while self-driving cars aren't widespread).
3. Drivers are still supposed to be ready to take control of the car at any moment, so I would see the car braking and then the driver grabbing the wheel and swerving. However, I do believe it should be the car manufacturer's responsibility to inform drivers about situations like this, so that drivers can actively avoid getting boxed in or getting into any other dangerous situation.
4. Self-driving cars could be programmed to recognize situations like this and issue a warning to the driver to change lanes before disaster strikes.
5. If it really comes down to it, the most ethical approach might be to have the car recognize that whatever action it takes, someone is getting hurt, and have RNG decide between swerving to either side or braking (since I doubt any human would drive right into the cargo unless they didn't have time to react).
youtube AI Harm Incident 2015-12-10T06:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        mixed
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgiJaQs6F28eWHgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggJ82QW9q6Yh3gCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UghnlhSnEQZ0IngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UggjXP4s7034gngCoAEC", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgjMw5uEv4uP13gCoAEC", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UggU7UUEmbYyYHgCoAEC", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgjNuOWAcDkP3HgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgisvA4COAatfngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgjTfq8djgy0rHgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UghuJ8ET5_X-j3gCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",    "emotion": "approval"}
]
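The raw response is a JSON array of per-comment records, each carrying the four coding dimensions. A minimal Python sketch of how such a batch could be parsed and looked up by comment id (the two records below are copied from the response above for brevity; the required-field set is inferred from the visible data, not from any official codebook):

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = """[
  {"id": "ytc_UggU7UUEmbYyYHgCoAEC", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiJaQs6F28eWHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]"""

# Fields every record must carry, inferred from the response shown here.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
for rec in records:
    missing = REQUIRED - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id')} missing fields: {missing}")

# Index by comment id so a single comment's coding can be retrieved,
# as in the "Coding Result" table above.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UggU7UUEmbYyYHgCoAEC"]["responsibility"])  # → distributed
```

A dict keyed by comment id makes the join back to the displayed comment an O(1) lookup, which matters when a batch codes many comments per request.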