Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As others have brought up, I feel like this scenario is flawed from the get-go. This is assuming that self-driving cars will make the same reckless decisions as humans in following behind a truck too closely to avoid any sort of potential accidents. I imagine a self-driving car, unlike people, would actually do its best to maintain proper distance from all other vehicles when possible, thus avoiding a scenario like this altogether.
YouTube · AI Harm Incident · 2015-12-08T17:2…
Coding Result
Dimension: Value
Responsibility: none
Reasoning: consequentialist
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgizunohajILCHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjjWOUDi8MzcHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjQvTuYsrqOtngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgghF14lWrWg93gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghSobsLJzKwTngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghTsPIeRMcNT3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiGpAhmNNMkf3gCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgjlPAxVCSrTmHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugju10Xr0tXdF3gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugjc3KGPZNZyqngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
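The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown above. A minimal sketch of parsing and validating such a response follows; the allowed-value sets are assumptions inferred from the codes visible on this page, not from a documented codebook, and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per dimension -- assumed from the codes seen on this
# page, not from an official coding schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, keeping only
    records whose values all fall inside the allowed sets."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[rec["id"]] = codes
    return coded

raw = ('[{"id":"ytc_UgizunohajILCHgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
result = parse_codings(raw)
print(result["ytc_UgizunohajILCHgCoAEC"]["emotion"])  # indifference
```

Validating against a closed vocabulary like this catches the most common failure mode of LLM coders, namely inventing labels outside the scheme, before the codes enter analysis.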