Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Interesting video. But already at the beginning you've established a situation that would NEVER happen if in fact that that car was automated. The car would NEVER get close enough where if something were to happen it would not be able to react as SAFELY as it could have if it had done something differently prior(not tailgate). If an automated car puts itself in a situation where IF the car in front's cargo were to loosen and fall, it's only options were to swerve right and hit a motorist or swerve left and hit a motorist, what a poor control system that would be. As a controls engineer at Ford, I can point out a few things that would not happen if in fact the self-driving car would be made for regular use. First we have to establish whether if at that time the roads allow self-driving cars AND regular manual cars to drive on the road together or if ALL the vehicles are automated. if it's case number 1, the automated car would have to take extreme pillow safety factors to take into account other drivers' "human error" and the unpredictable (tree falls onto the road). If it's case number 2, forget it, nothing will ever happen lol! The only way that accidents would happen in an automated world, would be if people did something wrong and against regulation or law (not properly securing cargo). Although these are great thoughts to think on philosophically and debate on, but in practicality, I guarantee that when car companies roll out automated cars in mass, there will be nothing to debate about. I'm not saying there WON'T be accidents, but the only ones that would be, would be freak accidents that were not taken into account by the controls engineers of that car and those would be far and few.
youtube AI Harm Incident 2017-06-23T03:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
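A minimal sketch, in Python, of how the per-comment coding shown above can be recovered from the batched raw LLM response: parse the JSON array and index the records by comment `id`. The two records below are copied verbatim from the raw response; which `id` corresponds to the comment displayed on this page is an assumption made for illustration.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index the batch by comment id

# Assumed mapping: this id is taken from the batch; whether it is the
# comment shown on this page is not stated in the source.
coding = by_id["ytc_Ugjeinq77JWQDngCoAEC"]
print(coding["responsibility"], coding["reasoning"], coding["emotion"])
```

Each record carries the four coded dimensions (responsibility, reasoning, policy, emotion), so a lookup by `id` yields the full coding for one comment.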