Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That's why you follow the rule that says to stay three to six car lengths behind. If you see a truck with something on it that isn't strapped down or anything, just using momentum to keep its items in place, stay like nine car lengths behind it and let people pass you. That's what I would do, and that's why you don't have to swerve unless the truck stops. If it stops all the time, you slow down, and then if you slam on the brakes it probably wouldn't do much damage, since the item is flying off with the momentum of a stopped vehicle.

Self-driving cars can be programmed in a way where that works: they keep the maximum distance between themselves and the vehicle in front of them using a math formula. They could use a laser to scan how big the vehicle is and what it's doing, and then work out how much they would need to slow down if it is carrying or hauling something that could fall out. That includes small cars, where children sometimes throw stuff out the windows. If the car can actually see inside another car, see that the window is down, a child is by the window, and an item within reach could fit through the window, it would know to slow down to the degree that it could not be hit by the item at a rate that would do damage to anyone besides the car itself.

There has to be some type of mathematical formula where, if you are this distance behind the car, you can stop at this rate and be safe. Plus you have to keep in mind that people pull out in front of other people, so the car has to make immediate decisions, because another car might just pull out in front of it, and if it's going 60 miles an hour it has to know how to immediately stop.
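The "mathematical formula" the commenter gestures at is standard kinematics: total stopping distance is the distance covered during the driver's (or system's) reaction time plus the braking distance v² / (2a). The sketch below makes that concrete; the deceleration and reaction-time values are illustrative assumptions, not figures from the comment.

```python
# Stopping-distance sketch: reaction distance plus braking distance
# v^2 / (2a). The deceleration (~0.7 g on dry pavement) and the
# reaction time are assumed illustrative values.

def stopping_distance(speed_mps: float, decel_mps2: float = 7.0,
                      reaction_s: float = 1.5) -> float:
    """Metres needed to stop from speed_mps: distance travelled during
    the reaction time plus the braking distance v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

MPH_TO_MPS = 0.44704
d = stopping_distance(60 * MPH_TO_MPS)  # the comment's 60 mph case
print(round(d, 1))  # roughly 91.6 metres under these assumptions
```

At a typical car length of about 4.5 m, that distance is far more than nine car lengths, which is why following-distance rules scale with speed rather than using a fixed count.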
youtube AI Harm Incident 2019-05-13T19:1…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
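A raw response like the one above can be parsed and validated before the codes are stored against each comment. The sketch below is a minimal example: the allowed labels per dimension are inferred from the codes that appear in this section, not from a published codebook, so treat the `ALLOWED` sets as assumptions.

```python
import json

# Allowed labels per dimension, inferred from the codes visible in this
# section; a real pipeline would load these from its codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping from comment id to its dimension codes, dropping any entry
    that uses a label outside the allowed sets."""
    coded = {}
    for entry in json.loads(raw):
        codes = {k: v for k, v in entry.items() if k != "id"}
        if all(v in ALLOWED.get(k, set()) for k, v in codes.items()):
            coded[entry["id"]] = codes
    return coded

# Two objects taken verbatim from the raw response above.
raw = '''[
  {"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user",
   "reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''

codes = parse_codes(raw)
print(codes["ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg"]["reasoning"])  # deontological
```

The second id is the comment shown on this page, and the extracted codes match the Coding Result table (responsibility: user, reasoning: deontological, policy: none, emotion: indifference).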