Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’ve been driving a 53’ tractor-trailer since 2009. I know I would not feel safe driving anywhere near a driverless semi, even with a driver behind the wheel. 1, It brings your guard down to think it’ll do what it’s supposed to do. So, your reaction time isn’t there, in case the computer screws up. 2, Common curtesy is out the window. Say you’re coming up to a 4 way stop, with you and three other vehicles at the same time? Who goes first? And when there’s a pedestrian looking to cross that intersection, you think the car cares? 3, City streets are deadly. You have to be extra careful of pedestrians all the time and the majority of those streets weren’t made for 18wheel semi-trailers. You have to take up two lanes to make those tight turns, while timing it with traffic and the light AND pedestrians to pull it off. 4) I drove trucks with auto breaking that sense a car slowing down in front of you and I can tell you, they aren’t 100% fool proof. They can cause rear ended accidents by stopping too quick and because that driverless truck ain’t looking for a way out, but straight ahead and it’ll stop way to fast in bad road conditions, when a professional truck driver will anticipate this knowing whether or not they can safely take the lane next to them. There are way too many variables that driverless vehicles can NOT do safely like a human being aware of their surroundings can, let alone like a professional driver. You also forgot to mention the other driverless vehicles that ran red lights. Cars are still considered deadly weapons. Is there a failsafe way to know who was at fault, without infringing on personal space of a cam?
youtube AI Harm Incident 2018-03-21T05:3… ♥ 4
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugyux2lWWpEYzKceWU14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyHxwGcJPJeqscttmB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyFhhdfpN4UKe-wlu54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMsQi-49KG-ply2NZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxWtcsTlzuZ6jplJvR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzNV3JfGMbpeUxUZIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwGLMWXP2oDBsmk63F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3vpJBhmWs00hXoBx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzeJQk6a40o8CT45bl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyN9I6hzvJpWgxt5IN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"}
]
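The raw response above is a JSON array of per-comment codings, each record carrying the four dimensions shown in the Coding Result table. A minimal Python sketch of how such a response could be parsed and indexed by comment id for inspection — the record shown is copied from the array above, and the dimension list is taken from the table, but `parse_codings` is a hypothetical helper, not the project's actual pipeline:

```python
import json

# One record copied verbatim from the raw response above, for illustration.
RAW = ('[{"id":"ytc_Ugyux2lWWpEYzKceWU14AaABAg",'
       '"responsibility":"ai_itself","reasoning":"deontological",'
       '"policy":"ban","emotion":"outrage"}]')

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding},
    rejecting records that are missing any dimension."""
    codings = {}
    for record in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in record]
        if missing:
            raise ValueError(f"{record.get('id')}: missing {missing}")
        codings[record["id"]] = {d: record[d] for d in DIMENSIONS}
    return codings

codings = parse_codings(RAW)
print(codings["ytc_Ugyux2lWWpEYzKceWU14AaABAg"]["policy"])  # → ban
```

Indexing by id is what makes the exact-output inspection possible: given any coded comment, its record can be looked up directly in the raw array rather than by position.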