Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We lived in Arizona for 9 years. There was a company that had driverless cars. Well, there was one incident where there was a person sitting in the drivers seat but they were just to observe what was going on in front of them. They weren't supposed to drive. Just monitor. One evening there was a woman with a bike crossing in an area where there wasn't a crossing. She wasn't crossing at a light. The driverless car struck her and killed her. The car has some sort of radar tech thats supposed to sense when there was an obstacle in the path, and stop. The person in the driver seat was looking on her phone. Now, we would be stupid to think that if this person who is getting paid to observe what the driverless car was doing isn't paying attention....why would a driverless car owner or a driverless truck driver, pay attention. The idea is that they shouldn't have to pay attention. The car should be able, through technology to navigate, flawlessly, the road in ALL circumstances. Right?! If a driverless car is prone to mistakes, whats the point?! Mistakes ....we as humans have that covered! I realize that they were testing it at the time but it seems that they should probably have gotten the the kinks out before putting it on the road. Shortly after the accident you didn't hear or see much of the driverless cars in my area. Clearly, they think they've worked out the flaws. Hmm. We'll see.
youtube AI Jobs 2025-06-02T12:4…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx1Hory0LPb-fDR7bV4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzOYSuIC1QIyxVZuox4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwm_VZB80F0TOTdafB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwkh5OCx1-2VM2hN2t4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwfce-irQyVgekw6yN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxTn5SQUTLrbjXJ-OV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyMazUDF6T6ZI-wLxd4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx0kql3NpIaibXNu_V4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxo0XY-qrXh4DPZEJR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzoGWfdq5yI5n6nD654AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
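A raw response like the one above can be parsed and sanity-checked before the labels are stored. The sketch below is a minimal example, assuming the four dimensions shown in the coding table; the allowed label sets are inferred from the values visible in this export, not from a documented codebook, so adjust them to the actual scheme.

```python
import json

# Assumed label vocabularies, inferred from the values seen in this export.
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear",
                "resignation", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    raise if any record is missing a dimension or uses an unknown label."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}"
                )
    return records

# Usage with one record from the response above:
raw = ('[{"id":"ytc_Ugwkh5OCx1-2VM2hN2t4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # fear
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a label outside the coding scheme, which would otherwise silently pollute the tallies.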