Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
hopefully.. but they say you must keep a watch on the car, and I think a lot of these people want them so they can just take a nap and wake up at home. And the company likely knows this. Obviously, I'd want a car that could do that for me... but these cars can't be trusted as if you have a taxi cab driver (hopefully a good one, lol). Yet they are being treated that way. For now, it all seems to defeat a majority of it's main purpose, as most people's fantasy is to, now and then, have something to drive them somewhere... or freely choose to drive. Someday, maybe that will be reality. We have driverless vehicles now, but not the kind everyone really wants. So really, those, just don't exist in a sense that it's safe enough. And they need to be trained on curves? There should maybe be a data base of curves, as there are in GPS, to determine what speed not to go over. -- Another thing is that the systems failed, be it a bad cam, crashed computer, bad wires, whatever, it's just a car. How many people have driven in just OK, 15 year old cars, that work, but they have lights that mysteriously flicker, fail, or they are unreliable. What happens to cars like this when they have like 200,000 miles on them, 15 years old, and electronics that begin to slowly fail, blinking in and out, failing circuit boards, locked up systems etc. My cars light in the P R N D L was flickering the other day. It's old. It's got corroding wires. Then what? Old Uber cars failing more and more often? Sounds pretty scary. Do they retire the cars? OR the driverless features? I bet not. I realize the future could be driverless cars, but think they have a lot more thinking to do...
youtube AI Harm Incident 2018-04-02T19:5…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"ytr_Ugyl6FWiLBm1fCWRLH94AaABAg.8e6GB6pbs458eHmwr6C3ag","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
 {"id":"ytr_Ugx_W5JP3_JJewiJ8-B4AaABAg.9AxIRXgY3i69MhZ6ctGuZy","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytr_UgzQY-6PDuflQmMc_3l4AaABAg.8e8Mtlx_QpN8kDsJiDaOQf","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
 {"id":"ytr_UgzlTOd0BLMN6PciZTt4AaABAg.8e6pqh7MS3J8eAF7qHbVTC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytr_UgzlTOd0BLMN6PciZTt4AaABAg.8e6pqh7MS3J8eYelS91elH","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"}]
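A response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal illustration using Python's standard json module; the helper name parse_coding_response and the one-record sample input are hypothetical, but the field names (id, responsibility, reasoning, policy, emotion) follow the response shown above. It also tolerates a stray trailing ")" of the kind seen in the captured output.

```python
import json

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a model response expected to be a JSON array of
    per-comment codes, validating that each record carries the
    four coding dimensions plus an id."""
    cleaned = raw.strip()
    # Repair the malformed capture seen above: a ")" where the
    # closing "]" of the JSON array should be.
    if cleaned.endswith(")"):
        cleaned = cleaned[:-1] + "]"
    records = json.loads(cleaned)
    expected = {"id", "responsibility", "reasoning", "policy", "emotion"}
    for rec in records:
        missing = expected - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
    return records

# Hypothetical single-record input with the same malformed terminator:
codes = parse_coding_response(
    '[{"id":"ytr_example","responsibility":"user",'
    '"reasoning":"deontological","policy":"liability",'
    '"emotion":"indifference"})'
)
```

In practice one would map any value outside the coding scheme's allowed labels to "unclear", which would explain a coding-result row showing "unclear" even when the raw response contains a concrete label.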