Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It looks like the programmers really don't know how to drive very well. After all, these self-driving cars only do what they are programmed to do. And they weren't programmed to stop when a school bus comes to a stop to load or unload children. Who'd of thunk that leaving the programming of a self-driving car to people who are not the best of drivers would result in self-driving cars that were not so good at driving? Here's a hint, Waymo. Hire a professional driver to provide you with the core directives that your cars need to follow. Also program in the driver's manual that people are given to learn in order to pass the drivers license exam. And create a decision-making algorithm to determine when it's permissible to break traffic law. For example, which is safer... driving the speed limit on the highway or driving with the flow of traffic on the highway, even if it exceeds the posted speed limit?
Source: reddit · AI Harm Incident · 1765223831.0 · score: -2
Coding Result
Responsibility: developer
Reasoning: deontological
Policy: liability
Emotion: outrage
Coded at: 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_nt078e5", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_nszorut", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_j446br5", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_j3wvoeg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_j3y8nqt", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
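A minimal sketch of how a raw response like the one above can be parsed to recover the coding for a single comment. The model returns one JSON object per comment in the batch, keyed by `id`; here the id `rdc_nszorut` is assumed to belong to the comment shown above, since its fields match the Coding Result exactly. Variable names are illustrative, not part of any tool's API.

```python
import json

# Raw LLM response as displayed above: a JSON array with one coding
# object per comment in the batch.
raw = """[
  {"id": "rdc_nt078e5", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_nszorut", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_j446br5", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_j3wvoeg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_j3y8nqt", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index the batch by comment id so any coded comment can be looked up.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Assumed id of the comment above; its fields match the Coding Result.
result = codings["rdc_nszorut"]
print(result["responsibility"], result["reasoning"],
      result["policy"], result["emotion"])
# → developer deontological liability outrage
```

Indexing by `id` rather than by list position guards against the model reordering or dropping entries in a batch; a missing id then surfaces as a `KeyError` instead of a silently misattributed coding.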