Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are a lot of possibilities with current technology, but Tesla and any other manufacturer need to slow down if the “AI” is not sure enough it has a clear road ahead. Two cameras are sufficient for determining distance to objects ahead. But in low-light conditions all algorithms struggle to produce reliable data. And this is known very well. But you can’t sell a car that is afraid to drive at night. There are a lot of alternatives to all systems. Autopilot could just follow another vehicle while keeping a safe distance. Use the high beams, when cameras disagree, to provide better vision. Even a cheap $5 laser can be fired to probe the road ahead. Or even better, night-vision cameras. And I’m sure an army of engineers can probably think of many more ways to make this better and safer.
YouTube | AI Harm Incident | 2025-01-05T22:3…
Coding Result
Dimension        Value
---------        -----
Responsibility   unclear
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg","responsibility":"manufacturer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyzFpnNOkVcKmH_aaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwzp58DTN1IBX_8ERB4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxGdm19dqdWYCXBTRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgywLlKfnvOanu3C2f54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4HRAiHIiHg30CIKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwHooEFPaqbXC2ePTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzqXpcLVaK5rSykubZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgztIu2__P6nFjw0cbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNGo6TFyukcaQc9mR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
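A raw response in this shape (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys) can be parsed into per-comment codes and tallied. The sketch below is an assumption about how the pipeline might do this, not the pipeline's actual code; the `parse_codes` function and the truncated two-entry sample are illustrative only.

```python
import json
from collections import Counter

# Hypothetical sample: two entries copied from the raw response above.
RAW = (
    '[{"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg","responsibility":"manufacturer",'
    '"reasoning":"deontological","policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_Ugwzp58DTN1IBX_8ERB4AaABAg","responsibility":"manufacturer",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: label}}.

    Raises KeyError if any record is missing an id or a coding dimension,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(raw)
    return {r["id"]: {dim: r[dim] for dim in DIMENSIONS} for r in records}

codes = parse_codes(RAW)
# Tally one dimension across comments, e.g. who is held responsible.
resp_counts = Counter(c["responsibility"] for c in codes.values())
```

With the full ten-entry response, the same tally would summarize label distributions per dimension across the whole batch.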