Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
360p? Heh... The way you can solve the hacking part is by authentication, where the car only allows access to its software after a handshake with a source it can identify. Also, there needs to be a way to guarantee the integrity of the system before it starts to move. Both are questions being asked in sensitive business applications, and there are already solutions out there that can guarantee it close to (but never quite) 100%. I personally find the ethical question around this all interesting, but not exactly a question for the general public. No one will be able to make an object that weighs a ton and moves around at 100 km/h 100% safe, and the general public will never accept anything less, even if they themselves drive less safely. The resulting answer to the ethical question might be: no one will be held responsible, if the integrity of the software is guaranteed and the software itself adheres to all pre-defined regulations. Personally I'd say that would be fine, if the result of autonomous cars is fewer problems on the road, fewer lives lost, and the resulting money saved outweighs the current costs of handling the problems. That would be my take on it.
youtube AI Harm Incident 2014-05-25T16:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgjmT9M6pRPF63gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugjb5mbCYWFOZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh3t6ctXcIqLngCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgjhRXM2999pMXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjWeooRvjbb43gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiQxrJhhXFfdHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgjQXLSSJdVGungCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggumOOE0wMPhXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgjQ6QxetkdMEHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugg5W5f566W6tngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
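A minimal sketch of how the raw response above can be parsed and matched back to a coded comment. This assumes Python and the JSON array structure shown; the variable names (`raw`, `by_id`) are illustrative, and the snippet uses only an excerpt of the entries for brevity.

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment codes, each keyed by the comment id.
raw = '''[
  {"id":"ytc_UgjhRXM2999pMXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjmT9M6pRPF63gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

codes = json.loads(raw)

# Index the codes by comment id so a coded comment's dimensions
# (responsibility, reasoning, policy, emotion) can be looked up directly.
by_id = {entry["id"]: entry for entry in codes}

print(by_id["ytc_UgjhRXM2999pMXgCoAEC"]["reasoning"])  # mixed
```

In practice, the looked-up entry is what populates the Coding Result table for a given comment.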