Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
​@imzjustplayin yes, I agree, and if every Tesla owner would see it that way (and act accordingly), there certainly wouldn't have been nearly as many severe accidents. But I think Tesla really were trying to oversell this as a completely self-driving car of some sort, then they added the disclaimer that the driver would still have to monitor everything and be ready to take over the wheel at any moment, to strip themselves from any legal responsibility. "Assisted Driving" isn't as easy to sell as "self-driving", and while some people would still find this a useful and neat feature, I think there would also be a lot of people who wouldn't see the point in using a technology like that, if they still have to be alert all the time. Also, while I think that a trained computer program that solely relies on cameras could "theoretically" be enough for the car to properly navigate traffic (I mean, a human driver does also almost exclusively rely on their vision when driving), it would have to be able to react to anything unexpected and then react in a reasonable and smart way, which isn't something that could always be expected from a computer program. If it was equipped with lidar, to assist the cameras in interpreting the situation, it would be much easier for it to identify an obstacle, that it wasn't trained on, and would add some very much needed redundancy. It might still not be able to "recognize" a fallen over semi truck as such, but that way, it would still be able to recognize that there is something big blocking the road and immediately apply the brakes accordingly.
youtube AI Harm Incident 2024-12-18T11:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgzcukB2tewyWLcshJx4AaABAg.AC6FhVsFTNWACOGvyCxEiV", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugxdu8GzWV6Qmo3bfb14AaABAg.AC6FNg59KNdAC91i_Cgm_L", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugwz7ZucIIFMXMJCJyt4AaABAg.AC62YkSCO8IAC8YNoYXC1D", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgxqSab4w2qxSfVgLrJ4AaABAg.AC616ZpOHIhACNv-iW8tHb", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzUfRgCPkSnh96iNSh4AaABAg.AC60EinizY6AC7parXb10R", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwtA7mj2SdTcmr9Q0l4AaABAg.AC5mer75ND6AC5u6SHCwD8", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwtA7mj2SdTcmr9Q0l4AaABAg.AC5mer75ND6ACB06wGBjT7", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwtA7mj2SdTcmr9Q0l4AaABAg.AC5mer75ND6ACB1eE65Srv", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwtA7mj2SdTcmr9Q0l4AaABAg.AC5mer75ND6ACBV5kIGVgQ", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgzT_9yzL_OfnKF0jeh4AaABAg.AC4r7rc3_pwAC6EjMu57yT", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
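The raw response is a JSON array of coded records, each carrying the four dimensions shown in the table above. As a minimal sketch of how such a batch can be parsed and sanity-checked (assuming Python; the two records below are copied verbatim from the response above, and the check itself is illustrative, not part of the tool):

```python
import json
from collections import Counter

# Two records copied from the raw LLM response, for illustration.
raw = '''[
  {"id": "ytr_UgzcukB2tewyWLcshJx4AaABAg.AC6FhVsFTNWACOGvyCxEiV",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugxdu8GzWV6Qmo3bfb14AaABAg.AC6FNg59KNdAC91i_Cgm_L",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

codes = json.loads(raw)

# Schema check: every record must carry all four coded dimensions.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
for record in codes:
    assert all(dim in record for dim in dimensions), record["id"]

# Tally one dimension across the batch.
responsibility_counts = Counter(r["responsibility"] for r in codes)
print(dict(responsibility_counts))  # {'company': 1, 'ai_itself': 1}
```

Validating the batch before storing it catches records where the model dropped a dimension or returned a malformed array, which would otherwise surface later as missing codes.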