Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@zmaxx21 Unless the programming was done in a negligent way, the programmers will always have the bounds of the law at their disposal. The rules for stopping at a stop sign are quite clear. If the person to your left at a 4-way stop who arrived at the same time you did tries to move, that person has violated the right-of-way law - 100% liability is placed. The only shady gray area is for one-off situations like what if a skydiver or a plane lands on the road ahead of you and your only option is to hit it or swerve into another lane and hit another vehicle. There are no laws for this. The self-driving vehicle would have been doing everything fine until the moment the plane is touching down at a distance shorter than the braking capability of the vehicle. And sadly the AI will do better than most humans. I have 30+ years driving without a single accident in Los Angeles County. Raced auto-x when I was younger and had reflexes. Most of the drivers on the road now are idiots a la Homer Simpson, and driving is a task that takes secondary priority to posting on Instagram or shooting a TikTok moment. The trial lawyers will always go after the bigger pockets, and that's the company that wrote the AI algorithms - almost never the drunk homeless guy throwing rocks off an overpass. The only way the engineers won't be blamed is if the company bankrolled the US Senator or Congressperson. It's not about preservation of life - it's about money. As a hospitalist physician I see people spending 100k+ on the last 5 days of their life in a hospital, but they refuse to pay the $10 copay for an annual physical when they're in their 40s. It's never about life until after the fact.
Source: youtube, posted 2023-05-30T22:5…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           liability
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
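Each of the four coded dimensions takes a value from a closed label set. As a rough sketch of what checking a coding against that set could look like, assuming label sets inferred only from the batch shown on this page (SCHEMA and validate are illustrative names, not part of the pipeline):

    # Allowed labels per dimension, inferred from the batch below; the real
    # codebook may define additional values.
    SCHEMA = {
        "responsibility": {"user", "company", "none", "unclear"},
        "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
        "policy": {"liability", "regulate", "none"},
        "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
    }

    def validate(record: dict) -> list[str]:
        """Return the dimensions whose value falls outside the known label set."""
        return [dim for dim, allowed in SCHEMA.items()
                if record.get(dim) not in allowed]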
Raw LLM Response
[ {"id":"ytr_UgyJTiSXsv2HxalX1Ld4AaABAg.9qLVnqQvkX29qLuH_UBKBx","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytr_UgyJTiSXsv2HxalX1Ld4AaABAg.9qLVnqQvkX29qM7yl9admv","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"}, {"id":"ytr_Ugxm7XNzKocMoy8ioWB4AaABAg.9qLUVuCCcx29qNBysve-Y7","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgzqIwROMLlXrXOxkYB4AaABAg.9qLS5fQCWz39qMOQ_YeL1G","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}, {"id":"ytr_UgzqIwROMLlXrXOxkYB4AaABAg.9qLS5fQCWz39qTtRsOIW2z","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}, {"id":"ytr_Ugx0yEpS-DkEw_iu8Vx4AaABAg.9qLLqXKctLg9qLRQ1V4nYO","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"fear"}, {"id":"ytr_Ugx0yEpS-DkEw_iu8Vx4AaABAg.9qLLqXKctLg9qMQb0beZIU","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxzmVvO8JsmqM6IYi94AaABAg.9qLJ7bNJFFh9qLSVhAtjAG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytr_UgxzmVvO8JsmqM6IYi94AaABAg.9qLJ7bNJFFh9qLvhNXAGeF","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxzmVvO8JsmqM6IYi94AaABAg.9qLJ7bNJFFh9qO3emJHI2K","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"} ]