Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Hoping you got approval before again testing your non lidar tech with our lives?… (ytc_UgxPQHh9x…)
- Tell me you don’t understand how ChatGPT works without saying you don’t know how… (ytc_Ugyh09MWP…)
- “If AI does everything, a single generation will be enough for the entirety of k… (ytc_UgyvD_Z22…)
- It’s funny, really. Literally everyone not financially tied to AI sees some sort… (rdc_o77384p)
- The only way it would be your art assisted by AI is if you wrote the model and f… (ytc_UgyKBIR6D…)
- I think AI will be more of an advanced collaborator, and will raise the bar for … (ytc_UgwFP7-Bn…)
- At this point we should wonder how easy it is to make sentient ai. Humans have … (ytc_UgzKvaI2n…)
- which is why many institutions decided (long before the advent of AI) to no long… (ytr_Ugy1pazQe…)
Comment
As a software engineer that has designed software for AI applications and embedded systems, I can assure you that no sane programmer would allow the vehicle to be surrounded on all three sides at any given moment. However, if faced with that situation (as I have personally been; due to sudden urban congestion) the correct response is to hit the brakes and hope for the best.
Note that even as a human, I'd never be so close to that truck as to allow it to kill me. Nor would I kill anyone next to me, just to avoid a collision. I sure as hell wouldn't code it that way. Nor would any sane company allow that kind of code. Better to kill the guy who had shit luck than to intentionally murder the guy next to him. No company makes a different choice than that.
Also, note that I've actually had an unexpected box fly out of the back of a truck and hit my windshield at 70 MPH, so I know what the actual issue is. I was certain that it would shatter the windshield and likely kill me, but it obviously did not. It hit the windshield (with me ducking) and made a horrible noise, but I survived and everyone around me survived, because I didn't do anything stupid. It was a "major issue" turned into a non-issue because of a cool head and rational behavior. Exactly the way you'd want a program to behave.
I just sucked it up and hoped that nobody else behind me would crash as a result. It's called taking one for the team. If you can't do that, you have no business being on the road with other sane people. You sure as hell have no business writing software to control that behavior.
The idea that a car can figure out whether a motorcycle rider has a helmet is a total fallacy and red herring. It's highly unlikely that the car has any idea that a motorcycle is actually on either side of it, let alone the helmet issue. As a programmer, if my hardware could figure out motorcycle vs. car (currently not possible), then unless there are 3 motorcycles surrounding me, I would tell the code to slam the brakes and avoid hitting motorcycles at all costs (equal with all other pedestrians), because metal cars are much more protective than hitting raw flesh. Obviously, a human in a car (even the car I'm programming) has a MUCH higher chance to survive a collision, so that decision is a no-brainer.
This video is just fear-mongering bullshit. Any sane programmer would have no problem with this "moral dilemma", though current auto-driving cars do not have the hardware to figure this shit out; and won't for decades to come.
Have a look at the stats of google car collisions VS human collisions to get my point. Over a million miles logged by google AI and ZERO people ever hurt. That includes all of the dip-shits that ran red lights and pulled out in front of them. Case closed. You don't need to go into future-tech where cars are making value decisions about helmets to simply never have a collision, because good coding decisions don't even allow the car to get into the dumb-assed situations that people do every moment of the day.
youtube · AI Harm Incident · 2015-12-21T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgicJ8o6vgL9vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjgjA3QBACveXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgiIRvaFLRy4BXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugi6wxkU3JS5u3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgjSjaD1amn_NHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughy05zsMvO4YHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjFM6BROUj5UHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghkEkbZMbCpeXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UggNzTObvFdx33gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg5W6YbwRYNMHgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
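A minimal sketch of how a raw batch like the one above could be parsed and validated before loading it into the coding table. The `CODEBOOK` values are inferred from the visible samples, not from an official schema, and `parse_coded_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- the real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "outrage", "indifference", "resignation"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip entries missing the comment ID
        # Drop any record whose value falls outside the codebook.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Example: one valid record, one with an unknown responsibility value.
raw = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"mixed",'
    '"policy":"industry_self","emotion":"approval"},'
    '{"id":"ytc_b","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"}]'
)
print([r["id"] for r in parse_coded_batch(raw)])  # ['ytc_a']
```

Validating against a fixed codebook catches the common failure mode where the model invents a category label outside the prompt's allowed set.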