Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
IT WAS NOT AT ALL THE PEDESTRIAN'S FAULT. THE CARS HEADLIGHTS WERE SET ABSURDLY LOW. Essentially the monitor was driving blind and could not see more than 40 feet or so in front of the car!
The normal range of headlight settings does not allow the lights to be set at such a low angle. The only way that can happen is if BY DEFECTIVE DESIGN OR MANUFACTURING FLAW the adjusting screws just come completely out of the thread. There should be a stop on the end of the screw to prevent that. Apparently whoever made the last headlight adjustment - probably as part of his/her normal procedure, ran the screw to the limit of adjustment and then adjusted up til they hit the target. But since the stops were not there it ran off the end and there was nothing that could be done to correct the problem without disassembling the headlights. And, probably thinking it was a flaw only on that headlight, did the same on the other side.
He/she had to know what happened BUT THE CAR WAS RELEASED FOR USE ANYWAY. Some supervisor probably thought since the car could "see in the dark" it would be OK to drive that way until there was time to do the repair. (Maybe the monitor himself had driven it in for the maintenance and the technician didn't even know how to disassemble the headlights.)
IN EFFECT THAT MEANT THAT THE CAR WAS BEING DRIVEN WITHOUT A MONITOR.
Whoever made the decision to let the car be driven at night with the HEADLIGHTS TOTALLY USELESS (because they could not detect anything until it was so close that there wasn't even time for the monitor to hit the brakes) should be held fully responsible for the accident.
THE PEDESTRIAN WAS MISLED BY THE FAULTY HEADLIGHTS INTO THINKING THE CAR WAS A LOT FARTHER AWAY THAN IT REALLY WAS. The pedestrian was not remotely at fault here.
Obviously either the AI and/or the LIDAR was also faulty, because the LIDAR does not use the light from headlights to work.
The person making the decision to let the car be driven at night knowing full well that the monitor could not see, in effect OKed the driving of the car without a monitor and should be tried for criminal negligence.
Platform: youtube
Category: AI Harm Incident
Posted: 2018-03-25T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"ytc_UgyMWK-nvR7FtyjuiON4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbznWimuiIRA8b02J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgJm2foN3ZPF_9ded4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgygZAHb0p3MSFR23s54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx9hcK6Q1Zr5LNiXrh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"}
]
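The raw response above is a plain JSON array of per-comment records, each carrying the five coding dimensions. A minimal sketch of looking up one comment's codes by ID (the field names come from the response above; the `lookup_codes` helper itself is hypothetical, not part of the tool):

```python
import json

# Raw model output: a JSON array of per-comment coding records,
# shortened here to two of the entries shown above.
raw_response = '''
[
 {"id":"ytc_UgyMWK-nvR7FtyjuiON4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgygZAHb0p3MSFR23s54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
'''

def lookup_codes(response_text: str, comment_id: str):
    """Return the coding record for one comment ID, or None if absent."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

codes = lookup_codes(raw_response, "ytc_UgygZAHb0p3MSFR23s54AaABAg")
print(codes["responsibility"], codes["emotion"])  # company outrage
```

Note the coding table above is just a rendering of one such record (responsibility, reasoning, policy, emotion) for the displayed comment.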