Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
So basically, a guy who ALREADY was crazy .. went online, and misused online res…
ytc_UgyzUwlnI…
AI just openly telling us its going to destroy us is wild. Also where are the so…
ytc_Ugy-vEoNG…
The problem isn't AI, it's capitalism. The goal of life isn't to work, it's to b…
ytc_UgwIY3vyG…
I stand against ai art because even though we are told to do things that other p…
ytc_UgyeOtFMZ…
No, you're wrong. Computer science is being totally transformed by LLM use. Very…
ytc_UgxC3zUFV…
I'm going to say something that might be seen as controversial.
But that part a…
ytc_UgzwF28fq…
I use AI every day for my job. Maybe I'm wrong but I really don't see robot plum…
ytc_Ugy6x7JXp…
I've been using AI for about a year and a half. First of all, as a tool, the way…
ytc_UgzYVmwNz…
Comment
To be honest, I don't think I could avoid the collision even if I were focused on driving, with the person suddenly showing up on a dark road like that, at that speed.
If I have to choose whom to blame, I would choose the technology failure. It is the easiest thing to blame, and it can force engineers to improve the outcome in similar situations in the future. I think the algorithm for such situations should anticipate potential obstructions around the car's lane more inclusively (any object moving in the car's lane, or in the two lanes on either side of the car, would be included).
One theory I have is that the LIDAR worked and saw the woman, but the speed calculation / estimation for her was not done correctly or in time for the computer to decide to react differently. This is a tough area for computers: translating and interpreting moving objects into effective decision-making factors. Yes, it sees the woman, but for some reason it does not think she is going to occupy the same place as the car at the same time.
youtube
AI Harm Incident
2018-03-24T16:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwRetTsi4i0BqNRF114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9Q4IOXYexIL_Uknd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxcAqQObdXaeSzh81B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwX_g2oZkBEcBYpK1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwo3xZi5Qa15kzDWnR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy1QL1_yfOFfFRBVvh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTduAF4Rg9I0AUbUx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzsDN3E4w5XR8_3azJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwm7oRC8jx_I495dNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwr0t6NT-Q7TyAiMbd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
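The raw response above is a JSON array with one record per comment, keyed by the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and validated before ingestion; the allowed value sets below are inferred from this one sample, not from the project's full codebook, so they are assumptions:

```python
import json

# Allowed codes per dimension, inferred from the sample output above.
# The actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "resignation", "mixed", "outrage", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset all carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

sample = ('[{"id":"ytc_Ugwm7oRC8jx_I495dNx4AaABAg",'
          '"responsibility":"ai_itself","reasoning":"consequentialist",'
          '"policy":"none","emotion":"resignation"}]')
coded = validate_batch(sample)
print(coded[0]["responsibility"])  # ai_itself
```

Rejecting unknown codes up front keeps a single malformed LLM response from silently polluting the coded dataset.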