Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "One thing AI can't do well is detect genuine reactions from people. It would ta…" (ytc_Ugxvy0_jY…)
- "Ok but like you don’t have to use a program that inherently steals other people’…" (ytc_UgyxpBpsC…)
- "Well... The more AI steals from artists, the less you'll be able to sustain your…" (ytr_UgwQFSWlB…)
- "There will be no need for hollyweird celebrities in the near future. No more sp…" (ytc_UgwYO7WvZ…)
- "Gemini will never recover as no one will ever trust it. RIP GEMINI time to rebr…" (ytc_UgwvfoYJx…)
- "Just like cigarette companies are legally held responsible for not having a nega…" (ytc_Ugx3TkGqV…)
- "Even if Claude is more expensive, I will 100% support an AI company that just lo…" (rdc_o80dj5u)
- "Sabine: 2:18 *_"these people... have no idea how LLMs work"_* Ok. Here's the …" (ytc_UgyJtrhLu…)
Comment
The problem was not the RADAR but the AI. The RADAR would have picked her up (had the RADAR failed the AI would not have continued to operate the car). The AI is trained BY EXPERIENCE to predict what things mean. ALL of the AI experience would have been based on cars that had VISIBLE headlights while driving at night. THIS car was being operated at night with headlights adjusted so low that, while technically visible, any pedestrian would have inevitably have misinterpreted them as being much farther away than they actually are because all of the pedestrian's experience would also have been based on experience with cars at night not having headlights below a minimum threshold level and THIS car had a level far above it.
THE CORE PROBLEM WAS THAT THE HEADLIGHT ILLUMINATION WAS FAR BELOW THE MINIMUM LEVEL IN >BOTH< THE AI'S AND THE PEDESTRIAN'S EXPERIENCE CAUSING >BOTH< THE AI >AND< THE PEDESTRIAN TO FATALLY MISINTERPRET WHAT WAS GOING ON.
Source: youtube · AI Harm Incident · 2018-04-22T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
{"id":"ytr_UgwaQzmgK_KKkiwOp-t4AaABAg.8ukQcDbbQIH8yPg9b_4PW-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwESeYnecaolpcJ2X94AaABAg.8e66p0j8Dfg8e6J7Ck2xYv","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytr_UgwESeYnecaolpcJ2X94AaABAg.8e66p0j8Dfg8fLfUgCb06b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxI1mfWYxQO_RzH72p4AaABAg.8e7f_Y4Iem98hAicM1w8lK","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzDtP4UH_xyWG2f-E14AaABAg.8e933x6Pdsh8eFB_Sk3_Ol","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
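Each raw LLM response is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion), and the tool's look-up view retrieves a code by comment ID. A minimal Python sketch of parsing one such response and indexing it by ID — the allowed-label vocabulary below is an assumption inferred from the values visible in this dump, not the full codebook:

```python
import json

# Allowed labels per dimension, inferred from this dump (assumed
# vocabulary -- the real codebook may define additional labels).
DIMENSIONS = {
    "responsibility": {"ai_itself", "developer", "user", "unclear"},
    "reasoning": {"consequentialist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"resignation", "unclear", "outrage", "indifference"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response and index the codes by comment ID.

    Rows carrying an unexpected label in any dimension are skipped
    rather than guessed at, so the index only holds well-formed codes.
    """
    index = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in DIMENSIONS}
        if all(codes[dim] in allowed for dim, allowed in DIMENSIONS.items()):
            index[row["id"]] = codes
    return index

# Abbreviated example input (hypothetical short IDs for readability).
raw = """[
 {"id": "ytr_a", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none",
  "emotion": "resignation"},
 {"id": "ytr_b", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "unclear",
  "emotion": "unclear"}
]"""
index = parse_coding_response(raw)
print(index["ytr_b"]["responsibility"])  # developer
```

Skipping malformed rows (instead of raising) matches how a coding pipeline typically degrades: one bad row from the model should not discard the rest of the batch.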