Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a single comment by its ID.
Random samples:

- "You have to study the emerging sci-fi city Neom in Saudi Arabia. It is a fantast…" (ytc_Ugw192HGF…)
- "Too bad it's in Boston where they'll inevitably ruin it all by trying to make th…" (ytc_UgxfcwOrv…)
- "it was not auto-pilots fault, it was the drivers fault. as far as i know, it is …" (ytc_UgwKWUxBN…)
- "The ai filter breaker is me cause like idk how it happened but it did and my eye…" (ytc_UgyLwBCJ5…)
- "Big Tech firms are all owned by psychopaths, so why wouldn't AI reflect their va…" (ytc_UgwYLIILo…)
- "What about the parts of the world where people live form on emeal to the next? W…" (ytc_UgxKI-OYu…)
- "It's not just Trump. We have huge amounts of AI, digital surveillance, thought p…" (ytc_Ugyavpihx…)
- "people are really saying this shit? digital art is a trillion times better than …" (ytc_Ugw4SXfaW…)
Comment

> I would say the Uber tech failed terriable but as with most accidents several parties share fault. The walker was in the road with no reflection material, the guy in the car was not paying full attention (typical for a driver but adding self driving makes it worse) but the lidar and radar should have seen her in time to brake so much more. Since the tech is Uber and the guy behind the wheel works for Uber Uber get most the the blame.

Source: youtube · Topic: AI Harm Incident · Posted: 2018-03-25T13:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
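Each coding assigns one value per dimension. As a minimal sketch, the allowed values below are inferred only from the codings visible on this page (the full scheme may include more categories), and the `validate` helper is hypothetical:

```python
# Dimension vocabularies inferred from the codings shown on this page;
# this is an assumption, not the authoritative scheme definition.
SCHEME = {
    "responsibility": {"none", "distributed", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"indifference", "resignation", "mixed", "outrage", "fear"},
}

def validate(coding: dict) -> list:
    """Return the dimension names whose value falls outside the scheme."""
    return [dim for dim, allowed in SCHEME.items()
            if coding.get(dim) not in allowed]

# The coding from the table above passes validation.
print(validate({"responsibility": "distributed", "reasoning": "mixed",
                "policy": "liability", "emotion": "mixed"}))  # []
```

A check like this can flag model outputs that drift outside the coding scheme before they reach the results table.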
Raw LLM Response
```json
[
  {"id":"ytc_UgwRetTsi4i0BqNRF114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9Q4IOXYexIL_Uknd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxcAqQObdXaeSzh81B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwX_g2oZkBEcBYpK1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwo3xZi5Qa15kzDWnR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy1QL1_yfOFfFRBVvh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyTduAF4Rg9I0AUbUx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzsDN3E4w5XR8_3azJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwm7oRC8jx_I495dNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwr0t6NT-Q7TyAiMbd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
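The raw response is a JSON array with one coding object per comment, so a comment's coding can be looked up by indexing the array on `id`. A minimal sketch, using the one entry from the array above that matches the table on this page (the other nine objects are omitted here for brevity):

```python
import json

# One object from the raw LLM batch response above; the real response
# contains ten such objects, one per comment in the batch.
raw_response = '''
[
  {"id": "ytc_UgxcAqQObdXaeSzh81B4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "liability", "emotion": "mixed"}
]
'''

# Index the batch by comment ID so any single coding can be retrieved.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxcAqQObdXaeSzh81B4AaABAg"]
print(coding["responsibility"])  # distributed
```

This is the lookup the "Look up by comment ID" feature implies: parse once, then retrieve each comment's dimension values by its `ytc_…` identifier.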