Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "This is dumb, fake and has malicious intent. Large Language model don’t have not…" (`ytc_UgxgqRGVh…`)
- "This is hilarious. It's just total rubbish. Around 2:47 the narrator says (parap…" (`ytc_UgydJL7Bi…`)
- "The irony is that all of the "ai bros" *have* drawn pictures, etc, but most of t…" (`ytr_Ugw-UuVrK…`)
- "@AnthonyKing-e7t4fI don't know if AI is a hoax. But if it's not, we can't leave…" (`ytr_Ugx3iYiKF…`)
- "That is because new jobs and things came up that allowed people to still work, u…" (`ytr_UgzMzJO6m…`)
- "I live in Los Angeles. Many human drivers here are a danger to themselves and ot…" (`ytc_Ugz3XLbWu…`)
- "Isn't Bernie repeating the doom and of gloom rhetoric of the 19-th and 20-th cen…" (`ytc_UgyzizNKJ…`)
- "If you think companies will hang on to headcount to create a consumer class then…" (`ytc_UgzDUXo6W…`)
Comment
The woman never saw the car coming because it was driving in "Autonomous Mode" with the headlights *OFF*.
How then to explain what you see in the video? Here's a neat trick. Take your TV/Stereo/whatever remote, look directly at the LED in the end of it and press any button. What do you see? Nothing, because the human eye can't detect the infrared beam. Now, take any digital camera, (most cell phones will work except iPhones - I don't know why). Look at the led through the view window on the camera and press the button again. What do you see? The digital camera "sees" the infrared and interprets it as a white light to your eye, *just like the digital camera that took the video of the accident*. What you see in the video is the infrared scan that the CAR SAW, not the driver, nor the woman crossing the street because the headlights were turned off. Notice there are no "Hot Spots" in the scan such as you would get from two separate headlight beams, just a "stripe" in front of the road.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Posted | 2018-05-24T06:3… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzZSnnfj59UOcWUNUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzMUqIxyPPq82ZPwH14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznW18G3AMIE_uCDFB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwtM4pOivcujKaP98B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxXlsYlR39y7k7YXLl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz0LPFwCBLHB8OSH4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxKr-UyshtI6A1hDOd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKQZ3qrwjF4MfFJGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzBsJ6NOsaDQOsdEPx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw6xrdTx1uwcPo-yZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
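The raw response above is a JSON array, one object per coded comment, with one value per dimension. A minimal sketch of how such a batch could be parsed and validated is shown below. The allowed values are inferred only from the samples visible on this page, and `parse_coding_response` is a hypothetical helper, not part of the actual pipeline; the real coding scheme may include values not seen here.

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "outrage", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records.

    A record is kept when it has an "id" and every dimension holds a
    value from the inferred codebook; anything else is silently dropped.
    """
    coded = []
    for rec in json.loads(raw):
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded.append(rec)
    return coded

raw = (
    '[{"id":"ytc_X","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_Y","responsibility":"nobody","reasoning":"mixed",'
    '"policy":"none","emotion":"outrage"}]'
)
print(len(parse_coding_response(raw)))  # → 1 (the second record has an unknown value)
```

Dropping malformed rows rather than raising keeps a single bad record from discarding the whole batch, which matters when one response codes many comments at once.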