Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgyilDWZ6…`: "People will have too much free time. Crimes will increase, an increase in bad me…"
- `ytr_UgzTZqdl8…`: "We appreciate your humor, but it's always important to remember that AI is desig…"
- `rdc_dy88st1`: "the ~~Chinese~~ west buying property in ~~Vancouver~~ Thailand is a travesty tha…"
- `ytc_UgyD1VyXN…`: "I understand the story regarding deepfake porn is important, but I don't feel an…"
- `ytc_UgzTxkM1m…`: "Nothin silicon valley produces ever is used for the benefit for man kind. They a…"
- `ytc_UgwHPKPtf…`: "AI is just images spliced together. The human did nothing. Art is all the huma…"
- `rdc_ohy85do`: "The modern loneliness crisis is really boiling down to addictive technology taki…"
- `ytc_UgyXEpUFZ…`: "I don’t mean to sound uneducated but how is this a bad thing? Sitting in a truck…"
Comment
This is clearly an autonomous-mode failure. The car had no business driving over the speed limit at night (algorithm fault). Both LIDAR and radar should have easily picked up such a target, even without any lights at all (slow moving, large cross-section). Even the camera should have provided enough data for the car to TRY to react (it can react almost instantly). To say that we couldn't possibly have expected more from the car in this situation is completely ridiculous.
The officer who made the comment is no technology expert. He was shown footage from a camera with frankly terrible dynamic range and concluded that if he had been driving while looking at the camera feed, he would probably have done no better. The point is that this camera does not have better night vision than a human eye (it seems a lot worse) and that the car shouldn't have been driving while relying on that camera feed. For a multi-sensor robot, this should have been one of the simplest dangers to avoid. It's the car's fault, unless the data from the other sensors shows that the woman was waiting in the shadow in the opposite lane, that the car concluded she would continue to wait for it to pass, and that she then moved in front of the car at the last second. I don't see any other explanation apart from multiple sensor failures.
youtube · 2018-03-22T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
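A coded record like the one above can be checked mechanically before it is stored. The sketch below is illustrative only: the allowed value sets are inferred solely from the codes visible on this page (the real codebook may define more categories), and the `validate` helper is a hypothetical name, not part of the tool.

```python
# Allowed values per dimension, inferred only from codes visible in this dump;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "approval", "fear", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record; empty means valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table above.
example = {"id": "ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg",
           "responsibility": "ai_itself", "reasoning": "consequentialist",
           "policy": "liability", "emotion": "outrage"}
assert validate(example) == []
```

A record missing every dimension would yield one problem per dimension, which makes partial model outputs easy to flag.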
Raw LLM Response
```json
[
{"id":"ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwy27zHitEZBoXDeJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx98EBWYallMqPPMvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxiKooY4rKZoh0SSRt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5G_XUW7ns7-yMxrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxTD9zju23ZKhg72Ax4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzj9kIO3ZzulRYXqOd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwj5yEuavPRmLqR-4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzdJSUrC_0JQqSHhj94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxlsnAQvcFMG9H9p5F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
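Looking up a coding by comment ID amounts to parsing the batch response and keying each record on its `id` field. This is a minimal sketch, assuming the response is always a JSON array of flat records as shown above; `index_by_id` is a hypothetical helper, not part of the tool, and the raw string below is truncated to two records for brevity.

```python
import json

# Raw model output in the format shown above (truncated to two records).
raw_response = """[
{"id":"ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwy27zHitEZBoXDeJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {dim: rec[dim] for dim in DIMENSIONS} for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg"]["responsibility"])  # → ai_itself
```

Keying by ID also makes it easy to join the model's codes back onto the source comments, since the `ytc_`/`ytr_`/`rdc_` prefixes identify the originating platform.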