Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I'd rather a person create something *with* AI then make it create everything. W…" (ytc_UgzzGXMca…)
- "Those saying he's a loon ... why doesn't google want the AI to take the Turing t…" (ytc_UgzvSwlN4…)
- "the fact that the first responce you get on google is from AI makes this 1000% v…" (ytc_Ugywc9Xpd…)
- "Bruh, I can’t stand AI artist. AND I WANT ALL THE SMOKE….They are nothing but ta…" (ytc_UgxRPH88I…)
- "4:00 the reason why it can't generate without theft is because it's not an actua…" (ytc_Ugw5iR9bb…)
- "I wonder whose worse, an army of indian coders or an AI? Everytime I have worked…" (ytc_UgwiY65FZ…)
- "Dinosaurs got wiped out by an asteroid / Ai will get wiped out by an rain 😂…" (ytc_UgzD_U0iF…)
- "1:40 well thats a first, Charles you a dummy, if you design a city and the build…" (ytc_UgyVbTtgD…)
Comment

> There's no excuse for this one. I've driven over 100,000 miles as a taxi and Uber driver. You get a free shot at jaywalkers from a legality angle but they're avoidable. I probably could have gotten away with killing a hundred but I've never hit a single one. The riskiest is when you turn a corner in the dark and the jaywalker is not illuminated by your headlights. The first problem with this video is that the camera doesn't see her, but in real life the headlights were right on her. The driver would see her before the camera shows her. Even if she was only visible when 20 yards out as shown here, I would have swerved instantly even at the risk of bumping another car or going off the road. You risk a non fatal accident to avoid a sure fatality. I might need a 1/2 second to hit the brakes but turning a wheel is closer to instant.
>
> The driver was at fault for not paying attention and the autonomous car failed to see and react as well as a human driver.

Source: youtube · AI Harm Incident · 2018-03-22T04:2… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwwVjTh9NFjJvuMN4d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4viWoSK17RWk-2md4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOIjISMnaEISw_Z9J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwJUzDagvk24tFIoNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwcEpBFlhHJa2NGPWR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwHFdrbPL6Leg6C2HB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwvLTvgcr37Uw9vRIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyV019tyG9R0g7rw7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiyW43qT6sIpx0xW14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgypBU9Wklx3CdNBpsV4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
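The raw response above is a JSON array of per-comment codes, one object per comment ID, with the four dimensions shown in the coding table. A minimal sketch of parsing and validating such a batch — the `SCHEMA` vocabularies are inferred only from the values visible on this page (an assumption, not the tool's full codebook), and `validate_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the values seen above.
# ASSUMPTION: the real codebook may contain additional values.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "mixed", "approval"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check every record against
    the schema; raise ValueError on the first violation found."""
    records = json.loads(raw)
    for rec in records:
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

raw = (
    '[{"id":"ytc_UgwwVjTh9NFjJvuMN4d4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"}]'
)
coded = validate_batch(raw)  # one valid record parsed
```

Running the same check over the full batch above would surface any record where the model drifted outside the coding vocabulary before it reaches the database.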