Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with their comment IDs):

- ytr_Ugz1vNra0… — "No, they don't. Artists observe and learn to create works that are derivative bu…"
- rdc_gtctjpr — "I mean that's kinda a shitty way to look at things though. you could say the sam…"
- ytc_Ugw3yFBxg… — "The real problem with AI is that the tech industry is basically a cult at this p…"
- rdc_l98lx24 — "I'm hoping it will prevent us from seeing deepfake revenge porn of Ted Cruz. On …"
- ytc_Ugz-KS1nl… — "Lawyers EXPLAIN it. You get one malpractice case go against 1 radiologist the In…"
- ytr_UgyrJ6fVj… — "Musk also warned about AI. Forgot his exact words, but it didn't sound so good.…"
- ytc_UgwSJ6RMC… — "It's funny that a writer, a creative, would replace another with AI. Honey you'r…"
- ytc_Ugw6vrbcY… — "We're dooooooomed.... destroy your electronic tools before they come after you..…"
Comment
Three things: First, crossing the road, on a dark road, without any reflective gear, and without looking to see whether cars are coming - the main blame here goes to the pedestrian.
Second, the time between when she became visible, and when the car hit, is one second. That's barely enough time for an alert driver to start depressing the break pedal, so while the "observer" in the car is to blame for not paying attention, had she payed attention, it wouldn't have made much difference.
Third, an empty, unlit road - why doesn't the car have long lights on? Shouldn't the AI recognize the lighting situation and switch them on? And if the AI failed to do that, shouldn't the observer have done it? Had the long lights been on, and the observer be actually observant, there's a chance that an accident could have been avoided - or at least there would have been enough time to break significantly; and to sound the horn to alert the pedestrian to the danger.
youtube · AI Harm Incident · 2018-03-22T12:4… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwZGJQVFhhfAoxyYbp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyruTP4wNiUY9POttJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx61-v8wPFo15TLnm54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw7nL9sDBF_EsRuC4J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwUQZPD0JvYj892xkZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwmHjPsc_hvVK0MCah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxI0QdUij0_B8-T4Cp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxIgOSX75DZyc4h6wR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyqtI8Flut6myfr0jV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzq2JwLhXDXAuDZghl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
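Because the raw response is a JSON array keyed by comment ID, the "Look up by comment ID" operation reduces to parsing the array and indexing it by `id`. A minimal sketch in Python, using two entries copied from the response above (the lookup ID is taken from the first entry):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Two entries copied from the response shown above, for illustration.
raw = '''[
  {"id": "ytc_UgwZGJQVFhhfAoxyYbp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyruTP4wNiUY9POttJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the array by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for a single comment ID.
row = codings["ytc_UgwZGJQVFhhfAoxyYbp4AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# → user consequentialist none indifference
```

The same index can be built once over the full response and reused for every inspection, which is all the "Look up by comment ID" view needs.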