Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugw5yObIX…: If you want to drive...drive, if you don't? Hire a cab....this AI crap is deadly…
- rdc_da461xg: Whoa Whoa Whoa. We are only allowed to hate on America in this sub. Not so fast …
- ytc_UgyhdCUBy…: I think that Hanson robotics obviously hired the most "relaxed chilling on the b…
- ytc_UgynUg7xE…: If these moronic ceo's come asking me to fix their bullshit AI code, you can bet…
- ytc_Ugxs0FZ15…: hear me out: AI "could" be an incredible tool for artists and non creatives alik…
- ytr_UgxjtPNiV…: Hey NeedzTz! Thanks for watching! If you found Sophia's insights intriguing, you…
- ytc_UgyrBr0rS…: Wow. Even PhDs can be not very bright. Using AI in the first place. Being polite…
- ytr_UgxiuWGvq…: @knightsnight5929 No car company has anywhere close to as many miles driven wit…
Comment
Autopilot totally contributes to accidents in two ways - first by being blind to hazards (such as above), plowing straight into things and second by allowing drivers to become unattentive to the road and conditions. It is human nature that if a car has an "autopilot" or "full self drive" they become distracted, they play on their phones, they look around, they stop looking ahead because the car is driving them right? Wrong. And Tesla has a minimal lame method of requiring the driver to demonstrate attention - hold the wheel. Not only does it not ensure attention but it is easy to subvert and completely broken in some online clips (i.e. people take their hands off). So yes this technology kills people. But Tesla can't sell cars without a kewl, broken feature with a misleading name. A proper system would monitor a person's face for attentiveness and disengage if it wasn't there. But not Tesla. And Tesla also disengages the autopilot a split second before the crash to pretend it wasn't on at the time, to mislead the public when crashes happen.
And it will only get worse when their "robotaxi" appears. Passengers will get to enjoy plowing into hazards without any chance of stopping it because there will be no wheel or brakes. I'm sure they'll still find ways to blame passengers or they'll shove the liability onto operators foolish enough to buy one of these taxis under the delusion they'll make money.
youtube · AI Harm Incident · 2025-05-13T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx0L1qHglX3pK9gPYd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz4uJIB8QA94SJXJh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy7UTlST1n9Yn0_Dnl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUgmr3H29k7l5eOFB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsMR8mTA-DjIx5ilJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyb7iz4khmOa1kskUZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxIUrI3kcZaz93ppHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8cUF7UZR4JVo9ry94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjhfHKMc210FQq_yl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzil9Bn_RkffBN3O-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
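The raw response above is a JSON array of per-comment records, one per dimension set shown in the coding-result table. A minimal validation sketch follows, assuming the category vocabularies are exactly the values visible on this page (`responsibility`, `reasoning`, `policy`, `emotion`); the full codebook may define additional values, so `ALLOWED` is an inferred subset, not the authoritative schema.

```python
import json

# Assumed vocabularies, inferred from the values visible in this page's
# coding table and raw response; the real codebook may include more.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must carry a comment id plus all four dimensions.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate(raw)))  # prints 1
```

Filtering rather than raising keeps a single malformed record from discarding an entire batch; rejected ids can be re-queued for recoding.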