Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- This self driving bullshit should be banned in all North America , if it doesn’t… (ytc_UgzF_C6bC…)
- “People who say ai art has no soul often have none themselves” the literal “evid… (ytc_Ugx3hHMSO…)
- Just as this question to chatgpt “in growing population, with all the automation… (ytc_Ugydn0WHl…)
- i had no idea that it was even possible to poison ai generators. i wanna start d… (ytc_Ugy_AhurR…)
- Interesting. I have very little understanding of AI. My only concern is that peo… (ytc_Ugxc1DGg4…)
- This isn’t really gatekeeping art.. ai is trained to steal artists work and then… (ytr_UgyV-gzDG…)
- There really was a case where the two chatbots started speaking to eachother in … (ytr_UgzDQ_WW3…)
- @henriquealves3086 Note: I did not defend nor support AI usage. I clarified some… (ytr_Ugz-BxOxH…)
Comment
And how many drunk drivers commit the same offense in the US annually and don't even get their driver's license suspended?
I am not aware of any state/city/county where Tesla FSD is approved by any government agency.
I know an area in Austin TX Robotaxi is allowed to run in FSD. However, they also have a Tesla employee in the passenger seat with the ability to stop the car in its tracks if it does / starts doing something unsafe.
Tragic case for sure. Someone driving at night with "auto pilot" whatever that means on, and then drops his cell phone while cruising at 60MPH+ is liable. The technology is not approved for FSD generally, and it is the driver who is negligent. Being ill-informed about the capabilities of the "auto pilot" mode is no excuse. The driver is guilty of negligence and is at fault. How is this not "case closed?"
youtube
AI Harm Incident
2025-08-15T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzauBj5bcqLKs2DU8N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwLZKkNaJvovZqhGel4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwufAfCJgwjxy8pTI54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxioOvBy6zz1KCz_414AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRD51mHqg_DC0BLaR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwuG2ieea0nLuoyH-t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwV-wgNZBYWrlYK0FN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwmayziGuujsKHXGGN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzXiNa1pHunyXL0u54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwzYv2lcAIu1lnFvLB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
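The coding result table above is derived from the raw JSON array the model returns. A minimal sketch of how such a response might be parsed and indexed by comment ID, assuming the four dimensions from the table; the allowed value sets below are only those observed in this sample, and the function name is hypothetical:

```python
import json

# Code values observed in the sample response above; the full codebook
# may define additional categories (these sets are illustrative only).
ALLOWED = {
    "responsibility": {"company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment ID.

    Raises ValueError if a row is missing a dimension or uses an
    unknown code, so malformed model output is caught early instead
    of silently entering the dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response in the same shape as the array above.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["policy"])  # look up a coded dimension by comment ID
```

Indexing by ID also supports the "Look up by comment ID" workflow: once parsed, any coded comment's dimensions can be retrieved directly from the dictionary.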