Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
All of the AI s have been programmed to be woke and racist. Garbage in, garbage …
ytc_UgwUls4gb…
Anyway.. What exactly is the point of this shieze? I seriously don't see the p…
ytc_UgyjMVL4g…
me in 20 years time in court watching an ai video of me commiting a crime i didn…
ytc_Ugz9aAbhp…
Europe is already working on a plan for a balance between human jobs…
ytr_Ugx4YviYO…
4:50 this overlay doesn't work. I put it in a drawning I did and asked ChatGPT t…
ytc_Ugz2NUDPA…
Someone commented that it's similar to a learning artist using others art for in…
ytr_UgxvgJE1F…
“Sowwy fow bweaking youw copywite…. But we can’t twain ouw ai wifout ur copywite…
ytc_UgxGeihyp…
Capitalist society: oh man ai took my job welp gusse i am gona fall into poverty…
ytc_UgyZY50KA…
Comment
Why are you using non related footage?
There's footage in this video that is not FSD related and most footage is more than 2 years old, not reflecting the reality of today's Testa's FSD.
The intention with this video is clearly keep people's attention away from the outstanding evolution on the latest v14.
Autopilot is not selfdriving SFD, this video is filled with bad intentions.
Drivers are negligent and do mistakes, with or without Autopilot or Supervised FSD, even on non Tesla vehicles.
youtube
AI Harm Incident
2025-10-21T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyzdkvLPoiaBmtkoHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSTkXLPR0Wuhseaj94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKfgci56c0ArXTdiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVNOog-IW9ub0tEdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy2U2hYAG9OzjMCfVh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz444ENnpZ60wAqBtd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFa4aVrIl8zOcGb9V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyll7pk3Tw6oA3EpUJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvSKiydW_CGIyHNe14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyy6Vx1cZWe0xIwkfR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
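The raw response above is a JSON array of per-comment coding records, each keyed by a comment ID. A minimal sketch of how the "look up by comment ID" view might resolve a record, assuming the response is valid JSON of this shape (the function name `index_by_comment_id` is hypothetical; the two sample records are taken verbatim from the response above):

```python
import json

# Abbreviated raw LLM response: a JSON array of coding records.
raw_response = """
[
  {"id": "ytc_UgyzdkvLPoiaBmtkoHV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwSTkXLPR0Wuhseaj94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgwSTkXLPR0Wuhseaj94AaABAg"]["emotion"])  # outrage
```

Indexing by ID turns the per-batch array into O(1) lookups, which matches the inspector's "look up by comment ID" workflow.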