Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "personally i’m a huge fan of ai art and think artists whining about it are just …" (ytc_UgxGkNB1V…)
- "One way or another it's going to be capitalism vs the people. The non-working cl…" (ytc_UgwmOMfCq…)
- "Seed 40 last : Mirrorborn Labs was founded on a simple belief: Our future with A…" (ytc_UgzfHJq5V…)
- "If the locals found a market they also would have killed them. All races are as …" (rdc_dv62y2r)
- "Great observation! Sophia's response was more about the meaning of her name than…" (ytr_UgxBtk5RA…)
- "You do realize that, publishing this video online makes it infinitely easy for t…" (ytc_UgwyA4uUs…)
- "@AK-gh7mc I was afraid of it when Israeli said look at this picture of Hamas but…" (ytr_Ugzrv92r3…)
- "The illustration of guns is very, very apt. To take it a step further -- even Se…" (ytc_UgzBq0WM0…)
Comment
I ride a scooter but as you can see from many of the comments, this doesn't have to do with Tesla, this is the driver's responsibility, AI or not. There are fires with gas cars and there are many accidents with gas cars but it seems when these occur in a Tesla, it's somehow different, as if the car should be perfectly correcting any human error, I don't get it.
Moreover, when using the Autopilot in a Tesla, the car reminds the driver to put their hands on the wheel, and if ignored over 4 times, the person can't use the Auto Pilot anymore. When someone is involved in an accident they usually try to see how they are not responsible, in this case, they blame the car's AutoPilot system even if it wasn't actually active, they just hope no one will find out.
youtube · AI Harm Incident · 2022-09-03T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
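
The four dimensions above are coded from a fixed codebook. A minimal validation sketch, assuming the value sets are exactly those observed in the records on this page (the real codebook may define additional categories, and `validate` is a hypothetical helper, not part of the pipeline):

```python
# Allowed values per dimension, inferred from the records shown on this page;
# the actual codebook may include categories not seen here (assumption).
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear", "liability", "industry_self", "regulate"},
    "emotion": {"approval", "indifference", "resignation", "outrage", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

record = {"id": "ytc_UgzvDnbzXfa5FBoo7Zl4AaABAg",
          "responsibility": "user", "reasoning": "deontological",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # → []
```

A record with a missing or unknown value (e.g. `"responsibility": "robot"`) would show up in the returned list, which makes malformed LLM output easy to flag before it reaches the table above.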
Raw LLM Response
```json
[
  {"id":"ytc_UgxX-Ayn7yUJznfc0t54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-PlPlFxZMlrKl7L14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzvDnbzXfa5FBoo7Zl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwgOExhv5yZa_XdtpB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyoednUGu41_yo24md4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwANF7DrFOhKxikA-R4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz8e0l4Nu2l9bYutH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxG7tG4EnJHJkTtht4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzErz4Be_uJzuUAsAh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzncrNhH5J6ZBBj3N94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```