Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding by comment ID.
Random samples:

- `rdc_ohxzikx`: "In the US, businesses are allowed to deny service to anyone (so long as it’s not…"
- `ytc_Ugxpf85Pf…`: "The AI was correct on all counts. Just not in the way you wanted it to be correc…"
- `ytc_UgwQvNY4E…`: "10:10 if you REALLY want to make art, you will give up that time to practice, an…"
- `ytc_UgxEbC_dA…`: "Good topic to write some books and make some money in this horrible surveillance…"
- `ytc_Ugx4t6Gt3…`: "Thats the worse example you couldve make / I mean cooks can use microwaves / Its ju…"
- `ytc_UgyJnVmRj…`: "“Ai Arti—“ That motherfucker is the FURTHEST thing from an artist. Try getting s…"
- `ytc_Ugx7hBBB0…`: "You still have to type prompts so it doesn’t technically do it for you. So I agr…"
- `ytc_UgwVABXq3…`: "To clarify. You are bringing attention to an important issue in a terrible way. …"
Comment
This is a Tesla issue. Is tech “there” yet? As it was pointed out Elon is playing the long game trying to get the end prize and has refused to use sensors that would have recognized the danger.
Is AI ready to fully autonomously drive? No. No AI anywhere on the planet can do that yet. Not remotely close. Mercedes is the first and only to reach SAE level 3. And they’re still very far from SAE level 6.
But, the general mentality Elon is taking is what is putting people at risk. By avoiding sensors that are not purely optical he’s made the personal choice to put lives at risk. It’s clear that radar would know a solid object 60ft away is a solid object, and when it was 50ft away you’ve clearly gotten closer to that solid object and to react accordingly.
Source: youtube · Collection: AI Harm Incident · Posted: 2022-09-03T15:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
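Each coded record assigns one value per dimension. As a minimal sketch of how such a record could be validated, the snippet below checks a record against value sets *inferred from the samples visible on this page* (`company`/`user`/`none`, `consequentialist`/`deontological`/`virtue`, and so on); the real codebook may permit additional values, and the `ALLOWED` sets here are an assumption, not the project's definitive schema.

```python
# Assumed allowed values, inferred from the raw responses shown on this page.
# The actual codebook may differ.
ALLOWED = {
    "responsibility": {"company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval",
                "resignation", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record; empty means it passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record from the table above passes:
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # []
```

A check like this is useful as a guard between the raw model output and the database, since LLMs occasionally emit values outside the codebook.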
Raw LLM Response
```json
[
{"id":"ytr_UgzoB34e3dW3eIJCvdZ4AaABAg.9fWoOeaBTta9fWt3Bpvn4J","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwjuTsmsROZBolE0cR4AaABAg.9fWmaF13ab19fXH1XLAYLz","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzFE3SbNg0StvrbyTV4AaABAg.9fWk1PKZpe89fWwNb8NTMT","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy0eeAo8kOIX2fjTGF4AaABAg.9fWiYWs-cdA9fWmjUzGH4_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzA282R2A5A-aamIb54AaABAg.9fWi4JAAcHG9fWmwi0H4hm","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzHq3GUWnRqqnelf594AaABAg.9fWhXE60Wh39fWlvvKSxnp","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzPi2YSm-w27nYD49J4AaABAg.9fWhRct_3Y_9fWrwF109OJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy7S6ZT9imOvzlUgod4AaABAg.9fWhKbgaPYz9fWl6_LrLvK","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytr_UgzzcyQ5xrGmUCwi7lB4AaABAg.9fWgNxMAfZO9fWj0QQJ7Ks","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzzcyQ5xrGmUCwi7lB4AaABAg.9fWgNxMAfZO9fWpMANEAOW","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
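The raw response is a JSON array of per-comment objects keyed by `id`, which is what makes the lookup-by-comment-ID view possible. A minimal sketch of that lookup, using shortened hypothetical IDs (`ytr_abc`, `ytr_def`) in place of the long real ones:

```python
import json

# A miniature batch response in the same shape as the one above;
# the IDs here are hypothetical stand-ins for the real ytr_Ugz… identifiers.
raw = '''[
  {"id": "ytr_abc", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]'''

rows = json.loads(raw)

# Index the coded rows by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in rows}

print(by_id["ytr_abc"]["emotion"])  # fear
```

Because the model returns one object per input comment, any ID missing from `by_id` after a batch flags a comment the model silently dropped.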