Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Tesla is comfortably the most solvent and profitable car company in the world.
They have approximately 30% margin on vehicles, which is unheard of in the automotive industry, and they have significant cash on hand, with extremely low debt.
The company is smart, and safety is an extremely high priority. Tesla cars have been recognized as the safest cars on the road by nearly every metric.
Sacrificing safety and increasing litigation exposure is not profitable. The idea that Tesla would omit safety features to increase profit is ridiculous.
It is possible Tesla made a bad decision in using only cameras for autonomous navigation; this technology is new and is going through growing pains, which is normal.
Also, I believe Tesla has recently made the decision to reintroduce lidar in their vehicles.
youtube
AI Harm Incident
2022-09-26T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVJfG-AVRs_IjtUR14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"concern"},
  {"id":"ytc_UgyFcFo17Iz77RZ7kxN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuHHOYbje74v-LEQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy44Uu_6CM87kC2D9B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOMp39mj7AX6QaKVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDwB-gzJ2lsZ4jLJJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkLA7IekFRrfXui2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxigWtmNHS395_MJaN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyQESbNb8A-IGnM0zx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyD4joqCj_i47H5kvV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
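For downstream analysis, a batch response like the one above can be parsed and sanity-checked before its codings are merged into the results table. A minimal sketch in Python; the allowed label sets here are only the values observed on this page, and the full codebook may define more (assumption):

```python
import json

# Label values observed in this page's sample output; the actual
# codebook may contain additional categories (assumption).
OBSERVED_VALUES = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"concern", "fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any value outside the observed label sets."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in OBSERVED_VALUES}
    return coded

# Hypothetical single-record payload for illustration.
raw = ('[{"id":"ytc_X","responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"concern"}]')
coded = parse_llm_response(raw)
print(coded["ytc_X"]["emotion"])  # concern
```

Raising on an unknown label (rather than silently keeping it) surfaces LLM drift early, before malformed codings contaminate the aggregate counts.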