Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The only reason you buy a chair is because you don't know how to build one, AI k…
ytc_UgxvztZyt…
Those keywords are really close to a lot of the things that it's been bringing u…
rdc_mulfwm6
@MTSP-20245 lmao, no kidding!? Plus, if the first generation of AI went rogue on…
ytr_UgzPoAx2M…
If you want to have an LLM talk about whether it's conscious, I suggest you use …
ytc_UgyE_cLrV…
@nicklang7670 im not saying we should rely on it. Its a tool, only as capable as…
ytr_Ugz-x95Nd…
Stop bc the first time I used it I used it for 12hrs straight erm…
Don’t get me…
ytc_UgxeS20KM…
Nuclear is obviously cleaner than things like fossil fuels but I don't really se…
rdc_eudkjqq
People who ask if AI can be conscious are, in my opinion, unqualified to talk ab…
ytc_UgyL86HTO…
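The sample IDs above appear to follow a prefix scheme (`ytc_` for a YouTube comment, `ytr_` for a YouTube reply, `rdc_` for a Reddit comment). That mapping is an inference from the samples, not a documented schema; a minimal lookup helper under that assumption:

```python
# Hypothetical helper for the "look up by comment ID" view.
# The prefix-to-source mapping below is inferred from the sample IDs
# shown above and is an assumption, not a documented schema.
SOURCE_PREFIXES = {
    "ytc": "youtube_comment",
    "ytr": "youtube_reply",
    "rdc": "reddit_comment",
}

def parse_comment_id(comment_id: str) -> tuple[str, str]:
    """Split an ID like 'ytc_Ugx...' into (source, opaque key)."""
    prefix, _, key = comment_id.partition("_")
    return SOURCE_PREFIXES.get(prefix, "unknown"), key

print(parse_comment_id("rdc_mulfwm6"))  # ('reddit_comment', 'mulfwm6')
```

The key after the underscore is treated as opaque; nothing in the samples suggests it is decodable.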
Comment
WSJ always digs up the same content on this topic, because people love to talk about Tesla.
Even now I am still confused whether WSJ means Autopilot or FSD.
And was Autopilot enabled in the accident? Or FSD? Or did the automatic emergency braking fail?
Was the driver paying attention to the road (this can be seen from the hacked interior camera)? Would a human have been able to stop and react better than the system, and if so, why didn't he?
And how many accidents per year for Teslas? How many more than for non-Tesla vehicles?
How many are suspected to be caused by failed AEB, how many with Autopilot enabled, and how many during FSD? In how many of them did the human react but Tesla did not give back control? How many human overrides caused the issue?
WSJ has run related content for a few years but never reports these hard facts and statistics. Why?
youtube
AI Harm Incident
2024-12-14T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwO1DwBUKrXUTVCJLB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyIxKaWNvuje9mLEnR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw38bHPL9gHZuyZFRh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugzlcdppz8gsJEhC4T94AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwgNe4trca77ldco1p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxtfYY_ZuT0_3OAidd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0TXVAD76SmnkTqxp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpC-0bLmiosdzj76p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx-6dulXef4tjTkbxd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyaBb03d1K8bS2xOjh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
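The raw response is a JSON array of per-comment codes over the four dimensions shown in the Coding Result table. A minimal sketch of consuming such a batch, tallying how often each code value appears per dimension (the function and variable names here are illustrative, not part of the tool):

```python
import json
from collections import Counter

# Dimension names taken from the coding table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally_codes(raw_json: str) -> dict[str, Counter]:
    """Parse a raw LLM response array and count code values per dimension."""
    records = json.loads(raw_json)
    return {dim: Counter(rec[dim] for rec in records) for dim in DIMENSIONS}

sample = (
    '[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
    '"policy":"none","emotion":"outrage"}]'
)
print(tally_codes(sample)["responsibility"])  # Counter({'user': 1})
```

A tally like this is one plausible way to turn a batch of coded comments into the kind of per-dimension summary the table displays.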