Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxOvkoVS…: "You wouldn't be this critical if I didn't use AI!" He's right about that. If an…
- ytr_Ugz8AVDWX…: And, guess what?
  Boys and girls, 2001, ...was a movie. It dealt in sci-fi. Peopl…
- rdc_jfaxic3: If this were a civ game, this would be the worst news in thousands of years.…
- ytc_UgzM8N03e…: I think the problem is not so much AI but out of touch elites and politicians ma…
- ytc_UgxYiKV13…: I'm firmly in the camp that believes art isn't some mystical mumbo-jumbo. I beli…
- ytc_UgyH-Vfiz…: i’m a landscaper and although i’m scared about AI , the human touch needed to we…
- ytc_UgwUHuTfY…: Instead of robot rights or animal rights, shouldn't we just make law like this.
  …
- ytc_UgxLNmViW…: The AI recognizes patterns and is correct, but because the patterns do not align…
Comment
My two reactions:
1) I use FSD like I use any other tool, extremely carefully and with a constant thought of "how could this go wrong next?"
2) Motorcycle deaths per year fluctuate between 3,000 and 5,000 since the 1970's, with 50,000 to 100,000 injuries per year. I think it is an important question to ask, without discounting the tragedy of these two deaths, why are so many focused on these two? Because the car was doing the driving?
What about the unseen hundreds of thousands or millions of accidents that didn't happen because of FSD?
With or without FSD (or even a future, better AI system), the owner/driver of the car MUST be responsible for the accident, just as in any other case. A car is a tool - the wielder must be held responsible and not the toolmaker.
youtube · AI Harm Incident · 2022-09-26T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVJfG-AVRs_IjtUR14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"concern"},
  {"id":"ytc_UgyFcFo17Iz77RZ7kxN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuHHOYbje74v-LEQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy44Uu_6CM87kC2D9B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOMp39mj7AX6QaKVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDwB-gzJ2lsZ4jLJJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkLA7IekFRrfXui2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxigWtmNHS395_MJaN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyQESbNb8A-IGnM0zx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyD4joqCj_i47H5kvV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
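The raw response is a JSON array of per-comment codings keyed by comment ID, which the viewer then indexes to display a single comment's result. A minimal sketch of how such a batch might be parsed and validated; the controlled vocabularies below are inferred from the values visible in this dump, not the project's actual codebook:

```python
import json

# Assumed vocabularies per coding dimension, inferred from values seen in
# this dump; the real codebook may define additional categories.
VOCAB = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"concern", "indifference", "fear", "approval",
                "resignation", "outrage", "unclear"},
}

def index_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid codings by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # skip malformed entries that lack a comment ID
        bad = [dim for dim, allowed in VOCAB.items() if rec.get(dim) not in allowed]
        if bad:
            raise ValueError(f"{cid}: out-of-vocabulary value(s) for {bad}")
        coded[cid] = {dim: rec[dim] for dim in VOCAB}
    return coded

# Hypothetical one-record batch for illustration
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(index_batch(raw)["ytc_x"]["emotion"])  # resignation
```

Validating against a fixed vocabulary catches the common failure mode of batch coding, where the model drifts into synonyms ("corporation" instead of "company") that would silently fragment the category counts downstream.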