Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I don't hate AI stuff and seeing AI get this good and all that has always and wi…
ytc_UgxWPC3Ev…
This guy is a POS, and it's not clear what portion of his sentence is attributab…
rdc_lu5vzkj
I'd argue it'd open a massive field about AI. Because y'all need to remember, on…
ytc_UgybdGZYm…
Not truly open source, open weight. The AI sector has been abusing the term "ope…
rdc_oi1a3qf
One of the goals of artificial intelligence was to become Digital God. Seems to…
ytc_UgyrgjkVA…
You're not the only one because its literally a thousandth video poking of low h…
ytr_UgzE42n3y…
The problem with the question about the crossroads being 10 years or 20 years ou…
ytc_UgxjFhnqA…
Just asked ChatGPT about it. Yeah it would definitely prefer to stay out of cons…
ytc_UgyvJW9Ag…
Comment
FSD supervised. SUPERVISED. It explicitly warns the drivers to pay attention at all times. In the vast majority of cases, the drivers doze off or just plain ignore the warnings.
"Other driver assist cars use lidar" blah blah. Yeah well waymo uses a whole bunch of sensors and they still suck. Little Missy has no idea what she's talking about.
I disagree that it's false marketing. Tesla has always maintained that it's not perfect yet and the user must pay attention at all times. If it repeatedly warns you to pay attention and you still don't, it's your fault.
Imagine a big red button with the label "self destruct". If you press it and you die, is it your fault or is it the fault of the big red button manufacturer?
youtube
AI Harm Incident
2024-12-22T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
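Each coded comment carries four dimensions. A minimal validation sketch, assuming the value vocabularies are exactly those visible in this dump (the full codebook may define additional values, so the sets below are inferred rather than authoritative):

```python
# Dimension vocabularies inferred from the values visible in this dump;
# the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "approval", "indifference", "resignation"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if clean)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems
```

Running this over a batch before storing it catches the common failure mode of an LLM coder drifting outside the codebook (e.g. inventing a new emotion label).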
Raw LLM Response
[
{"id":"ytc_UgyFO-9eC5zlG_hNGah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy6yOKkwiyREQn9Tcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweJGnMI6_EBDZwdvp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzr6JIuyym3nDLWXFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwmE1kGedyjXd6UD0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-2IfQ7P4otztnO454AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzoyb7wAsOqXml2Ffx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZCHcUzzyd5UhZgIB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7-6-aXZv1BjShoNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
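The raw response above is a JSON array of per-comment records, one object per coded comment. A sketch of parsing such a response and indexing it by comment ID, which is what the "look up by comment ID" view needs (the `raw` string below is a shortened one-record stand-in, not a full response):

```python
import json

def index_by_id(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index records by comment ID.

    Assumes the model returned a JSON array of objects, each carrying an
    "id" field, as in the dump above. A non-array payload raises ValueError;
    malformed JSON raises json.JSONDecodeError.
    """
    records = json.loads(raw_response)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    return {rec["id"]: rec for rec in records}

# Shortened stand-in for a full raw response.
raw = (
    '[{"id":"ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"industry_self","emotion":"approval"}]'
)
coded = index_by_id(raw)
print(coded["ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg"]["emotion"])  # → approval
```

Indexing by ID also makes duplicate or dropped comments easy to spot: comparing the key set against the IDs that were sent in the batch reveals any mismatch.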