Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any comment by its ID, or browse the random samples below.
| Comment ID | Preview |
|---|---|
| ytc_UgzinrD6h… | Nice video, love the theo-crafting :)! I think the big step we have to do first… |
| ytc_UgwNcoyrR… | SOMETHING AMAZING JUST HAPPENED: IBIS PAINT X ADDED A FETURE WHERE YOU CAN ADD A… |
| rdc_gd83aan | An excerpt from my purely fictional story... hope you like it, humans: At a cert… |
| ytr_Ugx07iij1… | I had a Facebook notification yesterday, talking about the new meta AI that doe… |
| ytc_UgzHFAXAN… | As a nurse, I still can't fathom how healthcare would be automated, like yeah, a… |
| ytc_UgxLSO2xb… | You're giving it way too much credit. This biggest danger of AI is what's going … |
| ytc_UgxXnHbzk… | It is an inevitability that Ai and automation will eliminate the majority of job… |
| ytc_UgzZXPC6f… | All these “real” artists complaining about their jobs and lack of earning potent… |
Comment
It's obvious that Teslas are crashing more than other cars because "Full Self Driving" isn't actually full self driving.
It does all the easy stuff that anyone can do, steer, brake etc. mostly OK but not always, but can't handle the unexpected. Tesla drivers are trusting it too much, hesitating to take over or just not paying attention like they're supposed to, and the one or two second delay while they hesitate expecting the car to act is all the difference it takes for a crash rather than an avoided accident.
Full Self Driving should have been called Supervised Self Driving, but that's not going to sell cars nearly as well, so Mush isn't going to do that climb down.
Personally I'm not sure any automated computerized system can replace an alert and capable human driver, but then not all drivers are alert and capable to start with.
Source: youtube · AI Harm Incident · 2024-12-14T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxpRsblCJgUF-zZB694AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzbn6eCZadbNc7NPLR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy8lpRGwVULViXHTER4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxSFxHRfPjHtXQxqtR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfR0qh66I8a_9Mol14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwT7Chwkbscy5MrlbR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzbE8T3A45jLeDFDSJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGFJHhXZm-rzi9g1h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHTQMTF8fFpAzE1yd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNnt2pzbeNkrpnst14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
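Each raw response is a JSON array of per-comment records, one object per coded comment with the four coding dimensions alongside the comment ID. A minimal sketch of how such a response could be parsed and validated before loading into the tool (the allowed label sets below are inferred from the sample output above and are assumptions, not the project's actual codebook):

```python
import json

# Allowed values per dimension, inferred from the sample response shown
# above. ASSUMPTION: the real codebook may define additional labels.
SCHEMA = {
    "responsibility": {"user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected value for {dim}: {rec.get(dim)!r}"
                )
    return records

# Example with one record from the sample response:
raw = (
    '[{"id":"ytc_UgwfR0qh66I8a_9Mol14AaABAg",'
    '"responsibility":"user","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)
coded = parse_coding_response(raw)
print(coded[0]["policy"])  # regulate
```

Validating against a fixed label set catches the common failure mode of LLM coders drifting outside the codebook (e.g. inventing a new emotion label), so bad records fail loudly instead of silently skewing the counts.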