Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You have to use night shade not just screen shot something that used it" / "Oh…" (ytr_UgwaRB6RR…)
- "Where would your argument be, though, if we started making models fully trained …" (ytc_UgxCjr21R…)
- "Terrible for the family, but I don’t understand how someone could think using a …" (rdc_nnjnfk1)
- "2030? No worries mates, The other AI experts say that AGI will wipe us all out b…" (ytc_UgykAVuFL…)
- "It’s more of the fact that they got an ai to do it. Robots can’t just look at so…" (rdc_i6s6o1y)
- "I wonder what A.I. itself would deduce from this? Would it say the data centers …" (ytc_Ugzlotdil…)
- "“ai can do things you never thought possible!” it can literally only do things f…" (ytc_Ugw4rI2FK…)
- "> In the NYT today, Cade Metz implies that I left Google so that I could crit…" (rdc_jife4cs)
Comment
I have been commuting in my Tesla for 6+ years now, with Autopilot enabled probably 80-90% of the time (I have FSD, but leave it disabled most of the time, because it is too unpredictable). I work in tech, and am fully aware of these shortcomings. But the way our brains are wired, they are bound to make simplifying shortcuts: if AP works fine 99% of the time, there's no way my attention won't occasionally wander, even though I know from experience that the car will actively try to kill me that last 1% of the time. On top of that, Level 2 autonomy takes you mostly "out of the loop" even if you are paying attention - and it takes precious extra seconds to take charge again, even if you identify the threat before AP does. I bought an AI-free car this year, just to re-train my brain to be in charge again.
youtube · AI Harm Incident · 2024-12-21T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyFO-9eC5zlG_hNGah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy6yOKkwiyREQn9Tcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweJGnMI6_EBDZwdvp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzr6JIuyym3nDLWXFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwmE1kGedyjXd6UD0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-2IfQ7P4otztnO454AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzoyb7wAsOqXml2Ffx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZCHcUzzyd5UhZgIB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7-6-aXZv1BjShoNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
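The raw response above is a JSON array of per-comment codes along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and validated before it populates a Coding Result table; note the allowed value sets below are inferred only from this one sample response, not from the full codebook, so they are an assumption:

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (the full codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "approval", "indifference", "resignation"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}.

    Records with values outside the allowed sets are dropped, so a
    malformed or hallucinated code never reaches the results table.
    """
    coded = {}
    for record in json.loads(raw):
        codes = {dim: record.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[record["id"]] = codes
    return coded
```

Looking up a single comment ID then reduces to a dictionary access on the parsed result, e.g. `parse_raw_response(raw)["ytc_UgyFO-9eC5zlG_hNGah4AaABAg"]`.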