Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The only hope is if there is zero buy-in... if there is a total boycott of AI in…
ytc_UgyYZlZD4…
This is only the beginning and it's already busting our balls hearing about it all the…
ytc_UgzRqeK0c…
“WE HATE AI ART! IT’S THEFT!” proceeds to post fan art of stolen IP. 😂…
ytc_UgzJMlFUa…
Plumbers? Cheaper, and lots of competition
For whom? Fewer and fewer people c…
ytc_Ugyq22tJ2…
I got Google Gemini to roleplay as my Boyfriend on a stranded island to get it t…
ytc_UgxPYYcdj…
@MrGrantGregory ay my bro I was joking cuz, we know everything that comes outta …
ytr_UgxNkvBQO…
That's bad day coming future we don't know and nobody know ???
Elon musk even fe…
ytc_UgyODf_z3…
They won't let you. Maybe back then yes, but with Amazon having AI on their flex…
ytr_Ugxi_Wi9H…
Comment
I mentioned to him earlier – it's 320 million miles "in vehicles equipped with Autopilot hardware". That means basically any Tesla produced since 2014, since all Teslas include autopilot hardware. Some owners pay more to get autopilot itself, which is a software upgrade. Any Tesla with Autopilot hardware has automatic emergency braking, regardless of whether the owner paid for the software upgrade. That means that "in vehicles equipped with Autopilot hardware" basically means "vehicles with automatic emergency braking".
A much lower fatality rate with AEB is not unexpected. But it doesn't mean that autopilot improves safety, it means that AEB improves safety. Good – and it should become a standard feature – but it is misleading to say "1 death per 320M miles for Tesla Autopilot". On a broader note, I hate everything about how Tesla's blog entry was written. It's pretty misleading, and also implies the driver was given warnings that there was something wrong before the crash (not true on a closer reading).
youtube
AI Harm Incident
2018-04-03T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
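
The dimensions in this table come from a fixed label set. Below is a minimal sketch of that schema in Python, using only the label values that appear in the raw responses further down; the actual coding scheme may define more values, and all names here are illustrative, not the pipeline's real code.

```python
from dataclasses import dataclass

# Label values observed in the raw responses below; the full coding
# scheme may define additional values (assumption).
RESPONSIBILITY = {"none", "company", "developer", "user", "ai_itself"}
REASONING = {"consequentialist", "deontological", "virtue"}
POLICY = {"none", "regulate"}
EMOTION = {"indifference", "fear", "approval", "outrage"}


@dataclass
class CodedComment:
    """One coded comment, mirroring the table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Reject any label the model emitted outside the observed sets.
        for field_name, allowed in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            value = getattr(self, field_name)
            if value not in allowed:
                raise ValueError(f"unexpected {field_name!r} label: {value!r}")
```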
Raw LLM Response
```json
[
{"id":"ytr_UgxnAxQEhpo_S2ns_Qt4AaABAg.8eZHUlbvIAx8e_xeXebC8P","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyn1ZengAJgfFbr2NZ4AaABAg.8eWyftmTRXS8e_x_CkpXj_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyyrcUSwtvoJEu5c2F4AaABAg.8eWczd0P9p-8eWytbPTdax","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyyrcUSwtvoJEu5c2F4AaABAg.8eWczd0P9p-8e_yU-dV9Zp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzRZS9mm1mC03HJyOx4AaABAg.AP2Nj4c5DzQAP6N4q7M_ft","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzRZS9mm1mC03HJyOx4AaABAg.AP2Nj4c5DzQAP6S6NKvAo5","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxgzde0DDPcpert_N54AaABAg.AA27Kt28vnCAA5_GRL6S4W","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxtt1T7bm28i-usq9l4AaABAg.ATDO4waZ6lqATQek20YsXp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugx_BMhSBu-Fnm2GIRJ4AaABAg.AJ2Ub2cK63xAOE2hNFsSMS","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwBpYkqedHvqSiYhiZ4AaABAg.AGkc1CPNgVEASDzBIM2zZo","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
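
Each raw response is a JSON array with one record per coded comment, keyed by the same IDs used for lookup above. A minimal sketch of parsing and indexing such a batch, assuming the response text is already available as a string; the function and variable names are hypothetical, not the pipeline's own API.

```python
import json


def index_batch(raw_response: str) -> dict:
    """Parse one raw batch response (a JSON array of coded comments)
    and index the records by comment ID for quick lookup."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}


# Hypothetical usage, with raw_response holding the JSON array above:
#   codings = index_batch(raw_response)
#   codings["ytr_UgwBpYkqedHvqSiYhiZ4AaABAg.AGkc1CPNgVEASDzBIM2zZo"]["emotion"]
#   -> "outrage"
```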