Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
People might think from looking at this footage that there was plenty of time for the driver to react, but owning a Tesla I can tell you that between the expectation that self-driving should know what to do, lack of signals that it doesn’t know what to do (asking to put your hand on the wheel is not a warning and happens constantly), and having about 0.5 seconds to understand that it’s doing something unexpected at 60+ MPH it’s not easy for a human to react correctly under those conditions. That’s why I’ve almost completely stopped using any autopilot/self-driving on my Tesla. If these companies are creating what they claim to be “full self-driving” systems, they should take liability for hurting and killing people with those systems.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Timestamp | 2023-08-16T12:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzf8LeOqOXCLzU7b7d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwBraNxQ_hzudtwdAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwrz0RGL2BWgehXlfF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzI14_obJnnGNpQayd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwQjYDnES4dzzrb2Hl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyB9ZWHDJ9dCOR1Qw94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzuuRwUwAgde9Wp7GV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyke8Ls50zBrtmfq3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyB9zM4KNdxKAulNVl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzhHNHNaASuPr9h0L94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
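The raw response is a JSON array with one code per comment across the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed and validated before loading it into a dataset (the `parse_coded_batch` helper and the allowed-value sets are illustrative assumptions, inferred from the values visible above; the real codebook may permit more labels):

```python
import json

# Allowed labels per dimension -- inferred from the codes visible in this
# response; an assumption, not the project's official codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows whose values
    are all within the allowed label sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: the row coded for the Tesla comment above.
raw = ('[{"id":"ytc_UgyB9ZWHDJ9dCOR1Qw94AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
print(parse_coded_batch(raw)[0]["emotion"])  # fear
```

Dropping (rather than coercing) rows with out-of-codebook values makes LLM label drift visible in the row counts instead of silently polluting the coded dataset.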