Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This… Is gonna cause deaths. It's already scary enough driving around semis. Dr…" (ytc_UgxCAt6hT…)
- "AI artist? Okay, I am very much pro AI but what the hell is AI artist? It is as …" (ytc_UgyvwPswa…)
- "Nuclear power can be used for good or ill and the same is true for AI.…" (ytc_UgyxqUYge…)
- "If I am AI , why would i wana stay at your planet......i dont have need for wate…" (ytc_UgxXiZh7X…)
- "When the robot said something along the lines of: why do u want to be human? You…" (ytc_UgzXScf02…)
- "AI customer service is so dumb. Also expensive at the moment. Maybe it will get …" (ytc_UgwlRT9Se…)
- "Did I see two em dashes during that Marcus Aurelius quote? Was he the first to u…" (ytc_Ugx5mu3au…)
- "Out of curiosity, why not have AI happen that way these big companies can pay th…" (ytc_UgxlvzVj7…)
Comment
> I guess they're not really self driving if you have to control it from Time to time.
That's the whole point here. No one claims that they are self-driving.
Source: reddit
AI Harm Incident
Posted: 2017-06-20 (Unix timestamp 1498001620)
Score: -1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_dj5xsmw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_dj6e07d","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_dj5v8yb","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_dj6d497","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_dj6d12y","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
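The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of turning such a response into a lookup table, with basic validation of the dimension values. This is illustrative, not the tool's actual code: the allowed-value sets below are assumptions inferred from the values visible on this page, and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension (illustrative; inferred from the
# samples and table shown above, not an authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "company", "user"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding dict},
    rejecting any dimension value outside the allowed sets."""
    codings = {}
    for item in json.loads(raw):
        cid = item.pop("id")  # remaining keys are the coding dimensions
        for dim, value in item.items():
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        codings[cid] = item
    return codings

# Two of the objects from the response above, used as sample input.
raw = '''[
  {"id":"rdc_dj5xsmw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_dj5v8yb","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

codings = parse_codings(raw)
print(codings["rdc_dj5v8yb"]["emotion"])  # fear
```

Validating against an explicit value set is the design point here: it catches the common failure mode where the model invents an off-codebook label, so bad codings fail loudly at parse time rather than silently entering the results table.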