Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The robot: I’m going to get an reward for this :)- other robot:oh shit I Messed …" (ytc_Ugw4oy5Pa…)
- "I noticed that the public library near me is having a "make your own AI-painting…" (ytc_UgwhejT3v…)
- "Meta and Xai will also fail to attract users because of lack of trust of the own…" (rdc_mz0crfe)
- "SUCHIR BALAJI. Justice for suchir. OpenAI is accountable, Sam Altman is accounta…" (ytc_UgyWc4SHj…)
- "I think people like you are as scary as the negative potential for AI. We're a …" (ytr_UgxjgMmp7…)
- "We should check to see how well AI closes all your stab wounds after exploring y…" (ytc_UgxMKRCGg…)
- "Plus, AI art looks so generic with the same artstyle that's so easy to spot. It …" (ytc_Ugxoi0_c4…)
- "This is incredible… if all the schools in America operated like this I wouldn’t …" (ytc_UgwVdduXJ…)
Comment

> People just don't get it that just because you call it "artificial intelligence", it won't be intelligent in the way it matters. It's just a pretty decent driving automaton. It's doing one thing, and one thing only, like a screwdriver is only very good at screwing in screws, and pretty bad at everything else. Driving doesn't work that way. It requires a true general purpose AI, as you have to deal with things and situations on the road that you never before encountered. A general purpose AI, is at least 50 years away if not more, and at first, it won't fit into a car. So don't expect anything more for a while than this joke of a self driving, and driving assist, that is killing more people than it saves.

Source: youtube · AI Harm Incident · 2022-09-04T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
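A coded row like the one above can be sanity-checked against the codebook before display. The allowed values below are an assumption inferred from the sample responses on this page, not an official codebook; the function name is illustrative.

```python
# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample LLM responses shown on this page, not from an official codebook.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "fear", "resignation", "outrage"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the row is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The "Coding Result" row shown above passes validation.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "resignation"}
print(validate(row))  # []
```

A row with an unknown or missing value produces one problem string per offending dimension, which makes malformed LLM output easy to surface in the UI.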
Raw LLM Response
```json
[
{"id":"ytc_UgxnQ9jGYwm3E4D1e3F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzdyq7FCgl-kjhMkAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyXRFnNj1lnkiDGiDt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyyJ7VJqi8_ljnN6dV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwORKHDZUkPsTXsPDZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz9iSIleL_wGmyCKeF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyQE_XoixBkBd1JC2R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0_wutpzvDw5lJB794AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyJd9itNriWyRrMVq54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiokvASqBZ-s40QBB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
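The "look up by comment ID" operation above can be sketched as parsing the raw response and indexing it by `id`. This is a minimal sketch assuming the raw LLM response is a JSON array of objects with the fields shown; the two rows hard-coded here are copied from the sample response.

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment
# (two rows copied from the sample response above).
raw_response = """[
{"id":"ytc_UgxnQ9jGYwm3E4D1e3F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzdyq7FCgl-kjhMkAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]"""

# Index codings by comment ID so any coded comment can be looked up in O(1).
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxnQ9jGYwm3E4D1e3F4AaABAg"]
print(coding["emotion"])  # indifference
```

In practice one would also guard against malformed JSON (a `json.JSONDecodeError`) and duplicate IDs, since raw model output is not guaranteed to be well-formed.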