Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- AI needs to be cheaper than the cheapest communist or terrorist countries. The … (`ytc_UgzVQdg8i…`)
- I'm not a fan of data centers but jump from 97 DB to 140 DB is actually huge. It… (`ytc_UgxMBLsBA…`)
- To be honest, most information that people you’re forced to interact with daily … (`rdc_nlzmuta`)
- Americans need to pay attention! We have a lot of AI taking place here already.… (`ytc_UgzGsfeD5…`)
- One thing to provide advanced decisions and another thing to provide the decisio… (`ytc_UgxO-MTeI…`)
- I was just reading some history on glass bottle making. It's interesting because… (`ytc_UgxNklEZU…`)
- They are targeting all the wrong jobs with AI. Should just be a tool to replace … (`ytc_UgxWZJB5J…`)
- Aw geez, I feel like there’s already a ton of AI-generated images hidden in my s… (`rdc_n3x6wt2`)
Comment
Another fantastic piece of Journalism. On a very important subject matter aswell. Perfectly executed, well argumented and easy to understand.
Also, why was the term "autopilot" not marketed before, because of false advertising. "Advanced driver assist" is what it comes down to, but barely. Humans make mistakes, a lot. Computers dont make mistakes, they follow programs. The lack of proper information for decision making is the cause of error for both humans and computers. AI is still only following block diagrams making mostly yes/no decisions. It's scary stuff how much it gets wrong, and does not have the issue of being tired or intoxicated or otherwise impaired. Just lack of comparative data.
youtube
AI Harm Incident
2022-09-04T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwihHtRtipqExayxmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeMiGEGA4wlbEQZFR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzyFEqzRE-YZA1JCGJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5uzt3tMk4_ujEk6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwnQSDmqVprrxQ_d214AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugze2053bj_OrYlRSgF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwaByERyBw9mKNpFQZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgznQHRyv0bFZ2H3DzF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIG76mvw5gt-huRoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzPGpyVPpLylD9gjN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
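A raw response like the one above has to be parsed and checked before the codes are stored, since the model can drift outside the codebook. Below is a minimal validation sketch in Python; the allowed values are inferred only from the codes visible on this page (the full codebook may define more categories), and the function name `parse_llm_response` is illustrative, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page; assumption: the real codebook may contain more categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and validate every coded record.

    Raises ValueError if a record lacks an id or a dimension, or uses a
    value outside the allowed set, so malformed model output is caught
    before it reaches storage.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={value!r}")
    return records

# Example with a single (hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_llm_response(raw)
print(coded[0]["policy"])  # → regulate
```

A record that is valid JSON but uses an unknown code (say, `"emotion":"joy"`) fails validation rather than being silently stored, which is the main point of checking against the codebook instead of trusting the model output.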