Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Thank you for putting into words what I have been thinking about this jargon tha…" (ytc_Ugzjygxps…)
- "The idea that AI needs to be used because people are \"creatively disabled\" is in…" (ytc_Ugwy1PpQc…)
- "15% chance rapid AI development will RUIN HUMAN LIFE. Is it worth it? I hold eve…" (ytc_UgxTZu4HX…)
- "HERE'S THE REAL KICKER: Tech companies already have the technology, capacity, a…" (ytc_UgxU-gyVV…)
- "To make matters more interesting: Arguing that AI \"learns just like a human\" onl…" (ytr_UgydAQwrK…)
- "We need a law that all AI is marked so public can choose to support people inste…" (ytc_UgyvOldAB…)
- "Elon is butt hurt that he isn't in the ai ponze bubble and tiny hat men gonna ta…" (ytc_Ugx_julfw…)
- "Giving feedback about a broad social pattern you observe isn’t the same as a req…" (rdc_o8rkrkx)
Comment
Even with a specialty like Radiology, there is still going to need to be a physician to sign-off on the final report; and thus, a physician will still need to do a read (as nobody is going to sign blindly and risk their license), and with all of the constantly changing images (every minute an image is being ordered), you will never be able to train a perfect AI, as humans are all unique. Even when (if) it gets to the point where it's very good, the worst case scenario is that it becomes super efficient where you can reduce the number of Radiologists, but it will never replace us because someone is still going to be required to do a final read and verify the findings and generate a final report. And it will be a long time before we get there.
Source: youtube · AI Harm Incident · 2026-04-12T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy3tLLS9_AvtEyf1QZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxzGC4u7iO0FSUWXZJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwcdbzVr30BkCOJtUF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugymw5JTBoqX0tmuq-14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzlH1z-eAqUJQB__S54AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxmE6otkrVLmM5IJLh4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwgMcLPn394DZKLhOd4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwj_-lZ7FVKS1JjDBh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwmQYFW9j4fAFq65kd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzaYaRjpQ4Zj4MwEjB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
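The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and looked up by ID (assuming every batch follows this shape; the two sample entries below are copied from the array above, and `index_by_id` is an illustrative helper, not part of the tool):

```python
import json

# Two entries copied from the raw LLM response above, for illustration.
RAW_RESPONSE = """[
  {"id": "ytc_UgwcdbzVr30BkCOJtUF4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy3tLLS9_AvtEyf1QZ4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw coding response and index its entries by comment ID."""
    entries = json.loads(raw)
    return {entry["id"]: entry for entry in entries}

codes = index_by_id(RAW_RESPONSE)
# The entry for the inspected comment matches the Coding Result table.
print(codes["ytc_UgwcdbzVr30BkCOJtUF4AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what lets a "look up by comment ID" view pull one comment's codes out of a batched response without re-scanning the array.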