Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Navid talks about the stability of artificial intelligence and the potential to improve care for patients. While I agree that AI can be a game changer, it could improve diagnosing and care in a lot of ways. AI will be consistent, it won’t miss things that a human will because AI doesn’t have a bad day, they aren’t affected by a patient load, they aren’t worried about 40 patients at the same time. I don’t believe that AI will drastically improve healthcare, however, I believe that it could be damaging.

When discussing AI there is always one thing that is left out, the human touch. Doctors care about their patients, they dedicate their lives to learning exactly how to help them, and if they don’t, they’ve learned how to learn so that they can help them. While AI does learn and grow, they don’t have a personal connection, desire for their well-being, and an emotional connection with anyone. This is what drives physicians; nobody goes into medicine for the money or for the job itself. Yes, the money can be good, but $400k of debt to pay off to become a doctor eats up so much of it. Most doctors don’t have a typical nine to five job, they don’t go home until all the patients have been seen, the charting has been done, and the staff has gone home. If an emergency comes up, they don’t get to go home until it’s taken care of.

So, why do doctors go into medicine? To help people. Every doctor is there because they genuinely care about the person they are seeing; this isn’t something that AI can ever do. Care and passion can go a long way as well, when you are passionate about something, there’s nothing you won’t do to achieve what you are after, you won’t stop working towards it until you’ve accomplished what you set out to do. If a doctor can’t figure out what’s going on, he’s going to dedicate all the time he has to figure out what to do or what is going on. That’s why AI can never replace a healthcare worker.
AI doesn’t know that a patient has a wife and kids, or grandkids they care for, or foster kids they have taken in, but a doctor does. Doctors live by the principle of beneficence, to do good, and that’s something AI doesn’t understand. Now, a doctor could utilize AI to help them come to a conclusion or find the answer to a question, there are ways to take advantage of technology while still taking advantage of human care.
youtube
AI Harm Incident
2023-04-15T02:2…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzLZDICQoncahhls0F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_1JJeK8TzMzkjy6t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEGDKxxu1yWrFQVnd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyLP9muwFMbN2nQu2t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXhvWiHUq0OTWc-0N4AaABAg","responsibility":"clinicians","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzSPzDcK6PFdJ3Oojl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy_HBzp_P0JVbolLNV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMhwgSlKwTbgBuVrZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQJf9HJVirqehJ_IF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyYEZm5B8_kno6PlCB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
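The batch response above is a JSON array of per-comment records, while the Coding Result table shows every dimension as "unclear". A plausible explanation is that the displayed comment's ID was not present in the model's batch output, so the viewer fell back to defaults. Below is a minimal sketch of how such a lookup might work; the function names `index_by_id` and `lookup`, the default values, and the sample ID `ytc_abc` are illustrative assumptions, not this tool's actual implementation.

```python
import json

# Fallback used when a comment ID is missing from the batch response
# (assumption: mirrors the "unclear" values in the Coding Result table).
UNCLEAR = {"responsibility": "unclear", "reasoning": "unclear",
           "policy": "unclear", "emotion": "unclear"}

def index_by_id(raw_response: str) -> dict:
    """Parse a batch coding response (a JSON array of records) and
    index the records by their comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

def lookup(index: dict, comment_id: str) -> dict:
    """Return the coded dimensions for one comment, falling back to
    'unclear' defaults when the batch response omitted the ID."""
    rec = index.get(comment_id)
    return {**UNCLEAR, **rec} if rec else dict(UNCLEAR)

# Hypothetical single-record response for demonstration.
raw = ('[{"id":"ytc_abc","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
idx = index_by_id(raw)
print(lookup(idx, "ytc_abc")["emotion"])     # approval
print(lookup(idx, "ytc_missing")["policy"])  # unclear
```

Note that a malformed response (for example, a stray trailing character after the array) would raise `json.JSONDecodeError` in `index_by_id`, which is another way a viewer could end up showing all-"unclear" dimensions.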