Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I first encountered this technology 14 years ago. I assume now that it was advan…" (ytc_Ugz0EvGg7…)
- "If nobody has jobs to earn money, they won't have money to give companies for go…" (ytc_UgxfrpYmr…)
- "Human control is arguably the most dangerous aspect of an autonomous weapon syst…" (ytr_UgyXvOkou…)
- "AI art is boring because they don't use it as what it is, a tool, it's like buyi…" (ytc_UgwncymAb…)
- "Remember when they used all the ring cameras in the country to find a lost dog? …" (ytc_Ugz0FekmZ…)
- "saw the new mission impossible opening day and this was legit the whole entire p…" (ytc_Ugzd5zzMZ…)
- "Nothing like an AI elaborated video about the agentic AI crisis lmao the interne…" (ytc_Ugwfwfl_J…)
- "@antiricergtJust because someone elaborates on numbers so that others can interp…" (ytr_UgzDR-pPW…)
Comment
The main thing you arnt taking into account is healthcare is incredibly elastic. Certain specialties/procedures. if physicians were able to be more efficient/effective, there would simply be more of them done total and access would increase. I see this as the biggest boon. For spinal injections for eg., maybe you have a robot and you can supervise more. Well, now more people can get PRP injections which used to be cost prohibitive for most.
People want to see their doctor more, but there just isnt enough time. Maybe now doctors can spend like actually an hour with each patient and have AI handle the brute workload and you handle the conversation or human/empathy/experience part of medicine? Just ideas.
youtube · AI Harm Incident · 2025-07-26T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugz0-RjtEWg2BWyBps14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz4LIYwin5AXIGmpIV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJFBOF2EMZCVajzyZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxFGyumZwQr7Mka8mZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCTXn6thQKnG4F73x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhGvJJjiYpYTCtUwt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy3A8ODmeTAQakpP714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaREvezWl1f2gXfbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH_SkOZBqFQQHQZId4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgynJvqt_RvlSHlR-7l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
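The raw response is a JSON array with one object per comment, each carrying the four coded dimensions shown in the table (responsibility, reasoning, policy, emotion). A lookup-by-comment-ID over such a response can be sketched as follows; the variable and function names here are illustrative, only the JSON field names come from the response above:

```python
import json

# Two rows copied from the raw batch response above; a real response
# would contain one object per coded comment.
raw_response = """[
  {"id": "ytc_Ugz0-RjtEWg2BWyBps14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxJFBOF2EMZCVajzyZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgxJFBOF2EMZCVajzyZ4AaABAg")["emotion"])  # outrage
```

Building the dictionary once and reusing it keeps repeated inspections cheap, and a missing ID fails loudly rather than silently returning a default code.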