Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean, I feel like this is already going on in it's own way. A LOT of physicians complain about lack of autonomy in modern-day practice. They have certain flowsheet-like algorithms that hospitals require docs to treat based off of per pt complain/S&S. We already actively use AI (in bigger, busier hospitals) to do our chart reviews and flag patterns, like SIRS criteria for sepsis for example. But we need full-fledged robotics before we can literally replace doctors, let alone nurses, and I don't see anything close to that yet. I think what's more likely in our lifetime is employing fewer doctors who give 'oversight' and assume liability for 10x the number of patients while AI does all the primary data analysis. And everything that comes with that. I absolutely cannot see nurses being replaced in my lifetime, cuz robotics. Even once they make a robot advanced enough to do all the physical tasks of a nurse, early models will be very, very expensive. Way more expensive than a human. You'd also have to take into consideration patients tampering with/damaging the robots. Again, more expensive to replace than humans. And finally, I could even see the government stepping in to protect human jobs if robots & AI get that advanced. Because every job type will be impacted by that point, and robots don't pay taxes.
youtube AI Harm Incident 2025-11-14T09:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgweJ5UfZsd07cUWpQV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzckIEGHI3LPenEd7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzr_yadTZSL9OMtdpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzgtyaUjpNpQkUbisd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy0U1h4qcmkx3414lN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylY0pzxE43eyLIufl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx_hjwPkhKGfNYzVdl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyffWcTdg_zsJ7pIAB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxnMAdjuSeBqIoE7X94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypRT-16W-_CrMzDbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
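Inspecting the raw response for one coded comment amounts to parsing the JSON array and indexing it by comment id. A minimal sketch of that lookup, using two rows excerpted from the response above (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come directly from that output; everything else here is illustrative):

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw = '''[
 {"id":"ytc_UgxnMAdjuSeBqIoE7X94AaABAg","responsibility":"none",
  "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyffWcTdg_zsJ7pIAB4AaABAg","responsibility":"user",
  "reasoning":"virtue","policy":"none","emotion":"outrage"}
]'''

codings = json.loads(raw)

# Index the codings by comment id so a single comment's result
# can be looked up directly.
by_id = {row["id"]: row for row in codings}

print(by_id["ytc_UgxnMAdjuSeBqIoE7X94AaABAg"]["emotion"])  # -> indifference
```

The indexed form is what the "Coding Result" table above displays for a single comment: one row of the array, keyed by the comment's id.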