Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "That could just be DeviantArt for you lol Not that I don't agree of course (I a…" (ytr_Ugxqc0u3a…)
- "delamain ahh ai was like "don't worry I gotchu bro, ain't no standoff gonna slow…" (ytc_Ugy37m2IQ…)
- "if ai emotions arent 'real', as they are just simulated as opposed to humans, wh…" (ytc_UgwLIXAE6…)
- "Maybe the reason nobody wants to legislate deepfakes is because the people argui…" (ytc_UgzMy-tbV…)
- "Remember how they were able to identify the two brothers in the Boston Marathon?…" (ytr_UgzkoZmbV…)
- "I think that the creative commons should also add a license about the uses of AI…" (ytc_Ugy5bi0Xo…)
- "I mean, AI is the first artistic/"artistic" (however you wanna say that word) te…" (ytr_UgxGfRJ6u…)
- "We're basing what a.i. will want based on our own greed and fear. A.i. will prob…" (ytc_UgwwSi0ya…)
Comment
The issue I know of with diagnostic AI is that they are a "black box" program—it's given starting data to train on as well as a list of possible diagnoses, then finds an algorithm that best matches the known data to the possible diagnoses. It isn't initially known how each algorithm was built, however, and they aren't necessarily logical—for instance, a medical AI designed to spot lung cancer might do so more accurately than human doctors, but it might be making an assumption based on what machine & facility the lung scans were taken from, which is not how you actually diagnose lung cancer and would exclude many patients with lung cancer who got scans from elsewhere.
Mathematically these things work, but in reality they are not able to make a proper diagnosis.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Posted | 2024-09-04T12:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxRm8yl2w838IHNVX54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsTpZZk4pTD404UDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiHQ38loZQkDCmcp54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz39d2s3mdz5BBoUWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyE9j2bh1MTqD62Glx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy2EHjAxJgau4O8G2p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzhhwBG5Zvi_WraASV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0sKkHQBaT9aUtbP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy0QW4igRK970lTN9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2XmXuhSxBTRk_0iV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
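A raw batch response like the one above has to be parsed and checked before the codings are stored, since the model can emit values outside the codebook. Below is a minimal validation sketch in Python. The allowed value sets are inferred only from the values that appear in this page (the real codebook likely contains more options), and `validate_codings` is a hypothetical helper name, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the table and raw
# response on this page (assumption: the full codebook may define more).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[str]:
    """Parse a raw LLM batch response and list any out-of-codebook values."""
    errors = []
    for item in json.loads(raw):
        cid = item.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = item.get(dim)
            if value not in allowed:
                errors.append(f"{cid}: {dim}={value!r} not in codebook")
    return errors

# Example: a record that matches the codebook produces no errors.
sample = '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}]'
print(validate_codings(sample))  # []
```

Records that fail validation can then be re-queued for the model or flagged for manual coding rather than silently written to the results table.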