Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@AustinKoleCarlisle people fear what they dont understand, most dont realize tha…" (ytr_UgwMIlkA5…)
- "If a company doesn’t need human workers because of AI the answer is simple. Jus…" (ytc_Ugwr1mm-t…)
- "I'm older. Tech executives think that what programmers do is write code. What …" (ytc_Ugx9bhvSL…)
- "Yeah you’d think if china is the type of country to regulate the media people co…" (ytc_UgytGuADo…)
- "AI isn’t the only cause of layoffs — but it’s absolutely changing headcount math…" (ytc_UgySWzCFO…)
- "i reckonize the fight and knockout, but it wasnt a robot that did it. nice try…" (ytc_UgxH1GFiL…)
- "What if we somehow lose control of the AI in the process? At that point we have …" (ytr_UgzJw0IzJ…)
- "Everyone will enter into contracts with the government in return for everything …" (ytc_UgweUJQqW…)
Comment
At least the AI is likely to actually consider what the patient is presenting with rather that presuming things before they've even walked in the door, and might even listen to their symptoms instead of thinking it knows better. But maybe we will train it to do that in time.
youtube
AI Harm Incident
2024-07-13T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzUVR79bGJtQR310S94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4qidbZLlgsWxqeah4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWQWNUWy_zG27eb-54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzL4fjUpekCYJyfuKl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxlwoQE_feXWGaACwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwILJKHW69GzYFZbTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyhDOwZ1QmyiXG5JgN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwxJoURbVjWWlJZ0nt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxKHbzWsSRAf9CWfZN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaKuec7tvGL4lWGFx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
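A response like the one above can be parsed, validated, and indexed for the ID lookup this page offers. The sketch below assumes the category sets visible in this single response (the actual codebook is not shown here and may define more values); the variable names and `SCHEMA` dict are illustrative, not part of the tool.

```python
import json

# A raw model output in the format shown above, truncated to two rows for brevity.
raw = """
[
 {"id":"ytc_UgzUVR79bGJtQR310S94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwxJoURbVjWWlJZ0nt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
"""

# Allowed values per dimension, inferred only from labels seen in this response;
# the real codebook may be larger.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(rows):
    """Keep only rows whose every dimension value is in the inferred schema."""
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

rows = validate(json.loads(raw))
by_id = {row["id"]: row for row in rows}  # lookup by comment ID, as in the UI
print(by_id["ytc_UgwxJoURbVjWWlJZ0nt4AaABAg"]["emotion"])  # approval
```

Rows with an out-of-schema value are dropped rather than repaired, which makes malformed model output visible as a count mismatch against the number of sampled comments.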