Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- I think that whoever that makes or discovered A.I made a big mistake of so many … (ytc_UgxtaVADT…)
- Why don't they have self-driving trains I mean they're on rails how hard could t… (ytc_UgzdZgcXg…)
- What is more concerning is how does a police officer (or just regular civilian)… (ytc_UgystPBa1…)
- "Hi, we are sorry to say that you got the wrong answer but in any case, the cont… (ytr_UgyHiNOVi…)
- Automation is progress. Throughout history, every major leap, from the steam eng… (ytc_UgzkE36-2…)
- well I argued with ChatGPT that Islam's false and I won I can give you the recei… (ytc_UgwCeN55C…)
- Would rather be manipulated by somthing intelligent then the idiot controlling o… (ytc_Ugw8HU8Kw…)
- getting a taxi in San Francisco is ridiculous... Trying to talk to them is also … (ytr_UgzEZKiaN…)
Comment
Health providers in my experience already take in a five-minute soundbite of summary information about an ongoing medical condition, and then generically prescribe a "best practice" protocol or treatment strategy that sounds like it came from a pharma journal or maybe a statistical averaging from some WHO database of past malpractice lawsuits.
It's such an automated and depersonalized process already, that just directly talking to a machine in the first place will surely not be so much different. More expensive maybe, until the AI's patent expires and a generic version comes available; but subjectively the same experience.
A significant change however, is that up to now when a human doctor's prescription is particularly out to lunch or illogical, it has been possible to read through their notes and pick up clues as to where the patient omitted an important detail or where the doctor misheard what the patient said. If an AI is a black box, such confusion will be more difficult to debug.
youtube · AI Harm Incident · 2023-08-11T07:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx0NAbI545NLrFkiv14AaABAg.9rTvThxoc5t9tGQxG4phDQ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyKi6IcKSItp6Tp46d4AaABAg.A5bF5RnlcivAGQ2Hs9-0e0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyP1HPZX5egkkqoxDN4AaABAg.9rMpp8VbCXn9u7QvTzacJA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzKcbjhYckbErz6xvl4AaABAg.9rLmO6Yqoia9rMGW3Wa50B","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzKcbjhYckbErz6xvl4AaABAg.9rLmO6Yqoia9rNOgmm6dE_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxI0yxVbcd72dP8pm94AaABAg.9rLlqnBIqrD9rM2PItWmfd","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxI0yxVbcd72dP8pm94AaABAg.9rLlqnBIqrD9z_zGd3xu7m","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxshElzZbgNnkuS6ON4AaABAg.9rLk27edZrX9rOC6drSPgr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzlGEERJ3xLUXfblEt4AaABAg.9rLgsNuh1fY9rM1zEoT4em","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwe8tTPL-FMzJYVen54AaABAg.9nu_gwj6RR09nvc7VkIBuf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
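The raw response is a JSON array, one object per comment, keyed by `id` with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the lookup-by-ID step in Python, assuming only that structure (the `lookup` helper and the sample IDs below are hypothetical, not part of the tool):

```python
import json

# Hypothetical two-item response in the same shape as the raw LLM output above.
raw = '''[
  {"id": "ytr_example_a", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_example_b", "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''

def lookup(raw_json: str, comment_id: str):
    """Return the coding dict for one comment ID, or None if it was not coded."""
    return next((c for c in json.loads(raw_json) if c["id"] == comment_id), None)

codes = lookup(raw, "ytr_example_b")
print(codes["emotion"])  # fear
```

Rendering the result as a table (like "Coding Result" above) is then just a matter of printing each dimension of the returned dict.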