Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Exactly in besides they don’t need to sell anything to you either Think about i… (ytc_UgzD-HI_9…)
- The same thing happened with missile technology and atomic weapons, the inventor… (ytc_Ugy4mNQah…)
- The taillight thing is interesting... I have a DR650, and it's taillight is eye-… (ytc_Ugy-fViY6…)
- LLMs do not have concious awareness, or emotions, they cannot therefore feel any… (ytc_UgwAVhXkf…)
- @peterclarke7240 Yes it will, but my point is it will take middle class office j… (ytr_UgxgDTVW9…)
- Even if it wasn't blatantly unconstitutional, the algorithm is terrible. It crea… (ytc_UgwV06Bnf…)
- I think I generally agree with your point, but I'd push you a bit to say it's cl… (ytr_UgxnO6auS…)
- Also in the future to avoid this it might not be a bad idea to use a screen reco… (rdc_kgp3ctb)
Comment
> "That's a human problem, not an AI problem"
>
> It's an AI problem when AI is marketed as a superbrilliant program that can replace asking a trained human medical professional. All the AI bros have been pushing this narrative. I see more and more ads telling people to ditch a real therapist and talk to "an AI clone" instead.
youtube · AI Harm Incident · 2026-01-02T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwiBDQqxOF1FadpFG54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8By4MI4NkIY5MmZ14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3OeihUSlGnYlOydZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugws0V8LuXIuA2PJAa54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxpB9xgEpy0ahHxuB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzxFNTm3YuVwjPZmkp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxd2TgaHqKEzxiWvxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxIu_VLWHFlq3hwR4h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwyE5oyvqRfwvtOEaZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzTZZlzHrRBaNLE2vR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
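Lookup by comment ID amounts to parsing this array into a dictionary keyed on `id`. A minimal sketch, assuming the four dimensions shown in the coding table and only the label values visible in this sample (the full codebook may define more labels), with malformed or unknown-label records dropped:

```python
import json

# Allowed label sets inferred from the sample response above -- an
# assumption, since the underlying codebook is not shown in full.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dimensions}.

    Records missing an id, missing a dimension, or using a label
    outside the allowed sets are silently skipped.
    """
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        dims = {k: record.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[cid] = dims
    return coded

# Hypothetical one-record response in the same shape as the sample.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"outrage"}]')
print(parse_response(raw)["ytc_example"]["policy"])  # regulate
```

Validating against a fixed label set at parse time is the main design choice here: it turns quiet LLM drift (a new or misspelled label) into a dropped record that can be re-queued, rather than a corrupt row in the coded dataset.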