Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click any entry to inspect:

- "This guy is now motivated by guilt. He thinks government should make companies b…" (ytc_UgwMSusML…)
- "Capitalism is slavery. Robots are just property and slaves. AI are just property…" (ytc_Ugzf1Swgg…)
- "I noticed this months ago. I had a Samsung phone sometime over five years ago th…" (ytc_Ugy-QKvfZ…)
- "@barbielife5154 Use AI as a learning tool, using it to do your work will only hur…" (ytr_UgxhWpHor…)
- "I guess all of the Silicon Valley tech-billionaire CEOs sitting in the front row…" (rdc_nuhui64)
- "Not true, Elon is clueless about the future....no, we won't have what we want in…" (ytc_Ugxb2BM8k…)
- "FAQ: Do you have more footage? Yes, I found about 2 more seconds while going thr…" (ytc_Ugx74bd-T…)
- ">What will those cars do if those lines are obscured by snow cover, or will s…" (rdc_d1kqphy)
Comment
You spent way too much time defending AI. LLM-based chat bots should never be relied on for health or nutrition advice. They have no ability to reason, they are statistical models only. People call their mistakes "hallucinations" but they have no concept of reality to hallucinate from. The danger is not something that can be programmed around. From an RN with a computer science degree.
youtube · AI Harm Incident · 2025-11-24T23:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw2JQzh7q1Roc7f4eZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjfYg4j3ydrt3A0h94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxByghFbZK3sjV2Wzt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwY7-7DyTErMzSKxiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzw7gGwHB3ZVazbdhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxvk0_1YaKeiHN2zfx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzLhzFyZzLBVUyEkAh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwrvzR2kmBgIXiKVSB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOFUpibN6qqQFwS2x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz0hyF42eCAfD-UMD94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
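The raw response above is a JSON array of per-comment records, one object per coded comment, keyed by the same IDs used for lookup. A minimal Python sketch of parsing such a batch and indexing it by comment ID (the `index_codings` helper and the dimension-completeness check are illustrative assumptions, not the tool's actual implementation; the two sample records are copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response above; the full batch has ten.
raw = """[
  {"id":"ytc_UgwY7-7DyTErMzSKxiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzw7gGwHB3ZVazbdhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID.

    Records missing the ID or any coding dimension are skipped, so one
    malformed entry does not invalidate the rest of the batch.
    """
    by_id = {}
    for record in json.loads(raw_json):
        cid = record.get("id")
        if cid and all(dim in record for dim in DIMENSIONS):
            by_id[cid] = {dim: record[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgwY7-7DyTErMzSKxiJ4AaABAg"]["policy"])  # → regulate
```

With the batch indexed this way, the "Look up by comment ID" view is a single dictionary access per coded comment.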