Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
As a person who has studied AI a little, I can say that there is a situation called hallucination for AI. This happens when AI mixes fiction with reality since AI cannot tell the difference between good and bad, so just as much as it is helpful, it can give us information that it thinks is correct. However, it is not correct.
Platform: youtube
Incident type: AI Harm Incident
Posted: 2025-12-28T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwyLFcu86jJuv3K_u14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz5FjtZ-Grmd4_r1_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgxvbrCuaOPwfIze8Gl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgxaokaRZuICS3BGCxR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgwK5f-b-bqGgj48B814AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},{"id":"ytc_Ugw2j8f3bXhZLmYQhTl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},{"id":"ytc_UgxknD21ok5IJ04ZbYp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},{"id":"ytc_UgwUYYxOvRPXi2p7Ek54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwDsnvnbVcK1AdoPON4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugxo_xfirLKKM-m92Sh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"})