Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples — click to inspect:

- "Plus, there's a lot more artistry and intention in photography vs AI illustratio…" (ytc_UgxZm3tNT…)
- "The corporate boards elites need to read up on French history in the 1780s-90s. …" (ytc_Ugz4Jdy-V…)
- "Sentience is not why AI is dangerous.. PEOPLE are why it is dangerous. Sentience…" (ytc_UgzMaiUra…)
- "How does this dumb ahh lady call an ai bot racist it’s not even a person nor did…" (ytc_UgxaJJG8r…)
- "Ai art developers have never been interested in making a computer that creates a…" (ytc_UgyLIc_wc…)
- "Please stop having children create AI buddies. That’s like introducing minors to…" (ytr_Ugyeq4ULs…)
- "Does the AI correct students misconceptions or guide them in the thought process…" (ytc_Ugzb3UHvH…)
- "People made mistake... AI makes terrible... Ek subidha ke andar bahut badi asubi…" (ytc_UgzCAyWiw…)
Comment
"Lord, what fools these mortals be!" (Puck, A Midsummer Night's Dream, Act III, Scene 2). The biggest danger from branded AI is the depth of human gullibility. Companies used to blame their computers for billing mistakes. Then they found out that their customers had come to believe that computers were smarter than people. That's when they started pointing to the computer billing information as definitively and irrefutably correct, thus giving them leverage to bilk their customers. No, none of our machines are in the least bit sentient, and sometimes I have questions about the humans.
youtube · AI Moral Status · 2025-07-09T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgTUaiJcXV1_JqMJp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy0mYUg6At8jxWYtPR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyD9VKr5vlM6b2TprR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgylzOrzTy4W5UQa3S94AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzgwLqyzP1xWneNZT94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz-io3y9N4UXo0SKqB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyjvi62RD3KgSDZjTZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw_J9FI541TeWG3X8x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz9B2rVkOThidpghKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz3-tV33xDiCF4GXG94AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
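The lookup described above can be sketched in a few lines: the raw LLM response is a JSON array of per-comment codings, so parsing it and building a dictionary keyed on `id` gives constant-time lookup by comment ID. This is a minimal sketch, not the tool's actual implementation; the `index_by_comment_id` helper name is hypothetical, and the sample data is a two-row excerpt of the response shown above.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_UgxgTUaiJcXV1_JqMJp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzgwLqyzP1xWneNZT94AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index its coding rows by comment ID.

    (Hypothetical helper name, for illustration only.)
    """
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzgwLqyzP1xWneNZT94AaABAg"]
print(coding["reasoning"])  # deontological
print(coding["emotion"])    # mixed
```

A dictionary keyed on the comment ID matches how the page links each sample back to its coding: one parse of the response, then direct lookups for any ID the user inspects.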