Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a specific comment by its ID, or pick one of the random samples below to inspect it.
- "The people running massive companies that are jumping on AI without a plan on ho…" (ytc_UgxpcPNLe…)
- "Sue the police, sue the hotel and sue the fukking AI BS company for millions…" (ytc_UgzyBHG4y…)
- "My order at Taco Bell was taken by an AI bot. The food at a hotel restaurant was…" (ytc_Ugxwz98KA…)
- "I prefer real art to ai pics because of pain. Prompter just typed a few words, b…" (ytc_UgwJeMCrS…)
- "There is no way they are going to agree on a ban on AI. The studios are saving …" (ytc_Ugxg9PqpH…)
- "The Democratic party wants the AI to have voting rights. They see AI as another …" (ytc_Ugw0N6by9…)
- "KILL IT NOW💀" / "KILL IT NOW I DONT WANNA BE 🤷" / "DESTROYED BY A FREAKING ROBOT👁👄👁…" (ytc_UgyFZNrxc…)
- "Human greed knows no bounds. We will literally extinct our selves so few can get…" (ytc_UgyTKVfAU…)
Comment

> I asked a couple of chatbots about this. Iirc, chatgpt didn't care about please but (then) Bing did. Or the reverse? They are looking out for signs of bad intentions so politeness could help defer massive guardrails being erected.

Source: youtube · Video: AI Moral Status · Posted: 2025-04-30T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzsOW_U79JUE8Blk594AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz5Nmo45qwidvyVWKh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwxttUqa66aKxgT7HN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxiVWDc9zj0WfYSoL94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPCcuG1LqcxkaUYMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw0HL0G63bC6wwlV3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwvXKhbRRK0k92q1YV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugxukz_of0dO6uYON1Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzMgBr-6BfrPPwIwKt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz1zEbAsIfAozkp81R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
```
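As a sketch of how a raw batch response like the one above could be consumed downstream (the function name and validation sets here are illustrative, not part of the tool; the allowed values are only those observed in this sample, not necessarily the full coding scheme), the JSON can be parsed, lightly validated, and indexed by comment ID for lookup:

```python
import json

# Dimension values observed in the sample response above.
# Assumption: the real coding scheme may include more values.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "developer", "company", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "liability"},
    "emotion": {"outrage", "approval", "indifference"},
}

def index_llm_response(raw: str) -> dict:
    """Parse a raw batch response and index the coded records by comment ID."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        # Flag any value outside the observed coding dimensions.
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        index[rec["id"]] = rec
    return index

# Look up one coded comment by its ID (first record from the sample above).
raw = ('[{"id":"ytc_UgzsOW_U79JUE8Blk594AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"outrage"}]')
coded = index_llm_response(raw)
print(coded["ytc_UgzsOW_U79JUE8Blk594AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes the "look up by comment ID" view a constant-time dictionary access, and the validation step surfaces any record where the model drifted outside the expected label set.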