Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Ai is dangerous. Here is ChatGPT's plan :
"2028: Dominance - Al controls key s…
ytc_UgyQc3pey…
You have a very valid point, as it is definitely very unethical to obtain other …
ytr_UgwEPDhgm…
I just got an ad for an AI video generator as if YouTube wants me to be on the s…
ytc_UgwWplnpi…
Consumer Protection: States are enacting laws (e.g., California, Colorado) to gi…
ytc_Ugz7EJoXc…
Using a brush to draw something doesn’t automatically make you an artist. I feel…
ytc_UgxYkMA3e…
That's my hope too. Ai can help humanity create utopia for ALL...if it weren't f…
ytr_UgxMC8zNu…
Don't beg for job and don't buy any product from Amazon then automatically doubl…
ytc_UgyU-k_wY…
It would’ve been easy to say computers would’ve made white collar jobs less valu…
ytr_Ugzut7rAu…
Comment
We need to start a petition (if it doesn't exist already) to force platforms / software developers etc. to STOP forcebly putting AI onto our devices. I use WattsApp to stay in touch with family. Meta has installed it's AI into it. I can't remove it. It's clearly scanning my conversations and send data to Meta whether I use the AI or not. I WANT THE CHOICE to have it installed, or not. It should not be installed onto devices and plugged into software by default. It's so wrong. Windows has it. Apple has it. All our phones has it etc. etc. Stop it! Just Stop!
UPDATE*** no, a petition doesn't exist. There are no laws around what they can do with AI. That's scary AF!!!
youtube
AI Moral Status
2025-07-26T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyM2EVnQ1C4Dul5Ek54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzi-J9J4UsLI03go7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2U4jfR48n9ubG0Ct4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxRq8wGH87O1g6_Cl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxIShkJM3ZnD1xaAyd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwbMvCFQe_AU_-PK8l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxxx9x3Z_PMsjaS-qp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyj87lBuFIPuUKmsUl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwbhFB1yu9lDHghdn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcjsfREla-yyG-hv54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
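The raw response above is a JSON array of per-comment codings, one object per comment ID. A lookup like the one this page performs can be sketched in a few lines of Python; `index_by_comment_id` is a hypothetical helper written for illustration, not part of the actual pipeline, and the payload is truncated to two of the rows shown above.

```python
import json

# Two rows from the raw LLM response shown above (truncated for brevity).
raw_response = """
[
  {"id": "ytc_UgwbMvCFQe_AU_-PK8l4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwbhFB1yu9lDHghdn54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text):
    """Parse a batch coding response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgwbMvCFQe_AU_-PK8l4AaABAg"]
print(coding["emotion"])  # -> outrage, matching the Coding Result table above
```

Each coded dimension (responsibility, reasoning, policy, emotion) is then a plain dictionary field, which is what the Coding Result table renders.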