Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugyfonh_p… · Whats with the boobs? How do they know its a girl robot? They just assume its ge…
- ytc_UgwzUjfSz… · I have had a lot of discussions with AI chatbots about humanity's fears regardin…
- ytc_Ugy54_8ct… · 1:22:00 "I don't *believe* it was decieving him" well, i don't believe you have …
- ytc_UgysL7A4-… · We are just delaying the inevitable... it will not only affect artists but also …
- ytc_Ugwo2aKfF… · I hate and love AI. I am an untalented wrench when it comes to drawing and paint…
- ytc_Ugy6ljr8q… · These are quite daring predictions for 2025/09. Anyway, I feel the dates do not …
- rdc_d0eyefr · Source: Heavily Redacted Report / Initial Outlet: Guardian / Secondary Outlet: com…
- ytr_UgyH3KZwA… · @justacutepanda888 ai is not "taking inspiration" dummy. It has data sets, the…
Comment
To be fair, those are some really bad prompts.
The first one was correctly interpreted by the AI, if you say kids, that means human children. If you want a cat with kittens, then you ask for kittens, not kids.
The second prompt is even worse, because AI doesn't always interpret negatives, which is why you should always write prompts in positive, like "make the kids cats". By saying "Not human kids", the AI likelly understood "No, human kids". Since the first image was already what was apparently being asked for, it just generated it again, just slightly changed, which resulted in this funny coincidence.
So no, the AI isn't racist, people just need to learn how to make half decent prompts.
youtube · Viral AI Reaction · 2025-06-22T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzAN5HR5JF-WDonp-x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxQzrPYSyULD-YBGQR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyy6fNFBB9g_F19x0Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugyq6IU1DkGi7H_NSuR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8EJEhd9F-cKH-Krl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz-Y5R5itfLzYo1B3Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw67tQsGJf7Y486qwJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugynvwde5LYCrkvzdTl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxeF_MheLxlE2ONNfl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxe7uzgOcRCLltVH2p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
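The raw response is a JSON array of per-comment codes, one object per comment ID, with one label per dimension. A minimal sketch of how such an output might be parsed and validated before storage (the allowed label sets below are inferred only from values visible in this dump; the actual codebook may define more categories):

```python
import json

# Allowed labels per dimension, inferred from values seen in this dump
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"industry_self", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping invalid rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows with no comment ID
        codes = {dim: row.get(dim, "unclear") for dim in ALLOWED}
        # Reject any label outside the known sets rather than storing bad data.
        if all(codes[d] in ALLOWED[d] for d in ALLOWED):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_Ugyy6fNFBB9g_F19x0Z4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugyy6fNFBB9g_F19x0Z4AaABAg"]["emotion"])  # indifference
```

Silently dropping malformed rows (rather than raising) is one design choice; a pipeline that needs an audit trail would instead log each rejected row alongside the raw response, as this page does.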