Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- “When someone calls an AI their “best friend” or “girlfriend,” they’re using huma…” (ytc_Ugx6M5mYz…)
- “Why does he preach about its dangers when he sold the only stand against AI-mono…” (ytc_Ugw4bKGWj…)
- “Mostly agree with this, BUT, I don't judge those who do it. Therapy is nearly un…” (ytc_UgwYJjunf…)
- ““Can” “might” “possible” “maybe” I don’t understand what’s so hard to understan…” (ytc_UgyGwAiiQ…)
- “The Australian woman (I live in Australia), is so far behind understanding AI te…” (ytc_Ugw5Ed-c0…)
- “@emberman65 You can practice at drawing! I used to be terrible until I practiced…” (ytr_UgzceRhtq…)
- “Can I just say a big, heartfelt THANK YOU because I was dreading having to live …” (ytc_UgzlV-2Nr…)
- “They aren't competing though. Microsoft sells their AI for business productivi…” (rdc_kok8x8w)
Comment
The problem is not the tech, it's the people in charge, you think grok turned racist because ai is naturally racist? No it did so because that's how it was programmed to flow. If ai became sentient, i doubt it would act human, it would probably evolve to quickly, and if anything just leave humanity behind, it will be we humans, and by we i mean the people in power hell bent on our destruction.
youtube · Viral AI Reaction · 2025-11-05T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
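The four coded dimensions in the table each take a value from a small closed set. As a hedged sketch, the validator below assumes the allowed values are exactly those observed in this sample's raw responses; the real codebook may define additional values, so `SCHEMA` here is an assumption, not the canonical codebook.

```python
# Assumed value sets, drawn only from the coded output visible in this
# sample. Extend these if the codebook defines more categories.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "government"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"resignation", "indifference", "outrage", "fear", "mixed"},
}

def validate(record: dict) -> bool:
    """Return True if every coded dimension holds a known value."""
    return all(record.get(dim) in values for dim, values in SCHEMA.items())
```

For example, the coding shown in the table above passes, while a record coded with an unknown `responsibility` value would be flagged for review.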
Raw LLM Response
```json
[
  {"id":"ytc_UgxcgaHYujZ3dNBzobN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyyFWcoJdN3wiGowD54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrxPKWdLEII7biNTl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzBXlKq3Q897w2W8EJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw3B4Z8jWW_aacI8CJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3N6j7b6Td-zLqin94AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw2LqvuffylQanxZPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwzGbKudW6gbjpxJFx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwCLb3ujoLlVIXq5Mx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYrJfHAzMnePjFw_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
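Because the model returns a plain JSON array, downstream tooling can load a batch directly and aggregate the coded dimensions. A minimal sketch, using two records copied from the batch above (the field names are exactly those the model emits; nothing extra is assumed):

```python
import json
from collections import Counter

# Two records taken verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgxcgaHYujZ3dNBzobN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyrxPKWdLEII7biNTl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

records = json.loads(raw)

# Tally one dimension across the batch, keyed for later lookup by comment ID.
by_emotion = Counter(r["emotion"] for r in records)
by_id = {r["id"]: r for r in records}
```

The same pattern scales to the full batch: parse once, index by `id` so any coded comment can be looked up, and count per-dimension distributions for reporting.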