Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I don't like ai art,I also don't consider it real art, it takes all the effort o… (ytc_UgwVgQuAf…)
- Can a submarine swim? LLMs pass almost all reasoning tests we can throw at the… (ytr_UgwhJHE0X…)
- Funny thing is, Trump’s tarrifs would have this kid working in a factory making … (ytc_UgyNotmPO…)
- Why are people trying to treat a LLM like its alive just because they gave it a … (ytc_UgyOBUdTn…)
- Now I empathize with the ppl who were cheering and applauding the destruction of… (ytc_Ugw2s0ARf…)
- boii ts so tuff boiiii ai shocked artist rocked ⁉️‼️🥭🔥🤑 all seriousness tho thi… (ytc_UgyAouABz…)
- Yet the evidence of AI "self" training thus far has been degradation of output i… (ytc_UgxEzoavf…)
- Yes ai is stealing when replicating styles, there's a large difference between s… (ytc_UgxCVV-tY…)
Comment
There is a cure for those people who believe AI is sentient, and it's called medication, which is generally available at all good chemists. These hard-core Star Trek card collectors, try so hard to convince themselves that Ai has consciousness, when in actuality it’s a very fast information harvester, that has the fundamental flaw of struggling to differentiate between data and instruction.
Renowned mathematician Sir Roger Penrose gave an eloquent talk of this subject, that busts the brain candy bubble that so many are being educated in to believing in, just as cloud computing are not white fluffy things floating high up in the sky, but rather on a corporate severs harvesting away.
Source: youtube · Video: AI Moral Status · Posted: 2025-07-10T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxG_2dOGXGrdzrtVAt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz00bDi0EdKRSjcQLp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzDYjjSVZBKQ8_sM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwJP3siMjow47Ia_Qh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxEVXq3ILCJ6wNb73t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxG2yiFitT0naZkTDh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwv7ER5lSUykkj6H8B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx_W9NyEu8jBHmD7Kx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzbmdGC59Wb34mC9I54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxCCnBiHZ1uTi2u3U14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
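The raw response is a JSON array with one object per coded comment, keyed by comment ID, with four coding dimensions. A minimal sketch of how such an array could be parsed and validated before lookup — the allowed category values below are inferred from the examples shown on this page, not from a documented codebook, so treat them as assumptions:

```python
import json

# Category values per dimension, inferred from the samples above
# (an assumption, not a documented schema).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "unclear"},
    "reasoning": {"unclear", "mixed", "virtue", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "mixed", "approval", "fear", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping invalid rows."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # a row without an ID cannot be looked up later
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep only rows whose every dimension is a known category.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[cid] = codes
    return out

raw = ('[{"id":"ytc_UgxCCnBiHZ1uTi2u3U14AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_UgxCCnBiHZ1uTi2u3U14AaABAg"]["emotion"])  # outrage
```

Indexing the parsed rows by ID like this is what makes a "look up by comment ID" view cheap: one `dict` access per inspection rather than a scan of the raw output.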