Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Well, at least you'll have the satisfaction of defeating AI opponents while dodg…
ytr_UgzvFVM8n…
While there certainly are limitations this is simply not true, tools like claude…
rdc_nbhphzr
That’s not what happened. AI identified this man. A judge signed a warrant for t…
ytr_UgwDn1UYY…
LLM's are dangerous because they are trained on unfiltered Internet data. It lea…
ytc_UgyqBsep7…
Lets master self driving cars first before we put it in an 80,000lb machine that…
ytc_UgyzbtQNO…
What's funny is that the biggest streamers like XQC and Asmondgold could be repl…
ytr_UgyfNW8fU…
Every time there's a question brought up of "can self driving cars handle x?" In…
rdc_cpn8te0
A beautiful bust
Maybe they'll decorate parks and plazas like this
What a busybody he is
The …
ytc_Ugz2h2pFU…
Comment
GETTING THIS OUT
as of this moment in time
Humans are to know that a drone called karbul has killed the first human! it had received no orders from an actual person & overid & also hid the evidence...
Also
Four AI robots have killed 29 scientists in Japan!!!
I mean we have only just begun and look already..
ANOTHER TRUTH
Take the alphabet & start with
a=6
and then add 6 for each letter, now take the number associated with each letter of the word
C O M P U T E R
so,
C=18 +
O=90 +
M=78 +
P=96 +
U=126+
T=120 +
E=30 +
R=108
=666
youtube
AI Moral Status
2025-08-18T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw_WsuXF_bXQ8mF3Cd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmPd8JdmAi5ZdvS4h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzmON0wOlrj--jtBBJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwdPzzqKqdX6L5y7qB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyyid_ghsQv4PoPcc54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFM0NwirVXvgBR7cl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIV4F31q63ip-MhcZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhiAiSOXcf_ecu3S14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzDQOZwoLmKpjEXfM14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8pUcJizUr0pdQxh94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
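The raw response above is a JSON array of one code record per comment. A minimal sketch of how such a batch can be indexed to support the "look up by comment ID" view, assuming the field names shown in the response (the two records below are copied from it; this is not a definitive schema for the coder):

```python
import json

# Raw LLM response: a JSON array of per-comment code records.
# Records copied verbatim from the batch above.
raw = """[
  {"id": "ytc_Ugw_WsuXF_bXQ8mF3Cd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz8pUcJizUr0pdQxh94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]"""

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up a single comment's coded dimensions by its ID.
row = codes["ytc_Ugz8pUcJizUr0pdQxh94AaABAg"]
print(row["responsibility"], row["policy"])  # company regulate
```

Validating each record against an allowed-value list per dimension (e.g. rejecting an unknown `policy` value) would catch malformed model output before it reaches the inspection view; the value sets seen in this batch are only the ones the model happened to emit, not the full codebook.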