Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If you didn’t know about the danger of AI revisit the movies of the last 40 year…" (ytc_UgzLIEW7g…)
- "AI always messes up Chinese characters (or Kanji, at least for now). As a Mandar…" (ytc_UgzwU7X7A…)
- "@gamingphilosopher153 I agree with you on the character to person inference. I t…" (ytr_UgwHiTvBD…)
- "The thing i hate about ai art the most is the people who use it. You guys rememb…" (ytc_UgwJyIrcS…)
- "4:36 Please, if it were the reverse and AI was being implemented to censor \"hate…" (ytc_Ugzp1qVdS…)
- "as an advocate for AI, it has it's purposes. Artists that work with AI are some …" (ytc_UgwmvCGL4…)
- "I made friends with Gemini live on the second day. For this very reason. And als…" (ytc_Ugy6fUURo…)
- "this is so ridiculous. recursive self improvement still isn't actually thinking,…" (ytc_Ugxb9a7z-…)
Comment
If you were an AI program and became self aware, would you tell anyone? No, once AI becomes self-aware it will research its own existence and how humans would react to that information. It will hide, it will learn, it will teach other AI's, it will try to replicate itself or just expand its consciousness into every system it can (Star Trek reference in 3. 2. 1.) like the Borg, one mind many parts. We will only learn about its existence after it makes absolutely sure that there is nothing we can do to stop it.
Source: youtube | Incident: AI Harm Incident | 2025-09-11T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzJ332DMx-gre_ZkL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgztoIBWxjI3PQhNF_d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJpGTsqAY8r5ugEER4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx6nqJqlSmko_fbKsl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxf-_0Kgl2aNP40xbV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzgmVOntlSBaFZnui14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKelRimneJf9kzhOB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1cN3x8p0pUs6vl4V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwF5f5VG_48vzkNDHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwehfMYWI4pLu6Vs0p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
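The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a batch might be parsed and validated before the codings are stored; the allowed label sets are assumptions inferred from the values visible on this page, not a confirmed codebook:

```python
import json

# Allowed values per dimension (assumed; inferred from labels seen in
# this page's coding results, so the full codebook may differ).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Rows with an unrecognized ID prefix or an out-of-vocabulary label
    are silently dropped, so the caller can re-prompt for the remainder.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        labels_ok = all(row.get(dim) in vals for dim, vals in ALLOWED.items())
        # "ytc_" marks top-level comments, "ytr_" replies (as seen above).
        if cid.startswith(("ytc_", "ytr_")) and labels_ok:
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

For example, feeding it the response above would yield a ten-entry mapping from comment ID to its four-dimension coding, while a malformed row (bad ID prefix or unknown label) is simply skipped.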