Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "It seems the crux of the problem here is that art is valued using dollar signs $…" (ytc_Ugx_ND-iJ…)
- "We need to stop the technofascists. We need to stop AI how it currently is. We n…" (ytc_Ugx26FP7X…)
- "Yeaaaa I could give 2 shits about the bullet proof truck, don't be giving a damn…" (ytc_UgytcYyTG…)
- "AI generated "art" is like speedrunning asset flip games and expecting them to s…" (ytc_UgynZS64v…)
- "Gemini 3 AI replied to my evolving YT comment essay on DIODE ARRAY I submitted v…" (ytc_UgxSjBnaL…)
- "Management be like: to cut costs assign control of the kill switch to the AI…" (rdc_l5us6lk)
- "Quiet kid memes: Who are you? / Surviving the ChatGPT uprising memes: I'm you, but…" (ytc_UgyZctXFm…)
- "Imagine a political tory hack like him worryingly about AI taking peoples' jobs …" (ytc_UgxOSrqZW…)
Comment
We (humans) are creating AI so OF COURSE its going to look for shortcuts, decieve, and straight up lie. AI will be great for millionaires and billionaires but awful for the middle class and poor.
Its basically a microcosm of america only smarter 😅
I do not see a future that is good for humans once AI is introduced. We simply cannot control it once its out there. The wrong people are in charge of its regulations. The Trump administration has introduced a law that limits the kinds of regulation that can be placed on AI companies over the next 10 years. Regulation and oversight are the most important tools we have right now in its creation. Once its out there, we cannot put pandora back in the box
youtube
AI Moral Status
2026-03-01T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxw2K86Z4ZJtLKpd2J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxvOvhA7A5DdcSQsBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxsUHYd5XdrWcts0v14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyqvJO4RwM1TTcfsc94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy5hgliJbxJIEzf7Vp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxGDy1tYcJmBlyVFTh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyDQIxigCZfV9mRqjh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxa6h7UM0CzSGx3EEF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzoFIik7kZWUelQY_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMRYwqruPfwxkWczV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
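The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how a Coding Result table like the one shown can be derived from it (Python; the function name and variable names are illustrative, not part of the pipeline):

```python
import json

# A single entry from the raw LLM response above, used as sample input.
raw_response = """
[
  {"id": "ytc_Ugxw2K86Z4ZJtLKpd2J4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Index the array by comment ID for O(1) lookup.
codes = {entry["id"]: entry for entry in json.loads(raw_response)}

def coding_result(comment_id: str) -> dict:
    """Return the coded dimensions for one comment (KeyError if unknown)."""
    entry = codes[comment_id]
    # Drop the ID itself; what remains are the dimension/value pairs
    # that populate the Coding Result table.
    return {k: v for k, v in entry.items() if k != "id"}

result = coding_result("ytc_Ugxw2K86Z4ZJtLKpd2J4AaABAg")
# result == {"responsibility": "distributed", "reasoning": "consequentialist",
#            "policy": "none", "emotion": "fear"}
```

Looking up by ID rather than scanning the array keeps per-comment inspection cheap even when a batch response codes many comments at once.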