Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
They openly admit that they are at a wall and cannot improve accuracy without in…
ytc_UgxC01rF9…
21:36 Trying to say humans could be smart enough to keep up with ai would be li…
ytc_UgxLpunVj…
What he actually means is that Trump and his little cronies have messed up being…
ytc_UgyNcOH4w…
Democratizing the field and lowering the barriers to enter isn't an argument.
I …
ytr_UgwbmYHEk…
It lies in human hands to create the good or the bad. The great risk will be…
ytc_UgwF8b1A2…
Ok, so:
1. AI art is soulless, it will never be as good as a human artist
2. AI…
ytc_Ugy5rhCjj…
When asked to cite a single disabled artist they flounder, they don't care about…
ytc_UgxW5fq0W…
Far from having machine panic. AI is critical in my career, and it’s the sole th…
rdc_mlh6esm
Comment
JUST SO IT'S CLEAR, THIS IS SUPPOSED TO BE A TONGUE-IN-CHEEK POSTING.... FOR THE MOST PART
1 The people designing and building (shouldn't it be growing?) are too greedy to worry about safety protocols; their concern is winning the competition while filling their pockets, or, in their arrogance, they believe they are the chosen, the top of humanity, the most intelligent, god-like in their own minds, and because they built it, they can control it.... when indeed they cannot. And the end result is humanity will pay for their very expensive mistake.
2 If we are a simulation, best bet a super AI is what is in control... so why on earth would we be building one in our simulation? And why on earth would it allow us to even contemplate ways to hinder it? What if someone did figure out a caveman solution that the smart AI just ruled out as too stupid? Well, there went that chance... Of course, I remember someone had created a computer that maintained its energy via bio waste... lol... don't see how anything could go wrong with that concept! Oooh wait, what if the thing in control is that... then we are... biofuel, and death would be... ok, I want out of the simulation!.. lol. The best part of the simulation theory is that all arguments against it can be squashed by leading the argument back to it's a simulation. The end is the beginning, which is the end. What was will be, and what is will pass to be what was.... JUST IN CASE anyone knows how to order an instant upgrade?
3 🎵🎶WHO WANTS TO LIVE FOREVER? NO-OOO-ONE🎶 Life would have no value; we only value things because the supply is limited, and once it's unlimited... it has no value. So no, I would not like to live forever. Not to mention... look at the damage we have done in general, so... what damage would we do if we were given, say, 10,000 years? Would there be a world left? As far as good or doing good, mankind as a whole is designed to be selfish and greedy; it's part of your basic self-preservation urge... so would there be any good left? Ethics and morals would be gone. Man may be born innocent, and there's the crux, so to speak. With no births, which I think is humanity's way to renew the supply of all things good, innocence, kindness, wonder, generosity, compassion.... no birth, no refill... nope, not a world worth existing in. OR I guess, due to our instinctual drive for self-preservation at any cost, we just might stop some of our insane practices such as destroying the planet, burning through our natural resources like crazy madmen just because we can... warring and killing each other just because we truly dislike or fear things that aren't clones of ourselves... but by then we'd just have the attitude that the universe is a big place: now we have the technology, so we can just find another Earth and go there. There's a reason humans weren't designed to live forever, so maybe that alone should be reason enough not to.
youtube
AI Governance
2026-03-31T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy5pIE9HkxIa9ZmiMF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-4bRS7gwuwvt71rN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxb0_zUJWZArqEHmk14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzGxb_8OGjsH-aYlt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxp7QcDkJBGHOZPqdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxGmw8VpbKIoVmOp8B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwfIaMuNJqiUwpExPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFGoe-ilpvLxtBf4F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_1aPnZBDqHUani4h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyLQooXPCOtULsyrSx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
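The "look up by comment ID" view above amounts to parsing one raw LLM response (a JSON array of per-comment codings) and indexing the records by their `id` field. A minimal sketch of that lookup follows; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the displayed output, but the function and sample data here are illustrative assumptions, not the tool's actual code.

```python
import json

# Illustrative sample mirroring the raw LLM response format shown above
# (two records copied from the displayed output; not the full response).
raw_response = """
[
  {"id": "ytc_UgxGmw8VpbKIoVmOp8B4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy5pIE9HkxIa9ZmiMF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse one raw LLM response and index its coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up the coding for a single comment, as the dashboard's
# "Coding Result" table does for rdc_mlh6esm / ytc_UgxGmw8V... above.
codings = index_by_comment_id(raw_response)
rec = codings["ytc_UgxGmw8VpbKIoVmOp8B4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# developer virtue regulate outrage
```

A dict keyed by comment ID makes each lookup O(1), which matters when a batch run produces many responses and the inspector needs to resolve an arbitrary `ytc_…`/`rdc_…` ID quickly.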