Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Just think what if the robot takes this destroying human thing Into its command …" (ytc_UgyX-Fh5B…)
- "Such arrogance. I can name many jobs that will not be replaced by AI in 5 years…" (ytc_Ugwn9wlwr…)
- "We're totally going to have battle mechs in the next few decades aren't we? We'…" (rdc_ezfeqjh)
- "So if society turns to giving a simple system for providing products and food et…" (ytc_Ugyg3C57M…)
- "People are spending more money for computers than feeding kids. Ai should eventu…" (ytc_UgxP8_u81…)
- "I'm not an artist by ANY means, but AI "art" is so stupid man Learn to draw Or …" (ytc_UgzPEvWkB…)
- "I would rather sell my soul to Valentino to (from Hazbin hotel) then show my C a…" (ytc_UgygStGyC…)
- "In all seriousness, I don't believe kids should be allowed such easy access to u…" (ytc_Ugx2BkOuR…)
Comment
Not a single one of your presuppositions is obviously correct. The human brain has been designed by one of the most powerful algorithms in the world, the genetic algorithm. It's already near perfection for energy efficiency. Your brain uses about 40w of power. How much does an OpenAI datacenter use? And that datacenter is stupider than your brain. Also, we already have superhuman intelligence in the form of LARGE GROUPS OF HUMANS like openAI or Google. 80,000 engineers are smarter than a human. So what? The question is will the datacenter be smarter than the group of humans, and will it use less power or more.
youtube · Viral AI Reaction · 2025-11-23T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3NggXDMRM3in86KV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgximSBe6DiYmtgzRcR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwFi2WS1DmKIHK8FsV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzr7Eib6FAbL6W2Kix4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrE0ShyxVTqY0DU594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzXRoPAB19W4LThyJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIGNhYvgMPUS578tV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjG3YkXvoOhndHzPV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKPrtiA8mXqqiOlgp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyPM7vnUHW2x8MWYp14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
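The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a batch could be parsed, validated, and looked up by ID in Python; the `ALLOWED` codebook below is inferred from the values visible in this record, not from the project's actual coding scheme, and `validate`/`by_id` are hypothetical names:

```python
import json

# A raw LLM response in the same shape as the record above (shortened to two entries).
raw = """[
 {"id":"ytc_Ugw3NggXDMRM3in86KV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgximSBe6DiYmtgzRcR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]"""

# Allowed values per dimension, inferred from this page; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "approval", "resignation"},
}

def validate(records):
    """Return the IDs of records whose coded values fall outside the codebook."""
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec["id"])
                break
    return bad

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # supports the ID lookup shown above

print(validate(records))                                    # no out-of-codebook records
print(by_id["ytc_UgximSBe6DiYmtgzRcR4AaABAg"]["emotion"])   # the coded emotion for that comment
```

Validating against a fixed value set like this catches the most common failure mode of structured LLM output: a syntactically valid response that invents a label outside the coding scheme.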