Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytc_UgxgNQFEI…: "Hmm. Once the rich have robots and AI, they won't need the rest of us.…"
- ytc_UgzQQD1DH…: "Elites blaming Ai for destroying the world lol. As if they aren't the ones that…"
- ytc_Ugy4uMM9r…: "Can you enlighten me on how AI is going to be a hairdresser or even a nurse???…"
- ytc_Ugy1EIyUf…: "Well obviously the good models weren't trained on shitty code. You're just using…"
- ytr_UgwSkxwfk…: "If it gets any more accurate, greedy ceos and such will replace them with ai lik…"
- ytc_UgxqH5tsC…: "came across this video randomly and omg I love your art style, deff gonna follow…"
- ytr_UgyGeNL8h…: "she tried to burn the company to the ground and plotting to sell it to anthropic…"
- ytc_Ugzprt3yr…: "Oh, that is not good!! Fucking training AI soldiers dude they probably already…"
Comment
> Imagine how easy we suspend our disbelief when we see the economics of, say, Star Trek while, on the other hand, completely lack clarity on whether people are able to share their individual abundance. This (dis)belief is rooted in our current zero-sum understanding of available energy in whatever form - money, commodities, land, buildings, vehicles, etc. An invention of someone could change all that in a single day. Who knows, maybe neural networks will supercharge us to a similar degree in some ways. Though I am skeptical that most people will truly utilize its power to build products or services. Instead, people will do what we mostly do - use it for entertainment.
>
> And I get it, I am afraid of starting a business myself. What if I fail? Thing is, I think the rapid change of events we are part of will force us all to adapt and quick. Or be swept away.
Source: youtube
Topic: AI Moral Status
Posted: 2025-08-18T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
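Each coded comment carries four categorical dimensions. A minimal validation sketch for one record, assuming the allowed value sets shown here (they are inferred from the sample responses on this page, not an exhaustive schema):

```python
# Hypothetical validator for one coded record. The allowed value sets are
# inferred from the sample responses visible on this page; a real schema
# may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "distributed", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

For example, the record shown in the table above (responsibility `none`, reasoning `mixed`, policy `none`, emotion `indifference`) passes this check.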
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4Og5tkqfLTT3uLpx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyBpE-AR0AvH7r9z-N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyirftcM8bjUUkC0_F4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyCPf3G_BoN94SKXTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgymtQs4KyzNRDv-64R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz9pmHEziNiTXJFH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRUe8lihVbZKefpIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyQZGO4rKEQC29Kl1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwuWI89FEeHg0VTpfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzccAU8-DTbOdCM10Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
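The raw response is a JSON array of per-comment codes, so the "look up by comment ID" step reduces to keying the parsed array by `id`. A sketch, assuming the response parses cleanly (real model output may need fence-stripping or retry handling first); the two records are taken verbatim from the batch above:

```python
import json

# Two records copied from the raw batch response shown above.
raw_response = '''[
  {"id": "ytc_Ugw4Og5tkqfLTT3uLpx4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyBpE-AR0AvH7r9z-N4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]'''

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse one raw batch response and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugw4Og5tkqfLTT3uLpx4AaABAg"]["emotion"])  # indifference
```

Keying by ID this way is what lets the page resolve a pasted comment ID straight to its coded dimensions and the batch it came from.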