Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Nothing we have today will ever ramp up to come close to that ability. LLMs are …
rdc_kyh5ho6
I am more worried about my future dream of being an artist with this AI thing be…
ytc_UgxsOuotr…
Sorry Max, number 1 is Ukraine it’s all Ai generated folks now so being one suck…
ytc_UgzKbjkzT…
I get so angry seeing naive boomers believing AI videos all over u tube. Thank y…
ytc_UgwtnclYb…
It’s actually horrible to think in a few years or less everything could be AI…
ytc_UgyL_FIvm…
The last time unemployment got to 30% the economy crashed - the 1930s. Once tha…
ytc_Ugzn3yK3y…
Humans will never be able to create something that has its own will, even the sm…
ytc_Ugz5QEUSe…
I prefered AI robotic kid, for those who cant have kids it would be great!…
ytc_UgwX8GDL5…
Comment
Here's my take:
Regardless of whether AI will directly harm humanity, the resources that the development process is consuming is harming humanity. For this alone, I feel it should be stopped until we can produce these resources at a level that humanity is not affected.
We have Data Centers that are consuming huge volumes of land, water, energy and financial support that are needed to provide for our current populations to say nothing of the future population. Who really benefits from the development of AI? The general population? A limited group of people? Or a select group of people.
Unfortunately we can see from our current political situation that the only thing that is assured is, Death, Taxes, and Corruption. I believe this is why the popular opinion of AI trends towards negative. We humans know we are corrupt. How then can we make anything that is not corrupt?
youtube
AI Moral Status
2025-11-05T00:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgztBbiG85c-1g2F0xB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgylSPOKDURX-zUOnoB4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzdTSPqGcLA4aaEliZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzJ55MxLLBgB0gLFAV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzxPXBbQ5FhGPkc1UN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3AoAH-hITwsj4_bJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwxQYCSYHAlHvq3wUt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwYv7samR-oMnjN3Ll4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxVQVka6UbHPNDRRd14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyuVlEGR0Zaus7p0wF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```
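Looking up a coded comment by ID amounts to parsing the raw model output (a JSON array of one object per comment) and indexing the records. A minimal sketch: the `index_by_id` helper is illustrative, not part of the tool, and the two records below are copied from the raw response above.

```python
import json

# A shortened version of the raw LLM response: one JSON object per coded comment.
raw_response = '''[
  {"id": "ytc_Ugz3AoAH-hITwsj4_bJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyuVlEGR0Zaus7p0wF4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coded record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
record = codes["ytc_UgyuVlEGR0Zaus7p0wF4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # outrage
```

The same index also makes it easy to cross-check a displayed coding result (like the table above) against the exact model output it came from.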