Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The vision: Humanity lives communally and in peace with nature, on the outskirts…" (ytc_UgxAgBwcq…)
- "It should be obvious that Ai is (literally) a soulless mimic of the human condit…" (ytc_Ugx-7bIRC…)
- "The thing is that, if you would put this up on a regular page with text about th…" (rdc_mtgfht9)
- "For reasons that are my own and I wont go into, I do not believe that humanity w…" (ytc_UgxZIdgf9…)
- "The people can hit the industrial and manufacturing and production sectors and l…" (ytc_UgxLdNUU1…)
- "Britain will not have AI future technology as it requires a lot of energy and wi…" (ytc_UgwNL7F5y…)
- "In stablecoins, so yea, which is another reason I'm bullish on crypto as its th…" (rdc_o3hcolf)
- "Did anyone else spot the AI voice used to read the copy in the news report snipp…" (ytc_UgwYqwzZV…)
Comment
It seems to me that there is a strong disconnect from reality here. The assumption is that people are broadly ready to move beyond their basic needs and strive for some kind of “lofty” or abstract goals. But this applies to a very limited group of people. The majority, frankly, would prefer a simple, comfortable existence, unburdened by high ambitions or complex meanings.
Therefore, in the confrontation between AI and human intelligence, the main risk is not that AI will physically destroy humanity. It is far more likely that it will render humanity unnecessary and irrelevant. Not because people will disappear, but because the meaning they create and carry will cease to be needed or valued.
Source: youtube · AI Governance · 2025-12-14T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyhfLwoZdDegIiXZmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzac2vfqBChAz5Z22d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy6qIocFihJ7o4PreB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2-hAvZejL7ZaPP4V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyYKeDir91jGBS_9pl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSfYU_z3Phxy_3EVN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxC_FpFCVoRln9uYdV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyRiC3XiXGyrumATnF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzkRZK1XEk-UfueSG54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMZmErOeMygkP5tRZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
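A raw response like the one above can be turned into per-comment codes with a small amount of validation. The sketch below is a minimal, hypothetical parser: the four dimension names come from the coding table above, but the full label sets are an assumption inferred from the values visible in this sample, and unexpected labels are coerced to "unclear" rather than rejected.

```python
import json

# Label sets inferred from the samples shown above — an assumption,
# not the coder's authoritative schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "resignation",
                "mixed", "approval", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into {comment_id: {dimension: label}}, coercing any label
    outside the expected set to "unclear"."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                codes[dim] = "unclear"  # tolerate off-schema labels
        coded[row["id"]] = codes
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # fear
```

Coercing rather than raising keeps one malformed row from discarding a whole batch; the "Coded at" timestamp in the table would be stamped at this point, after parsing succeeds.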