Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgzhpXii2…: "@YolkieYolk I mean, I got to make the promt, put in every single detail I want …"
- ytc_Ugyt73CwE…: "Humans really want to be special... they want to be above animals, yet being mam…"
- ytr_UgzPa4lwG…: "@lwjpilled they refuse to accept AI art during competitions and do not repost t…"
- ytc_UgzZMLwiL…: "Yes thats a detailed button. But i also have older cloth with basically a…"
- ytc_UgyDmMIMP…: "Have some more Mary Jane and God help us all because the robots(AI) will take ov…"
- ytc_UgxShMPWG…: "He just proved AI art has no soul. That clip of the guy and girl is the most bas…"
- ytc_UgyL7y2IB…: "I think they're desperate to make AI viable and not be a complete money sink. Ev…"
- ytc_UgxxeFmsE…: "One thing I like about AI is that they won’t bring in their political and person…"
Comment
AI isn’t going to replace 90% of jobs or become some uncontrollable threat. It fully depends on electricity, the internet, servers, and limited rare minerals. Without that fragile infrastructure, it simply stops existing.
The real risk isn’t a runaway super-intelligence it’s our modern system itself: too complex, too dependent on technology, and unable to function without stable energy.
Humans need to relearn the basics: real skills, autonomy, creativity, survival, and community. The future isn’t about a race to super-intelligence, it’s about humanity remembering how to be human.
Source: youtube · AI Governance · 2025-11-28T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
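Each coded record assigns one value per dimension. A minimal validation sketch, where the allowed value sets are inferred from the samples shown on this page (hypothetical; the actual codebook may define more categories):

```python
# Validate a coded comment record against the coding scheme.
# NOTE: the allowed values below are inferred from this page's samples,
# not taken from an official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "outrage",
                "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "none", "emotion": "indifference"}
print(validate(coded))  # → []
```

A validator like this catches the common failure mode where the model invents a label outside the scheme.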
Raw LLM Response
[{"id":"ytc_UgymyYlEKzfd80auW914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2G8nKwQCVoehktuN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZfZS9EbQORa-5RVB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn6ufb43Wo0eCPHlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZ2q2BLlgEYLEhE8d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyer1gSElGazA5MGK54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJ_nZu5MopVwq4BjB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdfiGYV6-HVRW_K7N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWhLUA-80CfBRZTIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHVDghiedvkYGvAkp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]