Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I still don't think AI/complete automation works, WITHOUT giving up power, AND money, and even generational wealth will deplete/without employees, there is no economy/no one can purchase products and services, the idea of paying only one minimum wage workers yearly pay, to get a machine that can replace all of your employees, sounds great, but in reality, it cant work, UNLESS we shift to RAFPSE)/a Regulated Autonomous Free Post-Scarcity Economy, but that would REQUIRE corporations/wealthy people to give up their money and power, which won't happen.
A Regulated - Autonomous - Sustainable- Post-Scarcity - Economy (RASPE)
,MIGHT be where we go from here; depending on the variable thats the hardest to predict, which is humans, and what they decide to do, humans do A LOT of non-logical things;
The Society im trying to define is one where AI/Robots/automation is involved, without any money involved, but ecological/certain materials/elements, would be hard to come by/limited, but everyone's basic needs would be handled/met, and MAYBE SLIGHTLY more than basic needs, if certain things are available and everyone's basic needs are already met for that month, certain other items might also be available,

but everything would be done autonomously/automatically/without human intervention, mostly, other than, government, voting systems/congress, mayors, governor's, presidents, regulating/regulation, consulting, ethical decision making/systems/optimizing the autonomous manufacturing and delivery systems/infrastructures, to make it better/also to mitigate/avoid/minimize harm, that comes from the over all system/unforseen/foreseen harm/analytics, to a degree/most AI systems are designed to be as efficient as possible/its not necessarily designed to look at certain things,

an example of this would be, if someone killed off all of one animal, the animal that the extinct animal ate, could become over populate, and in turn kill off another animal, because the animals food supply could become limited, due to over population, this is just one example of adverse effects that need to be found/prevented/mitigated;

Also technological/ecological advancements would be valued/anything that betters the over all system, also all of these changes/improvements/advancements, create new variables and COULD cause damage, so these type things also need to be analyzed, but most tasks/jobs would become completely autonomous.
youtube · AI Governance · 2025-09-05T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx2pGKaZzswCZ7MfZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUbL03_xNTid1UukZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxz3tOpGjuCvUibFBd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxwty7iS29Ks731Uux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyOj8GkR-QNAWbfjXF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugzkqqir6i360sxI9Nd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyvfXoi8Rsn9_Wwa_R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWHmwMjBFV8DODCbJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwAYAThnaGGeOZCnqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwm41SjW89ks0krf8B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```