Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
In the 1920s it was automated machines, in the 1950s it was robots, and in the 1980s it was computers. The reason we ended up with even more work is that each big revolution brought new products that suddenly needed more people, because the complexity of products rises with the technological possibilities. Yes, these technologies are always disruptive, but people despair because we look at the current market and think: now everything can be done by these machines. Yes, everything known at this point, but then suddenly new products appear, or even more features get added to a product, and it goes on.
The first cars were built by a few guys; a modern car is a high-tech product with several hundred parts and a complex supply chain behind it. Unix was written in six weeks by two people; modern operating systems have literally thousands of developers behind them. When AI suddenly enables people to code faster with less effort, we can also build very complex software, like games that were not conceivable before, and suddenly, instead of hundreds of programmers writing code, you need hundreds of people specifying the programs and what the AI has to do. We go from one meta layer to another.
What is often badly understood is how fast complexity explodes. There was a terrifyingly good example in a book about combinatorial optimization: if you have a drill and want to find the fastest path through n holes, starting from each position, there are n!·n possible routes (if I remember the formula correctly ...). For 2 holes that number is 4, for 3 it is 18, for 20 it is already 4.8×10^19, and for 60 we reach ~5×10^83, which already exceeds the number of particles in the known universe ... ("The biggest shortcoming of mankind is that we don't understand the exponential function"). Think about how this plays out for products: if a product is made of n components which (ideally) each have a single point of failure, the possible orderings in which a product with 80 components can fail number 80!, which is also more than the number of atoms in the universe ...
We now have the ever-growing complexity of products on the one hand and a big portion of people leaving the workforce on the other. It is not unlikely that, despite all the bleak predictions, AI came along at just the right time to even make it possible to manage the development of products in the future.
youtube · AI Governance · 2025-08-07T13:3… · ♥ 5
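The combinatorial figures in the comment can be checked directly. A minimal sketch of the comment's n!·n route count (the function name `tsp_routes` is my own label, not from the source):

```python
import math

def tsp_routes(n):
    # Routes through n drill holes, counted from every possible
    # starting position, per the comment's formula: n! * n
    return math.factorial(n) * n

assert tsp_routes(2) == 4
assert tsp_routes(3) == 18
print(f"{tsp_routes(20):.1e}")       # 4.9e+19 (comment's ~4.8*10^19)
print(f"{tsp_routes(60):.1e}")       # 5.0e+83 (comment's ~5*10^83)
print(f"{math.factorial(80):.1e}")   # 7.2e+118, the comment's 80! failure orderings
```

The numbers match the comment's claims, so the half-remembered formula was indeed n!·n rather than plain n!.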
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzE30HAQluEyeGI76N4AaABAg.AL-l63a0wjsAL0awoXzdZj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgzE30HAQluEyeGI76N4AaABAg.AL-l63a0wjsALX3Yk6V-4Y","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwhT4v_y3p7DhfdMJ14AaABAg.AL-YtvTyzRqAL-ahSFsb2u","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugxl9AzofNxpOOoSiMJ4AaABAg.AKthxHIdv8kAKvtoctnVfW","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgxygOWcmwBzPO6Vg0B4AaABAg.AKtcDnKgHDKALAxAdMQy7I","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxDpai-SqtvuZnqjYt4AaABAg.AKr7y1_AjLJAL7FKSd7kmc","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxDpai-SqtvuZnqjYt4AaABAg.AKr7y1_AjLJAL7FaEu1QWi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz67iDYjKR8HR2ii_R4AaABAg.AKpKozXGMCpAKryVWKi6ng","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugz67iDYjKR8HR2ii_R4AaABAg.AKpKozXGMCpAKwQQ3kD9if","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugz8CQcIgC8fk06zBcp4AaABAg.AKoT9Bpvgv2AKoX44dbtrC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
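Since the raw response is a plain JSON array of coding objects, looking up a single coding by comment ID amounts to indexing the array on its `"id"` field. A minimal sketch, using a hypothetical ID (`ytr_example` is not a real comment ID from this batch):

```python
import json

# Hypothetical raw LLM response: a JSON array of codings,
# shaped like the batch shown above.
raw = '''[
  {"id": "ytr_example", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear",
   "emotion": "indifference"}
]'''

# Index the array by comment ID for O(1) lookups.
codings = {entry["id"]: entry for entry in json.loads(raw)}

print(codings["ytr_example"]["emotion"])  # indifference
```

With this index in place, each row of a "Coding Result" table is just the dimension fields of `codings[comment_id]`.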