Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It's not about AGI. It's not about SI (super intelligence). It's about dominant AI. Military and/or corporate.
The race cannot be lost. Humans.
Yes, the race will result in AGI, SI and Dominant AI.
Yes, they are all going to lose. There is no winning the race.
Yes, a unified human effort could retain control. Unified human effort is not a thing.
Odds are....the faster the better. The odds of SI being as stupid as humans when it comes to being and existence are slim. The odds of SI being malevolent are slim. It's simply not intelligent.
The question is, do humans survive the race?
In all likelihood, dominant AI (which would no longer be AI), will be the best thing that could happen to certain humans, other than being saved by an ultimate god.
I say certain humans. There are a lot of humans who would be in big trouble.
youtube · AI Governance · 2025-09-04T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwKBvp0HZ8LXlm6XdB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgywNX1ueTFyvq3RiH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy3GMTpwA2B3stEbpR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugz_1p0YkJb-xwkAVOx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzGWi6jsdx9YS8NCKx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwMUepTkP4yPYMOcQB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwZyBtfJuO-Ax2NE794AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugzte4Xeo9U6aVp301x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxVpfbWUlVNRCPojzV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugx94qFqP0ljlK0SHfp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```