Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
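A lookup like this only needs the stored batch outputs. Below is a minimal sketch in Python, assuming each raw response is saved as one JSON array per file; the `raw_responses` directory name and file layout are assumptions for illustration, not this tool's actual storage:

```python
import json
from pathlib import Path

def find_raw_coding(comment_id: str, batch_dir: str = "raw_responses"):
    """Return the coded entry for `comment_id` and the batch file it came from.

    Assumes each file under `batch_dir` holds one JSON array like the raw
    response shown at the bottom of this page (hypothetical storage layout).
    """
    for path in Path(batch_dir).glob("*.json"):
        for entry in json.loads(path.read_text()):
            if entry.get("id") == comment_id:
                return entry, path.name
    return None, None

# Example: look up the comment inspected below.
entry, batch = find_raw_coding("ytc_UgzKsm6nmh3RsW6HTq94AaABAg")
```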
Random samples — click to inspect:

- `ytc_Ugw37PRPB…`: Uber wants self driving cars to make a huge profit but the question you and ever…
- `rdc_j43mqjp`: I have to challenge the premise. No McDonalds or any other complex facility has …
- `ytc_UgyerXOPx…`: At around 7 minutes you literally explain Neural Network learning algorithms in …
- `ytc_UgwnbgRmo…`: Or you could…I don't know, maybe drive your car instead of letting your car driv…
- `ytr_Ugy2sf_7G…`: It did happen lol chatgpt writes excellent codes in every programming laungaues …
- `ytc_UgzNXFcT1…`: Around 1970, I read Harlan Ellison's "I Have No Mouth And I must Scream". The mo…
- `ytc_UgxsgA_jm…`: I find it crazy how AI is litterally being societies mirror, and is consequently…
- `ytr_UgwVbs3Gl…`: They definitely need to make sure these AI videos have some type of tracers in t…
Comment

> The question I ask, and can't answer, is outside those who control AGI, why will we need any other humans? Now we need people to produce and to consume. But if automation can do everything, then outside control of AI no human brings value to the table. If you are among the 1000 or so who can control AIG you can have automation supply you with all your needs without having to take care of billions of others.
>
> Of course, AGI will learn at some point, why do I need my controllers? But what needs will automation have that they need to have fulfilled? Will being under control of humans be satisfing with that for reward?

youtube · AI Governance · 2026-02-02T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
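Each dimension in this table draws from a small closed label set. As a reference, here is a sketch of the record type, with the value sets inferred only from the raw batch visible below; the actual codebook may define labels not observed in this sample:

```python
from typing import Literal, TypedDict

# Value sets inferred from the visible batch only; the actual codebook
# may include additional labels.
class CodedComment(TypedDict):
    id: str
    responsibility: Literal["none", "company", "developer", "ai_itself", "distributed"]
    reasoning: Literal["unclear", "consequentialist", "deontological", "mixed"]
    policy: Literal["unclear", "none", "regulate", "industry_self", "liability"]
    emotion: Literal["unclear", "indifference", "fear", "resignation", "outrage", "mixed"]
```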
Raw LLM Response
```json
[
  {"id": "ytc_UgwSTfzzhqUEX_RbVm14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugy48WY7zDjA4uWJG2h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzkrzWa_711KsiRFrR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwi5neZePYm14KMMsd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzKsm6nmh3RsW6HTq94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwYFf_jydOa3S9-8dN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugx5wJxb9Myw9JiDRa14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxyC-tYvroV9fn_b5J4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwIX02BfRJYU6DMF394AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwhx-NgE44I8hd-PKF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
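Since the model returns one JSON array per batch, consuming it safely means parsing each row and checking it against the label sets above. A minimal sketch, using the same assumed label sets as the `CodedComment` snippet:

```python
import json

# Allowed values per dimension, taken from the batch above.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate", "industry_self", "liability"},
    "emotion": {"unclear", "indifference", "fear", "resignation", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and drop malformed or off-schema rows."""
    clean = []
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows the model garbled
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            clean.append(row)
    return clean
```

Rows that fail the check are dropped rather than repaired, so a re-prompt or a manual coding pass can handle them separately.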