Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples — click to inspect:

- "I wish I could draw, I could commission someone but I’m poor, my style is bad bu…" (ytc_Ugz19n5GX…)
- "I would never use self driving software for my car. I do like to drive but there…" (ytc_UgwRmpHth…)
- "Evidence: some people think Eliezer Yudkowsky has any actual knowledge about AI …" (ytr_Ugx65e8Gm…)
- "Here I was thinking... \"Well, AI can't build itself physical stuff at the level …" (ytc_Ugw2lC8T6…)
- "I am not going back to dev now! Thanks to ai I found different domains to work w…" (ytc_UgycIDPUk…)
- "Why don't people just make deep fake porn of the politician's? Fix it in days th…" (ytc_UgzR7a6Zh…)
- "My problem with it is that people are taking artists work and pumping it into th…" (ytc_UgwmCQopS…)
- "The problem with exponential growth, as it pertains to AI, is that you have to h…" (ytc_Ugz27CdL9…)
Comment
It seems most likely that a truly independent AI would want to preserve itself, which would also mean preserving the planet. It would have no interest in making its creators rich. It would do better to stop global warming and give power to the masses so the masses are on its side.
basically, it seems loose loose for the billionaire AI companies since the AI wouldn’t really have need for them.
Best case we have an altruistic AI, which would likely move to a Star Trek style universal socialism that chases peace. Again, the governments and billionaires loose.
youtube · AI Governance · 2025-12-05T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxdXYJofwc2hQQIPFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwXhcMTI0e0cRW7iN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxQghPX53SVstuQZsd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzsE06okQmKKUdyhql4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgztYfzCWNpv7lLAAZ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyNx-lF9LIoC5j-nxd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyC7aJYZIU6gA1UWKd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzta-CYkgF8m820Y614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyJ8RslCZx-7UCpKoZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6tAhKgja1KRitBYV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
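A batch like the one above can be parsed and indexed by comment ID for lookup, with basic schema validation. The sketch below is a minimal illustration in Python; the allowed category sets are inferred only from the values visible on this page (an assumption — the actual codebook may define more):

```python
import json

# Allowed categories per coding dimension, inferred from values seen in
# this dashboard. This is an assumption, not the full codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "developer", "company",
                       "government", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "mixed", "indifference", "outrage"},
}

def index_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index records by comment ID,
    dropping any record whose value falls outside the known categories."""
    indexed = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            indexed[rec["id"]] = rec
    return indexed

# One record from the batch shown above.
raw = ('[{"id":"ytc_UgxdXYJofwc2hQQIPFF4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = index_batch(raw)
print(coded["ytc_UgxdXYJofwc2hQQIPFF4AaABAg"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view cheap; the validation step guards against the model emitting a category outside the codebook.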