Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by their comment ID, or picked from the random sample below.

Random samples
- "Thanks for this video! I was streaming a video game and had to shut down the use…" — `ytc_Ugywt0Hzp…`
- "Even Artificial Intelligence is predicated upon the established doctrine of ABSO…" — `ytc_UgymivHPU…`
- "*A.I. is just an excuse to eliminate jobs... So that larger dividends can be pai…" — `ytc_UgzQbJJI3…`
- "AI is only scaring people who are talented. Just imagine someone not as creativ…" — `ytc_Ugx9mHv9j…`
- "We need to drop ai for a few decades and focus on our humanity and arts…" — `ytc_UgyG8bIvE…`
- "What i dont get it people going out of their way to hate on ai art people arent …" — `ytc_UgwSbtvQU…`
- "Can we call ai artist ai generators/ ai prompters since they don’t create they g…" — `ytc_UgyeTnG6M…`
- "If only ShitGPT was as willing to oppose nonsense in general! It has clearly bee…" — `ytc_Ugygb0r-G…`
Comment

> With all this talk of AI evolution, I think one of the most important things to note is that if the top 1% gets to determine what universal basic income would look like for the entire world population, then we shouldn't buy into their illusion of progress. Imagine a world devoid of purposeful human work and the empathic touch of humanity. What would our children aim to become? Where would the drive to constantly learn go? And if this basic income is to be determined, what would life look like socioeconomically? In my early assessment, we're already on track now as we're being primed for a subscription-driven society. Life would turn into a cycle of subscriptions, and that UBI would only be enough to just make it.

Source: youtube · AI Governance · 2025-12-08T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxdZ6obicZ679rFsZl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx5VZM7vqsOyGrh0YN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5tNEuirSug106Ri14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyb9DF8UaM5EkaJxRR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyMR2qraTs8HKf_nLl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzq8l3DB_gE7HBtbXh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKMEYgyPj66nxs_eJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxfU8Ciu6YYPft9vMZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwgOGAna6C4gApUHth4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGc1xt39XvvtPXYnl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
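A response like the one above can be turned into a lookup table keyed by comment ID with a few lines of Python. The sketch below is illustrative: the four dimension names come from the response itself, but the sets of allowed codes are inferred only from the values visible on this page — the actual codebook may define more.

```python
import json

# Allowed codes per dimension, inferred from the sample responses shown above.
# Assumption: the real codebook may contain additional values.
CODEBOOK = {
    "responsibility": {"government", "company", "developer", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting rows with unknown codes."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row is missing a comment id: {row!r}")
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {row.get(dim)!r} for {dim!r}")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Hypothetical one-row response, mirroring the format above.
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytc_example"]["policy"])  # regulate
```

Validating against the codebook at parse time surfaces hallucinated or misspelled codes immediately, rather than letting them leak into downstream tallies.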