Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples:

- We got to be careful about how we approach this though. Used to work in a call c… (ytc_Ugz1LA1zR…)
- The thing about ideas being cheap, I see some people use the AI to come up with … (ytc_UgwDw_7nv…)
- Yeah, just imagine being one of those coal miners Biden told to "learn to code",… (ytr_UgwBuCgjW…)
- "That's it? That's all we need to do to ruin AI art? Just putting your art in a … (ytc_UgzEuGZ-O…)
- The techniques will not improve because the current generation is happy with ai … (ytr_UgxcAiptT…)
- 13:37 I think it would be funny if it were a different person every time, but th… (ytc_UgzYojPDc…)
- people, probably a good idea to write a letter to whichever AI you use and ask f… (ytc_Ugya988w_…)
- So I gotta make the drink 🍺 put in in the robot go to a whole different room too… (ytc_UgwjCjClq…)
Comment

> The problem with the universal basic income is that governments will have an ever-declining tax base as people lose their jobs to AI. So the only way the unimaginably large sums could be conjured up is by making the tech businesses - and other large corporates benefiting from AI - pay for the UBI. Probably across every jurisdiction in the world. A lovely idea in principle, but I think it's far more likely we descend into dystopian chaos with poverty and social unrest.

Source: youtube · AI Jobs · 2025-07-14T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhtEqgXg2xt48QYbN4AaABAg","responsibility":"elites","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxrgu0B0YpoqeQew394AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgynT0KGegwJskuOhXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw3GanHLiSasCA5CI14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwi0i0cABvoKUsX3EF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxYL_YWVzjkV6ElmYd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyFW-2C-Yj31qa-cOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzgnkQor0fgh9PFWOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0-o-ES19kRvl6ryR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKSn_MawZIug3ylQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
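The look-up-by-ID flow above can be reproduced directly from a raw response like this one. A minimal sketch, assuming the model returns the JSON array verbatim (the two rows below are copied from the batch above; the variable names are illustrative, not part of any tool):

```python
import json

# Raw model output for one batch: a JSON array of objects,
# one coded comment per object, keyed by comment ID.
raw_response = """
[
  {"id": "ytc_Ugxrgu0B0YpoqeQew394AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgwhtEqgXg2xt48QYbN4AaABAg",
   "responsibility": "elites", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

# Index the parsed rows by comment ID so a single coding can be
# retrieved without scanning the whole batch.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugxrgu0B0YpoqeQew394AaABAg"]
print(coding["responsibility"], coding["policy"])  # government liability
```

In practice the model output may arrive with surrounding prose or malformed rows, so a real pipeline would wrap `json.loads` in error handling and validate that each row carries all four dimensions before indexing it.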