Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I commented on this case when another youtuber covered it, the following day my …" (`ytc_Ugxc1GNh9…`)
- "Sometimes I wonder if one day the top 1% really do automate everything and AI ta…" (`ytc_UgwsVA-aj…`)
- "Except AI is producing garbage. And the only people who like it are CEO's who li…" (`ytc_UgwHwUUX7…`)
- "The speed of AI development is slower than our own nuclear programs in the 40s. …" (`ytc_Ugz4_SVCh…`)
- "That’s the major glitch with facial recognition, it has bias, it was after all d…" (`ytc_Ugy71QbdZ…`)
- "There is an AI being developed that basically scans cases to help a lawyer find …" (`ytc_Ugz3DAITw…`)
- "Ai kinda sucks it is useful but there should be a FINE LINE where ro stop using …" (`ytc_Ugx9zka9e…`)
- "In the end, whether or not people regard AI as a threat seems to come down to differing forecasts of just how smart AI will become in the near term. Those who see it as a threat believe it will get smart enough to…" (translated from Japanese) (`ytc_UgwtdjIuS…`)
Comment
@Marksman3434
Once all human labor is automated, people will still need a way to live — and there are a few main ideas about how that might work.
- Universal Basic Income (UBI): Everyone gets a guaranteed income funded by taxes on automated industries or AI profits, so survival isn’t tied to having a job.
- Shared Ownership: Citizens could collectively own the machines or receive “AI dividends” from automation-generated wealth.
- Post-Scarcity Society: If automation makes goods nearly free, money itself could lose importance, and people would focus on creativity, learning, and relationships instead of work.
- Most Likely Future: A mix — basic needs covered by UBI or shared wealth, while humans work by choice, not necessity.
Platform: youtube · Topic: AI Jobs · Posted: 2025-11-04T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxOvFNuY6lGqHU08zl4AaABAg.AP4ZzzTmHIpAP6SjIyb_bo","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwQvO7SYh-fFajMn9t4AaABAg.AP4Lwx5UKVKAPCMISwuEPh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwQvO7SYh-fFajMn9t4AaABAg.AP4Lwx5UKVKAPJqFYD9YXS","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugz4ytPanINm1Ypu-6V4AaABAg.AP4FdhKv4KpAP4ptrUG9MO","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxBwe_lsstzL9qUzjx4AaABAg.AP4AcDstxvGAP8AGgUs9Av","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxZBfMvb1KbpkB9J5R4AaABAg.AP46V9AyyAlAP5wKWn9KTZ","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgyQzrWZpmaoPkvcIh14AaABAg.AP3PkV3meiwAP617ZTelIc","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyQzrWZpmaoPkvcIh14AaABAg.AP3PkV3meiwAP74vLQujej","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytr_Ugz8mXUpGtLui_hmVQF4AaABAg.AP2xN6szaL5AP3_PwQLRJ1","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzuGrGLDda2zmsHs6p4AaABAg.AP2Tf-61Ec8AP4q5wRpwjo","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"}
]
```
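Before coded records like these are stored, the raw model output needs to be parsed and sanity-checked. The sketch below (Python, assumed; the allowed value sets per dimension are inferred from the samples on this page and may not match the full codebook) validates a raw LLM response, dropping records with a malformed ID or an unknown code:

```python
import json

# Allowed code values per dimension, inferred from the samples shown above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Reply IDs in this dataset appear to use the "ytr_" prefix.
        if not str(rec.get("id", "")).startswith("ytr_"):
            continue
        # Every dimension must be present and hold a known code value.
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

sample = ('[{"id":"ytr_example","responsibility":"distributed",'
          '"reasoning":"consequentialist","policy":"liability",'
          '"emotion":"mixed"}]')
print(len(validate_coding(sample)))  # 1 valid record
```

Records that fail validation are silently dropped here; in practice you would likely log them and re-prompt the model for the affected comment IDs.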