Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "My talkie and poly. Ai chats are going to make me get put under government most …" (ytc_UgxxWW4Vb…)
- "I'm doing gardening as Job while also learning about LLMs in my free time and cr…" (ytr_Ugzp_OPF3…)
- "Term AI is misleading. It is human intelligence formalized in hardware and softw…" (ytc_UgyIojL9x…)
- "Will AI be able to eliminate all nuclear weapons? Will it be able to solve all t…" (ytc_Ugyvt17mU…)
- "What if the false pretense had something to do with the support they'd be given …" (rdc_cjooja7)
- "As usual mankind is not in control of anything. Not sure how AI is ever going t…" (ytc_Ugw2-DBZD…)
- "I wanna say that the person and people behind stable diffusion also made a music…" (ytc_UgzzKDiiO…)
- "Imagine it's 1940, and we hand an AI every scientific paper humanity has produce…" (ytc_UgzXQmq86…)
Comment

> The surest sign of how problematic this AI takeover is going to be is that NOBODY (and that's true of this YouTube) is talking about what billions of people with no jobs are going to do to live. I mean, where is the conversation about how we pay people *not* to work? Are we actually going back to slavery? Inventing breaking rocks to keep people busy? This is so fundamental a flaw that it simply floors me that we aren't demanding answers to this question. Then factor in that the so-called "godfathers" (aka inventors) of AI have all declared it a potential species-ender,--their actual phrase---and it should be clear why I and millions who are informed about this issue remain profoundly alarmed at the complete lack of urgency about the dangers of AI. We're on a ledge, and not enough people are talking us down.

youtube · AI Jobs · 2025-11-01T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
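A coding like the one above can be sanity-checked against the codebook's controlled vocabularies. The allowed-value sets below are only inferred from the labels visible on this page (the actual codebook may define more), so treat this as a minimal sketch:

```python
# Controlled vocabularies inferred from the values visible in this dump;
# the real codebook (not shown here) may include additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage", "confusion"},
}

def validate(row: dict) -> list[str]:
    """Return the dimensions whose value falls outside the known vocabulary."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding shown in the table above passes cleanly.
print(validate({"responsibility": "distributed", "reasoning": "consequentialist",
                "policy": "liability", "emotion": "outrage"}))  # []
```

Flagging out-of-vocabulary values rather than raising lets a batch job log bad rows and keep going.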
Raw LLM Response
[
{"id":"ytc_UgzOwn2b4TvE7I5w94t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzsys-7o__T_KJVmWl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzkUadC_cT2IHBV5-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQQmWk0lh2ZhsR9Yp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlTJdIwGZXJojuohJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz8QC-iBASoko5AgC54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzFoxlhGBoUhVQ7n954AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzfsc0GqMKnBtpoPrJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweZ4S1M8XyCK1qDgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyG-cGOIpK8eSOcHXN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"confusion"}
]
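The batch response above is plain JSON, so the by-ID lookup this page offers reduces to indexing the array once. A minimal sketch, using two rows copied from the response above:

```python
import json

# Two rows from the raw batch response shown above, in the same shape.
raw_response = '''[
  {"id":"ytc_UgzOwn2b4TvE7I5w94t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8QC-iBASoko5AgC54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

record = coded["ytc_Ugz8QC-iBASoko5AgC54AaABAg"]
print(record["emotion"])  # outrage
```

This is the row whose coding is rendered in the table above; the dict-by-ID shape also makes it easy to join the LLM codes back onto the original comment texts.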