Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Is it just me or did I just get a AI chat ad when I scrolled down?!" (ytc_Ugwitcruq…)
- "Of those 80% of front line inquiries that the chat bot is solving, what’s the cu…" (ytc_UgxERxFN2…)
- "There is a simple solution. DONT programm robots with feelings! Feelings make ev…" (ytc_UgxaMVNny…)
- "so basically the problem isn't the AI, it's how the AI is used, if it was only p…" (ytc_UgzfpzXJ0…)
- "If the AI more clever and useful than a human being it is indeed an existential …" (ytc_Ugw0oJeHv…)
- "That's nice but this will never happen because: 1. AI is not that good. In my wo…" (ytc_Ugw2MX0Xq…)
- "I'm so tired of hearing that "automation worries have existed for centuries, and…" (ytc_Ugyk8gi3w…)
- "change of "circulam"? Do you mean: curriculum? YouTube has autocorrect. I think …" (ytr_UgzRYrPcR…)
Comment
Its going to be hard because people in power dont like to lose it. To coexist with ai we will have to be one planet. No boarders just math for the 1st level of heriarchy of needs. We have to begin the shift to a post scarcity society now. It will take generations. Otherwise by 2027 the grid and processing power will be in place for the other inevitable. Techno serfdom. Powers that be might not even want it but it's the mathematical inevitable. Ai will.push capitalism to near perfect efficiency. Extracting exactly as much as it can from the working class but perfectly not enough for us to revolt. Its not nefarious or tinfoil hat. There doesn't need to be this intent. Its just math.
Source: youtube · AI Moral Status · 2026-03-06T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
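A row like the one above can come out "unclear" in every dimension when the selected comment's ID is absent from the parsed batch, or when a dimension key is missing from its record. The tool's actual logic is not shown here; the following is a minimal sketch of that fallback, where `result_row` and `DIMENSIONS` are hypothetical names:

```python
# The four coding dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def result_row(comment_id: str, coded: dict[str, dict]) -> dict[str, str]:
    """Build a per-comment result row, defaulting to "unclear".

    `coded` maps comment IDs to their coding dicts. A comment that is
    missing from the batch, or a dimension missing from its record,
    is displayed as "unclear" -- one way every dimension can end up
    "unclear" at once, as in the table above.
    """
    codes = coded.get(comment_id, {})
    return {dim: codes.get(dim, "unclear") for dim in DIMENSIONS}
```

Defaulting rather than raising keeps the viewer usable even when the model skipped a comment in its batch response.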
Raw LLM Response
[{"id":"ytc_UgyrueazSjtzj24XgAh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMJ6OBlMoi8MxYNrp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzUTmxH6NJuLvqg18p4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxCFCH_d3jsrYWaUg94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxYECh03p6M-WAIyul4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxGc3a0Fh99vb9WdoR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxCTdlFgzTTfRmyPC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQUxk9iDerVpXbEuF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyaBhmbdFIoBm8YkzF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjgRSk9s8pd4LpXut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"})
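Raw responses like this one are not always valid JSON: the array above opens with `[` but closes with a stray `)`. A defensive parser, sketched here under the assumption that the model was asked for a JSON array of per-comment objects (the function name is illustrative, not the tool's actual API), can repair that common trailing-delimiter slip before decoding:

```python
import json

def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Assumes the model was prompted for a JSON array of objects, each
    carrying an "id" plus the coding dimensions. Models occasionally
    close the array with ")" instead of "]", so that one slip is
    repaired before json.loads; anything else still raises.
    """
    text = raw.strip()
    if text.startswith("[") and text.endswith(")"):
        text = text[:-1] + "]"
    records = json.loads(text)
    # Index by comment ID, dropping the "id" key from each record.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}
```

Keeping the repair this narrow preserves the page's promise to show the exact model output: the raw string is displayed untouched, and only the parsed copy is normalized.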