Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AGI and robotics like Tesla’s Optimus will eliminate huge portions of the workforce, starting with white-collar jobs. Anything tied to databases, procurement, logistics, finance, legal, or insurance will be fully managed by AI. Competition between nations like the U.S. and China guarantees rapid development, so there’s no slowing it down. Some argue “just shut it off,” but with multiple independent AGI systems worldwide, many won’t be turned off and some will be used maliciously. Blue-collar jobs will last slightly longer since robotics lags behind AGI by 5–10 years, but within a decade they’ll be gone too—leaving very little traditional work for people.
youtube
AI Governance
2025-09-04T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzG8qxBH8Jn1J5vjal4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxL7wjCnPov_0F2e_x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzYRJbsg_XI7_975s14AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzG1hGPP6vTfWohphJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzxSidap1D5PSLQXfx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxA8KsdJhJbLgpyjn54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXFf67CpKwRc58r8V4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgylJ8UaXTpp7H4Ef7V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwcgEHAq3RlGR3bMIB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyjSfFzUCwSxsxCdqF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
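For downstream processing, a raw batch response like the one above can be parsed and indexed by comment ID so that any one comment's codes can be looked up. A minimal sketch in Python (the variable names are illustrative, and the two entries are copied from the batch above; real responses would be loaded from the coding tool's storage):

```python
import json

# A raw LLM response for one coding batch: a JSON array with one
# object per comment, keyed by the comment's ID.
raw_response = """[
  {"id": "ytc_UgzG8qxBH8Jn1J5vjal4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxL7wjCnPov_0F2e_x4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Build a lookup table from comment ID to its coded dimensions.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the codes for a single comment by its ID.
row = codes_by_id["ytc_UgxL7wjCnPov_0F2e_x4AaABAg"]
print(row["emotion"])  # fear
```

Indexing by ID rather than list position keeps the lookup stable even if the model returns the batch in a different order than it was submitted.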