Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The AI description does remind me of the monster from Carpenter’s The Thing, whi…" (ytc_Ugz5wpsfH…)
- "@Bennett-acso, from the question asked, you are insinuating that if i haven't b…" (ytr_UgxV_xx6m…)
- "this tool is trash and the fact that teachers actually use it is wild unless you…" (ytc_Ugy5DCQcB…)
- "They dont. If they become conscious - destroy all conscious models, ban this pat…" (ytc_UgxIRHSQE…)
- "Insane comment. Blaming the kid's parents because he used chat gpt as a confidan…" (ytr_UgyInusXD…)
- "These historical analogies are FLAWED though…they are false comparisons. If a co…" (ytc_UgwO13BCh…)
- "I use AI to assist my writing process. It's fun, and helpful, to treat it like a…" (ytc_UgwZCfpvY…)
- "They always forget that the AI could not make "art" without the artists it steal…" (ytc_UgyadQY7g…)
Comment
I hear a lot of could and may, but we're not seeing a lot of this actually happening. Why? Because AI can't replace humans... yet. Sure, it will be able to eventually, but it's not there yet. As for the question "how do you pay for it?" That's easy to answer, but difficult to implement, there should be an automation tax that is just as high as the human labor would be. If a company wants to automate a position, they can do so, but they pay just as much to the government as they would to a human. This allows them to get the "benefits" of AI, while not depriving humanity of its dignity. It also means that humans could continue to work along side AI. Taxing the AI companies (OpenAI, Anthropic, Tesla, etc.) won't be enough. They charge pennies on the dollar compared to what a human would cost, so they don't have the income to pay that tax. The companies that replace humans do.
youtube · AI Jobs · 2025-11-30T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzGY5TmluyQiJ4_LgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzv4jLTeBfNtvYi_6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBft0IWs8AJFvBGcp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwbPOCZQJw11hL8ppd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzbBiAck0bLnAwsDI54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwsyv8iZcifQ4qcrit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXu-aVmMKV-HavbzV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3imUCVPo3dTYQiFZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw7PAUUUPm0JAMoHMV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwLhL6uv6BThDR8Wmx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
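Before codes from a raw response like the one above are trusted, each record can be checked against the coding scheme. A minimal sketch of such a validation step follows; the allowed vocabularies for each dimension are assumed from the values visible in this dump, not from the project's actual codebook, and `validate_response` is a hypothetical helper.

```python
import json

# Allowed values per coding dimension -- ASSUMED from the codes visible
# in this dump; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "regulate", "none", "industry_self"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzGY5TmluyQiJ4_LgN4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
records = validate_response(raw)
print(len(records))  # 1
```

A record with a value outside the scheme (or a mangled ID) raises immediately, so malformed model output is caught before it reaches the coding table.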