Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- “It continuously blows my mind that a government would ever allow “autonomous” ca…” (ytc_UgxU5Yx9m…)
- “Chatgpt is very useful. It does have a friendly quality to it and it remembers e…” (ytc_UgxXEQ6fC…)
- “5% are the HR implementations that don’t need to be right just make a choice.…” (rdc_n9h6av1)
- “AI is gonna win. While these people foolishly strike, AI is creating content!!! …” (ytc_UgzSeeKs8…)
- “Should Autonomus driving vehicles be required to have a light to infrom other dr…” (ytc_UgyZyjIKM…)
- “AI will see us as a waste. We use too much resources to feed ourselves, pollute …” (ytc_UgwFvUiIf…)
- “elon musk invents the ANTICHRIST AI ROBOTS ALL STATED IN THE BIBLE OOPSY always …” (ytc_Ugzg58pns…)
- “Wow, it seems like the dialog between the presenter and the AI robot left you sp…” (ytr_UgzWYmoKH…)
Comment
Here's my two cents. Achieving AGI is like mass reaching the speed of light, you'll keep adding nines for infinity until you figure out how to make a warp drive.
As close as we get to AI broadly replacing humans across the workforce, until we have AGI then some aspects of every market will always require humans.
If anything, the demand for technical professionals will skyrocket as AI continues to improve; I think this is extremely good for society because more people will have the opportunity to specialize for a high paying job.
Meanwhile, AGI would either exterminate homo sapiens sapiens, functionally make our species go extinct by evolving us into something like 'homo deus' using nano-MEMS and/or keep some/us as specimens in a "zoo."
youtube · AI Jobs · 2025-10-08T03:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxqhHhvLIrij4QjC7x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVmmTkPXui6HZNYpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz_UMewRrLLCYhXBWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3WGauMU8jShADdad4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyeaFed62-uZhaVqbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxun4k30lkDtHQB8n14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy7RF5T8s_iDk8oZV54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz3ZjgnSvlvKnqT2DF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyD1Ae4TrJ0tyTFlih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz-DaSCpEipLa9v5EN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
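The raw response is a JSON array with one object per comment, keyed by comment ID across four coding dimensions. A minimal sketch of how such a payload could be parsed and indexed for per-comment lookup (the field names are taken from the sample above; the real pipeline's parsing and validation may differ):

```python
import json

# A raw LLM response: a JSON array of per-comment codes.
# One entry copied from the sample payload above.
raw = """[
  {"id": "ytc_Ugz-DaSCpEipLa9v5EN4AaABAg",
   "responsibility": "none",
   "reasoning": "mixed",
   "policy": "none",
   "emotion": "indifference"}
]"""

# Index the rows by comment ID so a single comment's codes
# can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_Ugz-DaSCpEipLa9v5EN4AaABAg"]
print(row["emotion"])  # indifference
```

Indexing by ID also makes it easy to cross-check the LLM's output: any ID missing from the dictionary, or any value outside the expected label set for a dimension, can be flagged for re-coding.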