Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- ytr_UgwOoqVqi… : "@Awes0m3n3s5I mean but it does argue the same point as the original comment, and…"
- ytc_UgwpmtEvn… : "Do people really think there's going to be a way to regulate ai? Dude, this is P…"
- ytc_UgyyU27In… : "They also forgot the end user like me at 62 who have AI on my computer but find …"
- ytc_UgyUvUv1F… : "AI will be misused to control people. It will be controlled by one person who wi…"
- ytc_UgxUl7KpI… : "Ai (that will become AGI) is basically telling you that we'll be taken over and …"
- ytc_Ugyg9mR6b… : "As someone who has worked in the Information Technology profession since 1977 I …"
- ytc_Ugzh8hrN7… : "AI is nothing more than a scare tactic used by the elite globalist to frighten a…"
- ytc_Ugwll2VSt… : "We want perfect autonomous cars and medical technologies but these Idiots build …"
Comment

> Lets say we have about 2 billion able-bodied adults on Earth. It will take a vast amount of resources and energy to produce robots and AI that can replace all the jobs those people are doing. And then you have all those people who still need to eat three times a day and all the other resources and energy they need but they're no longer doing work. It seems extremely wasteful and inefficient to replace humans with robots and not have work for those humans to do. I think the future may have AI and robots doing a lot of jobs, but humans will still need to produce something of value otherwise they just become "useless eaters" and the globalists will eliminate them (which i think is exactly whats happening)

youtube · AI Jobs · 2025-10-08T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy74pt1CThEUtxKvB54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy9BFVqjHdC3fMwJR94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzZVyqlup-7lmqpveR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzC037CI0lCfgeKZZ54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzi49e2KR_7fIUK_vR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxgPo-JGNZOqw2LPm14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy7q7tr9dcrjvPcIg14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz-Z676tf9qO-PqKfR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxTYW8OdfHKq5t0dVN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxTWgoY6v80byLoZT14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
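The raw response is a JSON array with one object per comment, keyed by comment `id` and carrying the four coding dimensions. Looking a coding up by comment ID therefore reduces to parsing the array and indexing it. A minimal sketch in Python, using two entries from the response above; the completeness check is an illustrative assumption, not the app's actual validation logic:

```python
import json

# Two entries copied from the raw LLM response shown above.
raw = '''
[
  {"id": "ytc_Ugy74pt1CThEUtxKvB54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxgPo-JGNZOqw2LPm14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

# The fields present in every object of the response above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    skipping any entry that lacks one of the expected fields."""
    rows = json.loads(payload)
    return {row["id"]: row for row in rows if EXPECTED_KEYS <= row.keys()}

codings = index_by_id(raw)
print(codings["ytc_UgxgPo-JGNZOqw2LPm14AaABAg"]["emotion"])  # -> outrage
```

Indexing once and doing dictionary lookups keeps each "Look up by comment ID" query O(1) instead of rescanning the array.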