Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
A far more fundamental rethink & redesign of society's political & economic syst…
ytc_UgxMfenEl…
They are stealing from our humanity right now! We’re all being used to train AI.…
ytc_UgzFFo_QA…
If it were to become conscious won't it try to suggest or make changes? It woul…
ytc_UgyFYhLKE…
Coding isn't about labor
You need IDEAS, the humans wont be needed is Bullshit …
ytr_Ugw2r-qD5…
animal rights first. when we can treat our cousins, the other animals on earth, …
ytc_UgglJFam1…
Dependencies cause more security problems than writing it from scratch, and AI w…
ytc_UgxeykeHq…
No, they aren’t as low IQ as you think. They will pay the AI digital money now w…
ytr_Ugx21oZR1…
In the Warhammer 40k universe. AI means Abominable Inteligence xD (Basically in …
ytc_UgzCs9-SP…
Comment
Knowing human nature, why will those who control the robots want to build them for humans who will have no purpose but to consume? It would be easier and simpler not to build robot to serve 8 billion emotional, trouble making humans. When they can build enough robots to take care of all their needs for the controllers and their families? And if robots can work for free, why would we need money? Control of the robots, not money will shape the future.
And don't think you can just shut the robots down. With AGI they can think a thousand times faster than you and shut you off before you can shut them off.
youtube · AI Jobs · 2026-02-20T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwI92fA5SrALfYyUTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugww1XojyRhKvC3Mb9V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwaQIM07y-f0XzXKrF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugza7wiYpqlfkA-0Spt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx3sJk52IXv2Nfq6tZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzB7oL20f-iaPfpjuZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3vFXRFmBDun4WoYp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyWEuaWNG9OM4yVnJV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugylaj5EUoeaK57R-xl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwBaPwoI7hytk_sb1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
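The raw response above is a plain JSON array, one object per coded comment, keyed by the comment ID. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_by_id` and the two-row sample are illustrative, not part of the tool:

```python
import json

# Two rows copied from the raw response above; a real response would
# contain one object per comment in the coding batch.
raw_response = """
[
 {"id":"ytc_UgwI92fA5SrALfYyUTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugz3vFXRFmBDun4WoYp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
row = codes["ytc_Ugz3vFXRFmBDun4WoYp4AaABAg"]
print(row["emotion"])    # fear
print(row["reasoning"])  # mixed
```

With the rows indexed this way, the per-comment display (the Coding Result table above) is a direct dictionary lookup on the comment ID.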