Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
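Looking a record up by comment ID can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation; `lookup_coding` is a hypothetical helper, and it assumes each raw response is a JSON array of coding objects that carry an `id` field, as in the sample response further down this page.

```python
import json

def lookup_coding(raw_response: str, comment_id: str):
    """Return the coding record for comment_id, or None if absent.

    Assumes raw_response is a JSON array of objects, each with an
    "id" field plus the coded dimensions.
    """
    records = json.loads(raw_response)
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None
```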
Random samples:
- "This is not the first time Amazon has done this. Amazon executives like to think…" (ytc_Ugw6b8_YH…)
- "@andordimeny6130 your acting like ai is to the scale of the fucking fossil fuels…" (ytr_Ugzo0U-dE…)
- "Why would you fight a robot ? Has everybody in this world gone mad ?…" (ytc_UgxR1ifcm…)
- "and then in school these babies will be likely bullied for having the mom be a r…" (ytc_Ugzc_U9Lg…)
- "This is literally the first time I have ever heard of this. You ARE projecting. …" (ytr_UgzgOhYPc…)
- "The video was cut short after the robot turned to gun on the human and open fire…" (ytc_UgyI7SWYn…)
- "One thing I don't like about the silicone skin, skin is naturally shinny. Silico…" (ytc_UgyC-9hvZ…)
- "Look, the real fun starts when nobody knows how to fix a code generated by AI…" (ytc_UgyzoYXDu…)
Comment

> Three questions: (1) Will wealth from AI automation all go to just a few billionaires who pay very little tax? (2) If that happens how will governments have enough tax income to function? And (3) If no one has jobs who will buy the products these AI automations produce? The only way I see civilisation surviving this is if governments tax AI automation heavily and use the funds to provide everyone in a country a universal basic income. But what if they don't?! The other way this could go is massive inequality, starvation, and conflict due to civil uprising.

youtube · AI Jobs · 2026-02-28T20:4… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgydUG-YtP0lPYCKBIJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzsLC2M7jGD_1KMtzZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxFkgR-TTPQIdkJ9fB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzudQ5WKaEtbvptGN14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx9W-SOSv4DymqjGr54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzhTwDGd_-WIB9DP214AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzXD3i-n1hFyc6rVd94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNs7Id4NbASNDNXv94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgzYyFf_bXYHah9NFaV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxgqlr8kO5O3r4YzGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
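Because the model output is free-form text, it is worth validating parsed records against the coding scheme before accepting them. The sketch below is one way to do that; the category sets are assumptions inferred from values seen in the raw responses on this page, not the project's official codebook, so adjust them to match your scheme.

```python
# Allowed values per dimension, inferred from responses seen above
# (an assumption -- replace with the actual codebook categories).
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside the scheme."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems
```

Records flagged here can be routed back for re-coding rather than silently stored with out-of-scheme labels.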