Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples

- `rdc_m96j2ic` — Pardon me, but why would giving a super-level AI to everyone be a good move? Do …
- `ytc_UgyzqGjeL…` — There have been movies involving AI where they got real better and smarter. Term…
- `ytc_UgzAe_HIB…` — Omfg I got a google ad right as they revealed Magellan was owned by them. Once…
- `ytc_UgxIbVfqQ…` — robot: what is my purpose? me: you are cum dumpster. -sex- robot: oh my god…
- `ytc_UgyRfInTg…` — When you B@××DY Gentleman, had the chance to make a change, in your bloody natio…
- `ytc_UgzqfTgDB…` — The expressions and it thinking seriously about the consequences just sent chill…
- `ytc_UgxuoN2DG…` — Ai is an extreme echo chamber and that, is the central tool they use,...it becom…
- `ytc_UgymLzpLV…` — If Black folks do the same, make AI videos highlighting the racism, the injustic…
Comment

Source: youtube · Posted: 2025-11-06T21:4…

In the hands of large corporations, there is no good outcome of the adoption of AI for regular people. Either it fails and the US economy crashes, likely triggering an event worse than 2008. Or it works and everyone loses their jobs. Nice! That is not even mentioning the immense environmental impacts of this technology. These AI companies are putting their weight behind rightwing forces in politics that almost certainly will not expand the social safety to help workers displaced by AI. The logical conclusion this massive worker-displacing technology, and no restructuring of society around it, is millions of people losing their jobs and ending up on the street. The Luddites were right.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxAts-yKFvpo8F8z654AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyRA0XxXbGPdHa2g614AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVaolnlMt1e674R9t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwrhYu6Y0Mx8goLdRR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyyEALhciIAHGjrDR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwf_O_Wd78KtcTrqmB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwV4ndCnvOIj5-m6WB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgynwGDm8xLC2Aqu6Qh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzjhELTCs_KvkOvD0x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxfoaIICVG7Vhp46fR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
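The raw response is a JSON array of coding records, one per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming only that structure (the helper name and the sample payload below are illustrative, not part of the tool):

```python
import json

# Illustrative payload with the same shape as the raw LLM response above:
# an array of records, each carrying the four coded dimensions.
raw_response = (
    '[{"id": "ytc_example1", "responsibility": "company", '
    '"reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}]'
)

def lookup_by_id(raw: str, comment_id: str):
    """Return the coding record for comment_id, or None if it is absent."""
    records = json.loads(raw)
    # Scan the array for the first record whose id matches.
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_by_id(raw_response, "ytc_example1")
print(record["emotion"])  # -> outrage
```

A missing ID simply yields `None`, which is why the inspector can distinguish "not yet coded" from a coded record.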