Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- ytc_UgzMgsQ9s…: "AI is a threat, thousands of jobs are automated before AI, imagine it after AI i…"
- ytc_UgwKmI3l8…: "Yeah there were alot of red flags but you can’t really blame the artist. She was…"
- ytc_UgzZWm3g8…: "YES as Krystal Two Bulls points out this DOES indeed affect our none human relat…"
- rdc_gliiks5: ">it's the owner of the automation, the land and resources that will benefit g…"
- ytc_UgyxJMbdP…: "the problem is the people using the AI, not the AI itself. AI is just a tool…"
- ytr_Ugw9tOl5D…: "dude there's also a good use for ai if ai didn't exist we would only have to ope…"
- ytc_UgytJwNxc…: "Also, once the AI system was programed to lose points for killing its operator, …"
- ytr_UgywoO4KD…: "Haha, that would be quite a sight to see! Who knows what the future holds with A…"
Comment
Jobs will shift increasingly to service sector in protectionist roles (i.e. protecting existing wealth roles, like law and finance). What these people always forget, is that in an adversarial context it is not AI vs human; it is AI + human, vs AI + human. Companies will still pay a premium for the human differentiation factor in non-commodified sectors. It becomes an arms race of lackeys.
Until you solve the labour-welfare dichotomy (with a UBI), the inequality will worsen, and more people will be forced to compete for goon-like labour, at the behest of capital.
youtube
AI Governance
2026-01-27T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzR2ZOdWfJQ5rlyf8l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw9PCispgH71hkMJ1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylzeksQIF6ejtwn3J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwKwFILmBYPFaS9TIp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyGKoZrCCq77Wtq7bB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyX2ocjv3xL9HpGc3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwQ5-xJ7Pkb29sacld4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHyl_rm_6Hjc3oXa14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyyzNaGW7x3LIE8KrF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_c6u1DWZtgQXkdW54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
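The raw response above is a JSON array of records, one per comment, each carrying the four coded dimensions shown in the result table. A minimal sketch of parsing such a response and indexing it by comment ID (the "Look up by comment ID" feature) might look like the following. The vocabularies are inferred only from the values visible in this sample output; the real codebook may define more categories, and the function name is hypothetical.

```python
import json

# Allowed values inferred from the sample output above; the actual
# codebook may include additional categories (assumption).
VOCAB = {
    "responsibility": {"developer", "ai_itself", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "industry_self", "ban", "regulate", "liability"},
    "emotion": {"outrage", "resignation", "approval", "fear", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the (assumed) codebook vocabulary.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in VOCAB.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
        # Keep only the four coded dimensions, keyed by comment ID.
        by_id[rec["id"]] = {dim: rec[dim] for dim in VOCAB}
    return by_id
```

With this sketch, `parse_coding_response(raw)["ytc_UgzR2ZOdWfJQ5rlyf8l4AaABAg"]` would return that comment's four coded dimensions, and a malformed or out-of-vocabulary record fails loudly instead of being silently stored.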