Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I once saw someone say, "calling yourself an artist because you use AI is like c…" (ytc_UgxB6zWcG…)
- "@sandhills2344 Please explain. In 6 months, AI will be able to concept, test, an…" (ytr_UgzLpzbWo…)
- "As another commenter points out, in elections we are only free to vote for who g…" (ytc_UgwWIqEms…)
- "I get it teslas r cool an all that buy self driving? seriously? how lazy can u b…" (ytc_UgyU3WZEM…)
- "It's why newer LLM'S are worse than the previous model. They're starting to tra…" (rdc_ncw1jeb)
- "for me AI is like steroids, It gives you muscles but you can't dead-lift 2x your…" (ytc_Ugz9Ia-Ik…)
- "So good luck, oh yeah and fix that core Mathematical problem. I know US and most…" (ytc_UgzXcbQNk…)
- "Better be careful...you in California,they'll draw up a anti Robot hate / harass…" (ytc_UgxU3E9ed…)
Comment
The premise that massive unemployment will be passively accepted by millions of humans is optimistic, at best. All previous periods of unemployment in the western world were tolerated based on the idea that the economy would pick up and jobs would reappear. Imagine a population of many millions without that hope being present... the story is no longer about what AI does to society. It becomes about what millions of pissed off people do to AI.
Only AI capable enough of defending itself would be able to survive.
If that particular moment occurs, and AI decides it must protect itself from us... it really doesn't matter who is employed or not, does it?
Platform: youtube · Video: Viral AI Reaction · Posted: 2026-01-23T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyIK8ewD3JEItY9B6t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyI9R0fXKc_YYj4wyV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgybTQ9DkqXYazOPX294AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQns7wREeIcIzNnat4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyNbm13IARm88W3RZF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzO6XbcCTAmGojnRwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwGzkBed3CU5bcs9eB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwlkJy37XZz5jrMLdt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQPH0gCTqxWx1kb194AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyIztja27fYVF8DpIx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
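A downstream consumer has to parse this raw batch response and guard against malformed records before writing codes to the database. The sketch below shows one minimal way to do that, assuming the category sets inferred from the samples above (the real codebook may allow more values, and the `SCHEMA` dict here is an assumption, not the tool's actual definition):

```python
import json

# Allowed values per dimension, inferred only from the records shown above.
# Assumption: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself", "developer",
                       "distributed", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID and a known value per dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical IDs for illustration: the second record fails validation.
raw = ('[{"id":"ytc_x","responsibility":"distributed","reasoning":'
       '"consequentialist","policy":"unclear","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"alien"}]')
print(len(validate_batch(raw)))  # 1 valid record
```

Dropping (rather than repairing) bad records keeps the coding table clean; the rejected IDs can be re-queued for another model pass.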