Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The sci fi dystopian stories are becoming more real everyday I can't even be m…" (`ytc_Ugw3FJ8Lr…`)
- "@pbhandsdown1046 Whilst they are being vicious it's because it's their livelihoo…" (`ytr_Ugx1Xg5kB…`)
- "Ai will never be ai, itll always be a script. Do you really think the govt and e…" (`ytc_Ugx_zpUut…`)
- "Yeah it was more fun when it was “ai generated IMAGES” not “ai art”, I saw a sma…" (`ytc_UgxXo_O-X…`)
- "Ai will replace everything and mark my word one day you will be jobless and even…" (`ytr_UgyK2fphI…`)
- "Well no, I think the problem is that when you defer to AI, you don't actually le…" (`rdc_oi156vh`)
- "or maybe it's because they're trying to sell an AI product of some kind whene…" (`rdc_mowmhv8`)
- "This is stupid. I have done the same thing when I thought I would learn to draw.…" (`ytc_UgzHXXqsn…`)
Comment
This will happen to some extent because businesses will demand it. The insatiable greed especially of western organisations means they will absolutely look to develop AI to the extent where human employees will be almost non-existent. Company bosses are looking to make 100% profit and the way to do that is to eliminate human workers. What will happen to redundant human workers? The universal income idea is far fetched. Given the cost of everything in the West these days, the universal income will have to be so high, and the politicians will never agree to it. We are coming close to a dystopian reality where humans will have no work and have no purpose and no money, while 1% of the global population will live in luxury.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-08-12T21:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzHbyHr8BQmKOJI_8t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyfv3_yck0fEbd-vIl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzb1_gtmPpOHb6sXWd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlaLVtkoMqseLSwN94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwx6T6j_PG_4hmoZ2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxmen0r82zywpa0aT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzhNbZtrih6h9sxn1Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOx40P27mm7BJIWAt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxvatfpCv0Y9hZ4x1t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxFudW6sfQhYS5ANwx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
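The "look up by comment ID" step described above can be sketched as follows: parse the raw LLM response (a JSON array of per-comment codes like the one shown) and index the rows by their `id` field. This is a minimal illustration, not the tool's actual implementation; the variable names and the two sample rows are taken from the response above purely for demonstration.

```python
import json

# A fragment of a raw LLM response in the format shown above:
# one JSON object per coded comment, keyed by comment ID.
raw_response = """
[
  {"id": "ytc_UgxlaLVtkoMqseLSwN94AaABAg",
   "responsibility": "company", "reasoning": "virtue",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwx6T6j_PG_4hmoZ2x4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]
"""

# Build an index from comment ID to its coded dimensions,
# so any comment's codes can be retrieved in O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment displayed in the detail view above.
code = codes_by_id["ytc_UgxlaLVtkoMqseLSwN94AaABAg"]
print(code["responsibility"], code["policy"], code["emotion"])
# → company liability outrage
```

The same index can back the random-sample view: pick IDs at random from `codes_by_id` and display the stored dimensions alongside the comment text.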