Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> By 2030 every AI expert believes 90% of tech based jobs will be replaced by AI. Then shortly after Service jobs, then shortly after that manufacturing jobs. By 2040 developed nations may not hire ANY humans. Which means, if humans are to win against AGI socialism is our only hope as no one will have a job. Or else the entire world’s power will fall into the 10 individuals who oversee the board of whatever company wins the AGI race. One company will hold more power over the world than any government could ever hope to.

youtube · AI Jobs · 2025-09-12T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy2VtmrcVHN9KYUkdZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7BqhR0k5vjo8iTAV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwRcU_pIqBRxnhRMhx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugzumfsyb1jvS-FV5dl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkFo08o6z3UKYvRRV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyo2GK0dE62nr_pdNx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgyWizzpDA2e9vMHgA94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsNbFyiI30DMLmsCJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzsNZnHMP9ZMDN5LUx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwV7wrSV-qt5pKyLpJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
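The raw response above is a JSON array with one object per coded comment. As a minimal sketch of how such a response could be parsed and validated before storing it (the allowed value sets below are inferred from the examples on this page; the actual codebook may define additional categories):

```python
import json

# Coding dimensions and the values observed in the responses above.
# Assumption: these sets are illustrative, not the full codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"indifference", "fear", "outrage"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_UgzsNbFyiI30DMLmsCJ4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_UgzsNbFyiI30DMLmsCJ4AaABAg"]["policy"])  # regulate
```

Validating at parse time catches malformed or off-schema model output (a common failure mode with LLM-generated JSON) before it reaches the coding-result table.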