Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "My biggest concern would be that a student using AI in place of hard work, grit,…" (ytc_Ugz5feAIn…)
- "My first recomendation would be that we don't create these problems in the first…" (ytc_UgjyZ86Qa…)
- "Well if the majority of musicians have been a bunch of sellout lowlifes who are …" (ytc_UgzPkP6tX…)
- "You’re talking about ML (machine learning) not AI when you refer to “sorting you…" (rdc_oi23gdb)
- "Any AI project is an unwise path for humanity. What can be gained from creating……" (ytc_UgzmK57Jy…)
- "To be fair, I am polite to AI just in case they ever take over, I want to be on …" (ytr_UgxKF0M66…)
- "It's a complete crock of shit. We lawyers at r/lawyertalk and r/lawfirm discuss…" (rdc_n5gov3y)
- "Help what? Fight is lost. Look at how we responded globaly to the past devestati…" (rdc_emny7fc)
Comment
The only "good outcome" to this is if the technology wildly succeeds, becomes efficient enough that it's worth the cost and environmental impact, and then - most crucially - laws are passed to tax companies and those who profit from this technology so that a sizeable portion of the productivity gains are returned back to the workers they replace.
Ideally, if AI can do a job twice as efficiently as a person could at some point down the line, you could pay the person that you replaced the job of one share and pay the investor a second share, and everyone would benefit.
Of course we all know that's not going to happen without major political reform.
youtube · AI Jobs · 2025-12-23T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw_-gf1jMgcej-IdZB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-g0429cb8E5w45rJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-khGAJFu8yczLBbJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxAY4KqqOknsCtIETp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwtZSDpBtGF1egIB5B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyp18FowqC7u52dCLl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwzGpUnFF0qSwcob_54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwByvod2usYQXQlAp94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzs-cAWDsM-TUgLqTt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuwbKBjLSjeBKeE2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
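The raw response above is a JSON array of coding records, one per comment ID. A minimal sketch of how such a batch could be parsed and indexed for the lookup-by-ID view is below; the `index_by_id` helper and the `REQUIRED_KEYS` set are illustrative assumptions, not part of the tool itself, and the abbreviated `raw` string reuses two records from the response shown above.

```python
import json

# Abbreviated batch response: two records copied from the raw output above.
raw = '''
[
 {"id":"ytc_Ugw_-gf1jMgcej-IdZB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwuwbKBjLSjeBKeE2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
'''

# The four coding dimensions seen in the response, plus the comment ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch coding response and index records by comment ID,
    silently dropping any record that is missing a required key."""
    records = json.loads(raw_json)
    return {
        r["id"]: {k: r[k] for k in REQUIRED_KEYS - {"id"}}
        for r in records
        if REQUIRED_KEYS <= r.keys()
    }

coded = index_by_id(raw)
print(coded["ytc_UgwuwbKBjLSjeBKeE2B4AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID makes the "Look up by comment ID" interaction a single dictionary access, and the key check guards against partially malformed model output.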