# Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by opening one of the random samples below.
Random samples (click to inspect):

- `ytc_UgxcldSc6…`: "He's not raising a red flag, he's pushing a narative that AI is innevitable in o…"
- `ytr_UgzMgm1VO…`: "Keep drawing. I'm sure you're better than you think you are (: Even if you can'…"
- `ytc_UgzzaNAaf…`: "The only problem I see in that is you can't think like an artist, ultimately AI …"
- `ytc_UgyHJpk5w…`: "Every statement i remember seeing made by this press secretary screams supercili…"
- `ytc_UgzDIeF-3…`: "That girl with the dog was definitely AI you can tell by looking at the eyes…"
- `rdc_ohyv3kr`: "The basis of our morality is our emotional responses to things, which evolved be…"
- `ytr_UgxEMKM7h…`: "@motymurm it matters how much you use and it’s not about the quality of your art…"
- `ytc_UgyBiso4j…`: "Your cope is showing. Talking like a techno optimist, most humans WILL be redund…"
## Comment
> There are 2 ways to fix the job shortage this will create.
> 1) Make sure those out of a job have some kind of compensation or a new job ready.
> 2) Truck drivers buy their own trucks and lease them out to the companies. Making sure the companies pays for all expenses related to maintenance, insurance, and fuel/energy costs.
>
> This probably won't fix everything but it's a step in making sure we don't add to the homeless problem.
>
> If AI takes over everything then all people would need a basic income that takes care of ALL of their needs. Only jobs available would be those jobs that require humans to monitor the AI and the devices they are attached to. A separate internet database where regular people don't have access to so things like hacking and mixed data does not harm the AI's programing.
>
> And Yes AI/ automation will kill billions of jobs across the world.
Source: youtube · AI Jobs · 2025-09-13T08:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugxu6OJgVQ9MRi95P3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwtOn2TbJf7pUZXJUB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzFPhfHLRLhHFVFS8d4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx4AfEiuw9tR3-cADp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwgCtgFyHv_K_HvJN14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxJIkyjI4rUpORif9x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxpLmLhwHL5Mcf6PAR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx7Km5WTnVAsf9s0jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0noKyPsRMb_WgDf54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
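The raw response is a JSON array with one record per coded comment. A minimal Python sketch of how such a batch might be parsed and validated before it reaches the coding table (the allowed category sets below are inferred from the values visible on this page, not an authoritative schema):

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample output above; the real codebook may define more categories.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "mixed", "outrage", "resignation"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response (a JSON array of coded comments),
    validate every dimension against SCHEMA, and key records by comment ID."""
    coded = {}
    for record in json.loads(raw):
        comment_id = record["id"]
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {record.get(dim)!r}")
        coded[comment_id] = {dim: record[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"approval"}]')
batch = parse_batch(raw)
print(batch["ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg"]["policy"])  # liability
```

Rejecting out-of-schema values at parse time keeps a single malformed LLM record from silently corrupting the coded dataset.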