Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
LLMs are like hyper google search for anything that has ever been posted online …
rdc_ohz6l6n
I am a mid level engineer (4 years experience) and doing a masters on the side. …
ytc_UgzjCKBQn…
This is insane, people are trusting a grainy camera with their life. Musk should…
ytc_UgzwOBsq2…
Gooodh h3ll I did it all correctly and ai is very scary
( can I get pin?📌)…
ytc_Ugzfm2hZS…
A lot of ND creators do disclose — I’m one of them.
The issue isn’t shame, it’s …
ytr_UgynJzRLC…
You need to seriously seperate ai users. I train models based on my art to allow…
ytc_Ugx5EgTwl…
Nice but if this technology is not controlled it reach a point of wiping humanit…
ytc_UgwWDoFKp…
I'm pretty sure I read an article where a single doctor had "reviewed" a kabilli…
rdc_jtetnpj
Comment
Automation *should* be a good thing. It should free us from the repetitive, dangerous, and/or otherwise mundane jobs and allow us to pursue more meaningful activities, both personally and as a society. And people who want to keep doing those automated jobs should be able to, without financial worry. (in other words, survival should not be tied directly to employment, but that's a society issue, not tech.) I say should because that is clearly not was it happening. Automation is replacing us yes, but those other pursuits are not available.
I'm with Steve here. The tech is not the enemy. It's the people who implement and weaponize this tech against workers that are the enemy. The tech is a tool. How we use it is up to us.
youtube
AI Jobs
2025-12-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzvBQ8ksUEN_nEyTH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzsFHsmRI8ZEznZBw54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwN96M0weHDNCMK8Yx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxozIdf8zIyndB9lQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZtZQMXjU8UwbVJy14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzyFklaU2QzdkxHi6p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyFW0pqMiC45Ue9SXN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhHkvViguxuZlJF054AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-SGdEzY4o66j2MaN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYXILf6tx22aHe6ot4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"}
]
```
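The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a payload might be parsed and indexed for lookup by comment ID (the `index_codes` helper is a hypothetical name, not part of the tool; the shortened sample payload reuses two records from the response above):

```python
import json

# Abbreviated sample of a raw LLM response (two records copied from above).
RAW_RESPONSE = """
[
  {"id": "ytc_UgzvBQ8ksUEN_nEyTH54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwN96M0weHDNCMK8Yx4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "approval"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model output and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(RAW_RESPONSE)
rec = codes["ytc_UgwN96M0weHDNCMK8Yx4AaABAg"]
print(rec["responsibility"], rec["policy"])  # distributed liability
```

Looking up `ytc_UgwN96M0weHDNCMK8Yx4AaABAg` this way reproduces the values shown in the Coding Result table (distributed / mixed / liability / approval).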