Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The standard lesson of economic history is that automation creates new and better jobs, so the loss of the jobs it destroys is minimized and converted into new, more rewarding opportunities for people.
However, as an engineer, let me tell you that this is not true for AI + robotics, because this is not one more automation but a generalization of automation: it can replace not only existing jobs but also, with minimal effort, all the new jobs that get created, far less effort than it will cost people to adapt.
The new jobs AI is generating require a lot of skill, and it will take people more effort to acquire that skill than it will take AI to replace those new jobs. Well, this will happen with AGI; LLMs are not there yet.
youtube · AI Jobs · 2026-01-29T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxzL-ekWt2JMp6ZQwJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTv0H6hv25D1iKBpN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1IkAzR2cgEY7qUKF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykcPXSP5wgD2EcgTV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgztoCk85IYUxPfgSsB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzdpFgSWbwbWFt6qf54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwwmDFLyJwdje7eGN14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwqpqDeR7o6hOSs0gt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxG9s9QrYr8ZPlgJGt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgydghGNnPsXbMsijb54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
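For reference, a minimal sketch of how a raw batch response like the one above could be parsed and validated before writing coding results. The allowed values per dimension are inferred from the records shown here; the real codebook may include more categories, and `parse_coding_response` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed values per coding dimension, inferred from the dump above
# (assumed; the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"government", "user", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any record with a missing or out-of-schema value."""
    out = {}
    for rec in json.loads(raw):
        codes = {k: rec.get(k) for k in SCHEMA}
        if all(codes[k] in SCHEMA[k] for k in SCHEMA):
            out[rec["id"]] = codes
    return out

# Usage with one record shaped like those above (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["emotion"])  # resignation
```

Validating against the schema up front means a malformed or hallucinated label is dropped rather than silently stored as a coding result.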