Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It’s insane we keep saying AI is gonna replace a bunch of jobs and people just a…" (ytc_UgxeFjzwh…)
- "Chairs dont look like that, they usually have the feet pointing upwards. Hats u…" (rdc_oi43kyq)
- "I don't talk to ai or chatgpt for any curious questions they're actually creepy …" (ytc_UgyF8pD4h…)
- "The question of whether this particular AI is sentient or not is secondary in my…" (ytc_UgzdvLXz1…)
- "So your point is that AI sucks because you have to learn how to use it?…" (ytr_Ugx6ZbNG8…)
- "I swear AI artist tend to forget that even if someone was naturally talented or …" (ytc_UgxUCE-bI…)
- "Since I was 8 I was always fascinated by anything computers, after 2020 things e…" (ytc_UgyPwOXwF…)
- "AI isn’t the problem, the problem is with the agenda of those who are developing…" (ytc_UgzXPyslC…)
Comment
The issues is they're trying to remove mundane intelectual work, instead of going after progress in medicine, physics, energy etc.. Maybe that's the right path, to improve the ai first, before attacking those important fields, but we will end up with an economy where a fast majority of people will end up unemployed, probably 10-20% or more, and in that economy I don't see how governments are going to continue existing in the same way..
So are you going to tax people, when there are no secure jobs??? What are all those unemployed people going to do??? We have to rethink how capitalism works, without completely demotivating people to work. Maybe some UBI that's enough for the basics and working gives you the luxuries of live. I have no idea.
youtube
AI Jobs
2026-02-22T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwiDO0nKdQTuUUEVOp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyn-r8lj6-wlb3C3C14AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzwYcuw_BnSXshBJW54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyr9exzfZNt24M1brZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKVPO_wEvYKjXkkbJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxzDULKn64i_kNNurF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzTi3Td0e0_rzCKaGF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxP7o2JfggD_YGWNox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr2PKm8UDaj34ZmCR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzy9vrUe88DErOzUYd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
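The raw response above is a JSON array of coding records, one per comment, with one label per dimension. A minimal sketch of parsing and validating such a batch might look like the following; the allowed values are only those observed in this page (the full code book may define additional labels, which is an assumption here):

```python
import json

# Allowed labels per dimension, as observed in this batch.
# Assumption: the real code book may contain more labels than these.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into {comment_id: codes},
    raising ValueError on any label outside the expected vocabulary."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        codes = {}
        for dim, allowed in SCHEMA.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded

# Example record taken from the batch above.
raw = ('[{"id":"ytc_UgzTi3Td0e0_rzCKaGF4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = parse_batch(raw)
```

Validating against a closed vocabulary catches the common failure mode where the model invents a label outside the code book, rather than silently storing it.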