Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Trumps Big Beautiful Bill protects all AI firms for the next 10 years… there’s n…" (ytc_UgyOkjhUD…)
- "Um yeah fuck this guy he does not care if people lose their job to ai…" (ytc_UgyqQu_H7…)
- "If AI is used in the US, make sure there is a disclaimer STATING SO!!…" (ytc_UgwWzHzdb…)
- "Alex is losing the argument if you listen closely. When asked to concede that it…" (ytc_UgyAJ125-…)
- "20. Truckers need to get their in bread azzes over to the truck lane. / 19. When …" (ytc_Ugz31IiUS…)
- "Damn, I thought this was comedy, then it was just hard facts. We are doing nothi…" (ytc_Ugz9BLZHm…)
- "why haven't a. I. coders been led by psychologists? / it would seem that we have b…" (ytc_UgxTpVpCc…)
- "The physical mundane jobs will be safe. / The office jobs, lawyer jobs and HR jobs…" (ytc_UgxYwvLoa…)
Comment
you can be great at coding usign AI or whatever, sure...but at the end of the day, who will hire YOU, will companies be willing to even hire junior programmers, FRESH OUT OF SCHOOL, that seek to gain experience, if AI basically does the jobs of juniors for them? What will happen when senior programmers are "spent"? There are TONS of problems that will occur in the long run, I tell you that.
Source: youtube · 2023-09-03T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugys1c71pyFnu4YEPBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyGfr7xNTRAnhEGKCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxhfhMeZGYqOk4vTCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwdUNbBJfJUNDnzUYh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxzPpuYjG7yOuxOtMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz8NAAbyRL7zp0X2Ml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwND1aH9Mcs6mre7ox4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyr3joc1vvHEUmqOrp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxHB4k65Qx51VQgMd94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxEsM1Rg_Frr2E856Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
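The raw response is a plain JSON array with one record per comment, each carrying a comment `id` and the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of parsing and validating such a response, keyed by comment ID (the function name `parse_codings` is illustrative, not part of the tool; the two embedded records are copied verbatim from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw = """
[
 {"id":"ytc_Ugyr3joc1vvHEUmqOrp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
 {"id":"ytc_UgxEsM1Rg_Frr2E856Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Every record must supply the comment ID plus all four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw coding response into {comment_id: {dimension: value}}."""
    records = json.loads(text)
    out = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        out[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return out

codings = parse_codings(raw)
print(codings["ytc_UgxEsM1Rg_Frr2E856Z4AaABAg"]["emotion"])  # prints: fear
```

Validating keys up front makes a truncated or malformed model output fail loudly at parse time instead of surfacing later as a missing dimension in the coding table.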