Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I work sorting the shit work facilities in UK we will always have a job. It' guy…" (ytc_UgxQkAvUA…)
- "ai is trained off of other peoples art. so you are not creating art, you are gen…" (ytc_UgzPT_Wqw…)
- "To prevent AI from worsening global inequality, we need proactive policies focus…" (ytc_UgyqwcWvA…)
- "idk ai art is fine to me and the farther we get from its creation the more peopl…" (ytc_UgxC3AU8B…)
- "Does anyone remember the movie Maximum Overdrive? Instead of a commet, it's goi…" (ytc_Ugwnx076F…)
- "Greedy AF as always. Use AI instead of paying a real talent to save money, and t…" (ytc_UgwccD4z8…)
- "REALLY WHAT IS HAPPENING IS THRU AI N ROBOTS GET RID OF HUMAN NYTY, SO DONT THI…" (ytc_UgztulIvO…)
- "hear me out about the customer service-humans get it wrong. it's just that with …" (ytc_UgzSLgJdS…)
Comment
People, throwing out UBI as some kind of solution are simply not thinking enough about the problem. If A.I. accomplishes what it is likely to be able to, "MONEY" won't be universally used. The corporation that figures it out, and is evil ENOUGH, will essentially become, to the world, what google is to "search engines". It won't be the only one, but it will be the biggest and it will have its fingers in everything it can. There is no good outcome for A.I becoming super intelligent given a long enough time scale. Human existence will become meaningless, slavery, non existent, or all of the above.
youtube · Cross-Cultural · 2025-09-30T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgxX5_KrXpFfSs9ex514AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyHVp0RVLqSnIjCIv94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw0ok6QBi8gWABrnWh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzX_9G3wxaf9lWMjlZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzBSDd3X-ChdYLFplJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx2jRkCydQ4SPJPJjN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgznIJ_BNqDRlSI79C54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzbODXkAnIIQeEcUJl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTGngXQSYfGpIwguV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz0CyMcxJwkZq9Bkvh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"})