Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "They should just make an Ai learning from the terminator movies. At least this w…" (ytc_UgxKlZZkP…)
- "Sam Altman doesn’t know what will happen to larger global society once AI takes…" (ytc_UgwpeVPuN…)
- "A quill of thought, a heart aflame, I write my verse, a whispered name. Born fro…" (ytc_Ugw007FT_…)
- "just look at the bolshevik genocide & what the people responsible did to the rus…" (ytc_UgwWe-2FK…)
- "So not only cops are racist / Now AI is being racist? / What next?, Siri becomes sex…" (ytc_Ugx_-3ifw…)
- "If AI does all work why do humans need to learn? That's a prescription for wides…" (ytc_UgxRCQX5z…)
- "its not talent. Its a disregard for all the practice and dreams that people put …" (ytc_UgwPZGsw_…)
- "we we will be used as energy. they'll hook us up to exercise bikes and we get p…" (ytc_Ugw9PlqIJ…)
Comment

> Unfortunately, at this point it's delusional to think that this technology will be used for the benefit of humanity as a whole and not as a perpetuator and aggravator of current power structures. The working class already exerts little to no realized power compared to the elites even though they are the cog in the economic machine that makes anything possible at all. Once workers are obsolete, they will be set aside to starve while the rich will live in even greater luxury with robot slaves that never sleep, demand living wages or go on strike. Unless we do something about it NOW

youtube · Viral AI Reaction · 2025-11-22T20:1… · ♥ 892
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugz7IkOR2UExrftNjPt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgygVjHxQrw1HCHaOzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyhefhoxkZfz4GsACB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwfaKoV_J3PMjI4GR94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzJg4sTwfTpapzIKux4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxg9p48Wjx5YDoOmwV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwW_iW1fto4qifHrb94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzasXPsTpmdbIYmE9l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcuHwOc96dGfXepNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSfjnGft9qyHe_XgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]