Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It needs to be banned. Or not allowed to go further than it has. AI should only …
ytc_UgyCP9PEG…
The argument that AI training must only use licensed material misunderstands the…
ytc_UgzO8qe3q…
Sue them for everything you can get. The world is not ready for self driving veh…
ytc_Ugw-qUb-W…
At the future there would be a high chance on people using the ai for bad like g…
ytc_Ugy3vONGR…
So, AI is no worse than the vast majority programmers out there. The difference…
ytc_UgxFF3yhE…
Im in art school rn and during orientation the higher up professor was talking a…
ytc_Ugx10zuAu…
it's irresponsible to allow "Autopilot" and other autonomous driving in cars. Mo…
ytc_UgzPrrUFN…
You know why? So we can’t tell the different between real and AI. AI is filterin…
ytc_Ugy4so2tR…
Comment
What appears to be missing is that when LLMs were first released to the public, there was a theme that the public would experience a "life will get better". As far as I can tell, most people who are working or who are willing to work and who are not billionaires are negatively affected by LLMs, e.g., job displacement. There is no safety net. No billionaires who are profiting from letting people go and who are filling those now vacant jobs with LLMs and with AI of various abilities are focused on providing safety nets for the majority of the populace of a country. That is not a part of their lucrative goals. It is safe to say that the parasites are winning. The people are suffering. There is no "life will get better" for non-billionaire children, adults, and elders. 😢
youtube
AI Jobs
2025-06-13T04:2…
♥ 85
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwYA-dZaqrXWdzDV2t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyaXQkS66VgZCaptIt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyEW5JY2SJ-VVudvGl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzPyroQNA-e8yVTr5d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxmgi5txerg54H64BR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzVjzmiyHbdEwrSceV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzSu1HcZk6p67hBdb54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwgEQ_sGZVv1ASPqNB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgziJG36qVWHCCK8MiN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyn5ps2Uq02zycHfOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
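The raw response above is a JSON array mapping each comment ID to one value per coding dimension. A minimal sketch of how such a response could be parsed and validated before storing the codes (the allowed vocabulary below is inferred from the samples on this page, not from the full codebook, and `validate_response` is a hypothetical helper):

```python
import json

# Allowed values per coding dimension, inferred from the table and JSON above.
# This vocabulary is an assumption; extend it to match the full codebook.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if not entry.get("id", "").startswith("ytc_"):
            continue  # skip rows without a recognizable comment ID
        # Every dimension must be present with a value from the schema.
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytc_UgwYA","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_response(raw)))  # → 1
```

Entries with a malformed ID or an out-of-vocabulary code are dropped rather than coerced, so a drifting model output surfaces as missing rows instead of silently corrupting the coded dataset.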