Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "No chance give a robot a gun, remember terminator an will smiths AI movies 🫣…" — ytc_Ugyfi_Te_…
- "I work with AI every day in tech. Its not great.... I spend a lot of time correc…" — ytc_UgzdvGrIL…
- "The moment AI showed any inclinations of self-preservation, I'm surprised they d…" — ytc_Ugx9exXJx…
- "I wanted to say something... but I feel like the point wouldn't be made and I'm …" — ytc_UgzwDmIhD…
- "The sad part is there 100% going to be at least 3 people who pop up in these com…" — ytc_Ugxs4cbry…
- "For anything related to ChatGPT and prompting, the way of life course seems a go…" — ytc_Ugx76SVn4…
- "I took a quick trip to that subreddit, and omg it was so horrid😭😭 I cant even be…" — ytc_UgxW4fxFa…
- "Maybe it shouldn't be called A.I., in that case; maybe it should be called I.A. …" — ytc_UgxewImz9…
Comment
AI, like all automation, should be a good thing for everyone – a win-win-win, where we all do less work while still enjoying the benefits of the same or increased productivity. But because we live under capitalism, instead it's an opportunity for the bosses to take all that productivity for themselves and not have to bother keeping the whiny, hungry part of their workforce alive
(and that's before you get to the energy requirements of AI supercharging the environmental destruction that will kill the planet at such a rate that millions of those now disposable poor people will be killed off without much effort – so in that sense I guess it is win win win, for the capitalists)
youtube
AI Jobs
2025-10-23T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwB_yAjv6TEizxLdwF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzKAshimGLWuVQZSEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzf0WPxMncseRFBRh14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdXFiaFJQ_rrzBVup4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwsRdExgxH9FpvaxnB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzJ3G-qqBIwLI8eZC14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgytauU6MVodQ9_zQvJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjCyuYyuygPhAFDsF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzUvunKInVY_XflIGF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwxF9S_rioIN8J96sl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
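The raw response above is a JSON array with one record per comment ID, each carrying the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked before storing the codes — the allowed value sets below are inferred only from the examples shown here, and the full codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the coded records shown above.
# The actual codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset appear to use a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwB_yAjv6TEizxLdwF4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
records = validate_coding(raw)
print(len(records))  # 1
```

A check like this catches the common LLM failure modes (malformed JSON, invented category labels, mangled IDs) before bad codes reach the database.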