Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click any entry to inspect:

- `rdc_dsbdw8v` — "-Sell dictator big weapons. -Find large oil reserves. -"Realise" dictator h…"
- `ytc_UgxW35P5Z…` — "Imagine sleeping with her and doing sex , suddenly electrocution to death becaus…"
- `ytc_Ugzq5msME…` — "The most advanced Ai accessible to the public still has been unable to produce m…"
- `ytc_UgxlTJWsn…` — "We live in a simulation. 😂 This guy is completely off his rocker. Honestly, this…"
- `ytc_Ugxc7acRG…` — "The oft-quoted "don't blame us, it's the algorithm" trick. Social media are alr…"
- `ytc_UgxJBS0W8…` — "ilove robots but robots with ai technology? selflearning and they will learn fro…"
- `ytc_UgzTdIowX…` — "Artist mentality is just pityfull to me. AI is just a tool they'll never outprod…"
- `rdc_glku64a` — "Step 1: Unionize and seek edit ownership over your business/factory/etc. Step 2…"
Comment
To quote my former boss at UHG Optum Technology, said bluntly in the wake of Brian Thompson’s death and the later disclosure that he aggressively pushed AI to maximize profits: ‘This is a for profit company. The less we pay you or pay out in claims, the happier the shareholders are. If you want a raise or better benefits, go someplace else.’
Since then, the company has lost roughly half its stock value, rolled out sweeping AI driven systems, shifted large portions of its American workforce to South Asia, and cut around 80,000 jobs in the last 20 months. Turns out treating people as expendable inputs is not the long-term growth strategy they thought it was.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-12-12T23:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy45zpVUEJJfblnwAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxuSblCZKSQNpSd9TV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwV7VlpSXL0UeZPAlt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwqzzAAKEFZFZ-Dccx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzJjQt4rLyOzE2cY4V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxPIrHlU52eLdbeq8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNdxLRgBwYx34zUuV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugylk_p9gSioMVM3geV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwq4S4kS7z_Q4RB0xV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlRRZCl0LjPsmCC1R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
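The raw response is a JSON array with one object per comment ID and four coded dimensions. A minimal sketch of how such a batch could be parsed and validated before ingestion — note that the allowed label sets below are inferred only from the values visible on this page, not from the full codebook, so they are assumptions:

```python
import json

# Label sets inferred from the responses shown above (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "indifference", "outrage", "resignation", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed rows."""
    rows = json.loads(raw)
    for row in rows:
        if not row.get("id"):
            raise ValueError("row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad value for {dim!r}: {row.get(dim)!r}")
    return rows

# Usage: validate one row, then index by comment ID for lookup.
rows = validate_batch(
    '[{"id":"ytc_x","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
)
by_id = {row["id"]: row for row in rows}
```

Indexing by `id` mirrors the "Look up by comment ID" affordance above: once validated, each coded row can be fetched in O(1) to render a detail view like the one shown.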