Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Anthropic's Custom Claude Model For The Pentagon Is 1-2 Generations Ahead Of Th… (rdc_o82h8s2)
- This video is exactly why "auto pilot" and "full self driving" shouldn't be allo… (ytc_UgxHbnLoE…)
- I would just train AI on all music, and just listen to the results myself lol no… (ytc_UgxE-_LeI…)
- Autopilot implies the car can automatically pilot itself. So stop blaming the vi… (ytr_Ugx5OX8zV…)
- Its not the AI we have to fear....... “He who controls the spice controls the un… (ytc_UgyJwACoq…)
- Tired of the "automation won't replace humans, but work with them" nonsense. Ent… (ytc_Ugx_0eWvH…)
- These AI has no real intelligence, nor knowledge. They are like a big sophistica… (ytc_UgyQt_TzJ…)
- its almost like AI is showing us that we need to stop federally protecting hate … (ytc_UgyWQomyL…)
Comment
As of mid-2025, Workday is facing a major collective action lawsuit (Mobley v. Workday, Inc.) alleging that its AI-driven applicant screening tools discriminate against job seekers based on age, race, and disability. The lawsuit alleges that Workday’s algorithms act as "automated gatekeepers," rejecting qualified candidates from protected groups before human review. The case is in the discovery phase and the Judge ruled for collection which allows more than one plaintiff. Time to drop your name into the lawsuit.
youtube · AI Harm Incident · 2026-02-12T12:5… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgyH8LGHtOnBIKDSPSt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_UgzWXkAjtsnOrE-TkxZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyCEwzoCN0aNwtk5Cx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},{"id":"ytc_UgxifQWVzouLTi2CQNt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgyHOtyIr2bcF3vY27d4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_Ugzw6Wg_J5bfcgPFpI94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_UgxWhuGgxRNfCcbR4_l4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"resignation"},{"id":"ytc_UgzJ5KLVVXHq0tMsow14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"frustration"},{"id":"ytc_Ugw52dSsJRgqEVpuVtl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},{"id":"ytc_Ugw1nY_NGsOU2SEk7ad4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"}]
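Before storing coding results like the ones above, it helps to validate that the raw response is well-formed JSON and that each record uses a known value for every dimension. A minimal sketch in Python, assuming the four dimensions shown in the Coding Result table and allowed values inferred from the sample output (`parse_coding_response` and the `ALLOWED` sets are illustrative, not part of any actual pipeline):

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; this is an assumed, not exhaustive, codebook.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "resignation", "indifference",
                "frustration", "approval", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)  # raises ValueError on invalid JSON
    valid = []
    for rec in records:
        # Each record needs a comment id and a known value per dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(parse_coding_response(raw))
```

A record with an unknown dimension value (or a response with a stray trailing character instead of the closing `]`) is rejected rather than silently coded as `unclear`.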