Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Just remember not to complain when your loved one is killed in a Tesla Robotaxi,… (ytc_UgzTQ-vP8…)
- I don't know why artist always improve their creativity but not logical thinking… (ytc_Ugz8AUYUd…)
- Fr and they’re blaming the ai when it was the parents/families fault for not pay… (ytr_UgxR-Fo3E…)
- Understanding AI replacing all human labour is comparable to a father doing all … (ytc_UgzL8yzEJ…)
- Wait I know you use nightshade to poison the AI well but what about your videos?… (ytc_UgxTmYcl-…)
- The AI apocalypse isn't roving squadrons of killbots. It's mass unemployment and… (ytc_UgyZD7kEI…)
- It's all fun in games until Waymo hijacks the neighboring Tesla's boombox mode a… (ytc_Ugxg6d2_t…)
- Bro the dude robot wants the universe and he wants to let you know he wants it🤣🤣… (ytc_Ugzco384Z…)
Comment
This is a bit misleading. Cook County didn’t make its guaranteed-income program permanent because of “AI taking people’s jobs.”
The county’s own documents say the decision came from the success of the pilot program, the affordability crisis, and the need to stabilize low-income households. Most of the people receiving these payments weren’t in the kinds of jobs AI is replacing anyway — many were unemployed or in service roles long before automation became an issue.
It’s fine to talk about AI disruption, but attaching it to policies that weren’t created for that reason just confuses the public. Accurate reporting matters more than a catchy headline.
youtube · AI Jobs · 2025-11-30T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyTfN1PIMLk1RL2UnZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzbKmU2cGwDKYwbtTB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzrAFjms4yOzPjBSkd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9VEoxsJYpWkbiZPV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9Vo6v_fXfZ_26d714AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzLefOBJds84LSahJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYiRseTZeMADKJtTZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwLXheIVFFMjCiW_Vx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxNd-oMcctU1HwC0Kl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy5EHTDnkZikkLUSiV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
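The raw response above is a JSON array in which each object carries a comment ID plus one code per dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before it reaches a "Coding Result" table is shown below; the `ALLOWED` value sets are inferred from the examples on this page, and `parse_coding_response` is a hypothetical helper name, not part of any documented tool.

```python
import json

# Allowed codes per dimension, inferred from the sample response above;
# the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference",
                "approval", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into a
    mapping of comment ID -> {dimension: code}, rejecting unknown codes."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim, "unclear")  # treat a missing dimension as unclear
            if value not in allowed:
                raise ValueError(f"{cid}: invalid {dim} code {value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"liability","emotion":"outrage"}]')
result = parse_coding_response(raw)
print(result["ytc_example"]["emotion"])  # outrage
```

Validating against an explicit code set catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently corrupt downstream counts.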