Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "He is right Ai will definitely make people loose job but it will create new job…" (ytc_UgwEvahZu…)
- "Not here for the ai part, but the talent argument is valid, if you take 1000 peo…" (ytc_UgxaqLy0O…)
- "The amount of people that just view correlation as causation truly is infuriatin…" (rdc_no2yb73)
- "that's really stupid considering chatgpt has all the information in the world an…" (ytr_Ugzhk9nWv…)
- "It's not the algorithm itself that is racist, it's a program. It's the writer of…" (ytc_UgydD1xfl…)
- "Not for top 1 percent they don't like humans there about start shooting us we're…" (ytr_UgzhZyzuR…)
- "The people who religiosly defend using AI for creative tasks are the same people…" (ytc_UgzFkBxg_…)
- "He says AI will improve education, but why bother? He also says that it will mak…" (ytc_Ugxptfr3Q…)
Comment
> Does anyone else feel like he's losing his mind like I am? I have a feeling that the world is actually ending. I'm imagining things that would only happen in science fictions, and it actually feels like they are more likely than not going to happen. Universal basic income seems unlikely to me because the whole idea that the few ultra rich will share their AI wealth with the masses sounds implausible to me. I imagine a world where the ultra rich are gods, and everybody else are slaves. I feel like I'm losing my mind. I've gone through 2000 and 2012, when the world was rumored to be ending, but none of them feels as real as this one, and I've never felt similar anxieties.

youtube · AI Governance · 2025-12-18T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxx6yaOTPDHpCjsKD54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyr7gKvaCSZpKNjMHN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyS88hZQdg82TT0HbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxxlBGWYk3O35EYOJB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxV-V9izisVehB3ubR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxRsrTPmmTZGmwx7054AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw_NPBeEknQNsoEW8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzUoy1Nh6MxyGWZIt14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9CNa-FCkvldq5AS14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxUpgIjz5neP1u-BqJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
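Because the raw model output is a JSON array keyed by comment ID, looking up any coded comment reduces to parsing the array and indexing it by `id`. The sketch below shows this in Python, using two records copied from the array above; the `lookup` helper is a hypothetical illustration, not part of the tool itself.

```python
import json

# Raw model output in the format shown above: a JSON array of per-comment
# codings, each with an id plus four coded dimensions.
raw = """[
  {"id": "ytc_Ugxx6yaOTPDHpCjsKD54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy9CNa-FCkvldq5AS14AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the parsed records by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID."""
    return codings[comment_id]

record = lookup("ytc_Ugy9CNa-FCkvldq5AS14AaABAg")
print(record["policy"], record["emotion"])  # regulate fear
```

The same index makes it easy to cross-check the "Coding Result" table against the raw response: the coded values shown for a comment should match the corresponding record in the array.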