Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzisbIIm…: Whenever AI can automate any other job: Artists, "wow, such progress" Whenever …
- ytc_Ugw1BUHke…: to think that artificial intelligence is scary..is to not know what it means to …
- ytc_UgxKHyySX…: This video a very big over simplification of AI and is pretty inaccurate. Yes mo…
- ytc_Ugy30BSLc…: I know I appreciate your story and calling out these thieves using work of flesh…
- ytc_Ugwy2BvJZ…: The level of snobbery and pretentiousness is over 9000, while simultaneously you…
- ytc_UgwXakC-6…: AI helps to write code faster. It doesn't mean it could be replaced with human i…
- ytc_UgxUvNQnV…: 3:55 An interesting analogy I can make here to a different video game would be t…
- ytr_UgwrCvKtq…: Great question! It would be interesting to hear Sophia's thoughts on that. If yo…
Comment
Devil’s Advocate here: 1 in 6 is 16% chance, I’d argue we have been way more reckless. Next point, you wouldn’t have much of a channel if you weren’t playing on people’s fears. That said AI will cause great suffering but not how everyone thinks, the truth is AI (LLMs) are a take it or leave it tool, it only looms large because of the tech bros telling us we need it (it being a product they’re selling). Once the bubble bursts, the real damage will be visited upon us. AI is not the problem, it’s the people selling it that are.
- Source: youtube
- Topic: AI Moral Status
- Timestamp: 2025-12-15T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwGCzQBM6B_4VSO-N14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwjU3yNkzmWRaJjf9Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgykZxNMUbv4jGCt82d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzuJ-cpOh_FvZ9295p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw2CJh8oPw9TgFp-Dp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxElQDcm_NAUNqlM2F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwoYN1GPqrbFN-uCs54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwSSBWsxmZP9XxuGOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw0Hm35ZZnDJwLTHX94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy9iityQ0p0S42Mqut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
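The coded dimensions in the table above come straight out of a JSON array like this one. As a minimal sketch (not the tool's actual implementation), the raw response can be parsed with Python's standard `json` module and indexed by comment ID for lookup; the snippet below uses a shortened two-record sample, with IDs taken from the array above.

```python
import json

# Hypothetical lookup sketch: parse a raw LLM coding response and
# index the coded records by comment ID. The two records here are a
# shortened stand-in for the full ten-record array shown above.
raw_response = """[
  {"id": "ytc_UgwGCzQBM6B_4VSO-N14AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgykZxNMUbv4jGCt82d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

records = json.loads(raw_response)          # list of dicts, one per comment
by_id = {rec["id"]: rec for rec in records}  # index for O(1) lookup by ID

# Look up the exact coding for a single comment.
rec = by_id["ytc_UgwGCzQBM6B_4VSO-N14AaABAg"]
print(rec["responsibility"], rec["emotion"])  # distributed mixed
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each inspected comment is a single dictionary access rather than a scan over the response.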