Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The AI we have is dogshit and the world we have is NOT built for us. It's built …" (ytc_UgyP7bGbD…)
- "AI is kinda cringe ngl. Well, the people who are all for it definitely are. Woul…" (ytc_UgxTQ3ITW…)
- "Great series. One small correction on this one tho, (and hopefully I'm not repea…" (ytc_Ugi-CMHZ6…)
- "Somebody with wills to keep drivers behind will cause these driverless trucks to…" (ytc_UgzXW1lqN…)
- "Nah I definitely felt frustration coming from what it said. AI will kill us all …" (ytc_Ugz2AF_2f…)
- "As a black person, as much as I can understand the stance of fear of the unknown…" (ytc_UgwPkgJz0…)
- "At lleast there is a W from copyright that the government actually understands t…" (ytc_UgwMzBRnx…)
- "This makes me extra happy because it returns us to the era of abstract and unset…" (ytc_UgxHiCS-d…)
Comment
Why can't someone get AI to write a solution for world, hunger, homelessness & poverty? Humans did ok prior to computer's coming into existence, maybe we should just go back to living without them? Fascinating episode AJ, thank you for your excellent as always content, this is one of my most favorite channels. Much love, Roxy UK :o)
Source: youtube · Topic: AI Governance · Posted: 2023-07-07T23:2… · ♥ 45
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxJYslo1mVALnqXwq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiWZLBjV_muMpuRXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyATbDqi6oD_QPzHPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxwAMTlhweHL0Ygh9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgweztkmWerOBjamphh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9jVjY19ARFX4s3Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8DnIB6NYlpVoGonp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxXVJzcDtJgesFyVR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxfHarxMm_TNrQzp094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyr49p9t03PgV2xFKN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
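A minimal sketch of how a raw response like this can be parsed and indexed for the "look up by comment ID" view. The two rows are copied from the response above; the parsing and lookup code is an illustration, not the tool's actual implementation.

```python
import json

# Raw LLM response: a JSON array of per-comment codes across the four
# coding dimensions (responsibility, reasoning, policy, emotion).
# Two rows copied from the response shown above.
raw_response = """[
  {"id": "ytc_UgxJYslo1mVALnqXwq54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyxXVJzcDtJgesFyVR4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "unclear", "emotion": "resignation"}
]"""

# Index the parsed codes by comment ID so any single comment's coding
# can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

print(codes_by_id["ytc_UgxJYslo1mVALnqXwq54AaABAg"]["emotion"])  # approval
```

Because the model returns one array for a whole batch of comments, indexing by `id` is what lets the dashboard pull out exactly the row that produced a given "Coding Result" table.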