# Raw LLM Responses
Inspect the exact model output for any coded comment.
You can look up a comment by its ID, or inspect one of the random samples below:
- "Imagine what this world would have become if Bernie had been elected president. …" (ytc_UgxHntUuQ…)
- "Maybe so, bust what do you think employers would want, someone they have to pay …" (ytr_UgwEWKEg3…)
- "I used open AI 2 days ago and it told me that one minute was less than 17 second…" (ytc_UgzZ9qYof…)
- "The highway scenario about not asking the animals before building also does not …" (ytc_UgydDlneC…)
- "We already see it being used in politics as a way to disregard bad behavior and …" (ytc_UgwnBUYFc…)
- "it's so obvious the tech isn't where it should be to be deployed but the doucheb…" (ytc_UgzFJnL2R…)
- "AI is evolving so fast, I’m seriously considering giving up tech and becoming a …" (ytc_UgzDMCkap…)
- "19:30 I’m like, 9 months late to this debate, but: I absolutely cannot stand …" (ytc_Ugy6VpE6k…)
## Comment

> I'm glad you continue to discuss this important issue, but I can't help but notice both your guests and topics seem heavily slanted towards the "existential risk" end of the spectrum. It would be beneficial to hear from the other side (who I believe are a majority in the AI community) as well, who think AI is unlikely to pose such risks, but instead causes more immediate challenges like algorithmic bias, privacy concerns and ethical questions around generative AI. I'm thinking of people like Yann LeCun, Andrew Ng, Julian Togelius or Margaret Mitchell.

Source: youtube · Topic: AI Governance · Posted: 2025-11-26T21:4… · ♥ 2
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id": "ytc_UgxqmiN8JSgZOpjSgux4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx7VaJ0f4fciCd9UkV4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzSaqkb-Mn6W3USVCx4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyHhp7wg1mjviXkSSd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyCHX0l-b4Np2JtlTN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw5yS81M19wpsAthpR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxXkCIQVK7u9960gPt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzrdzLzWdUu0SyAkG94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwnjB-9GKL-THzwuVx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdANNfWCGEqyOq7kV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
```
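Looking up a single comment's coding from a raw response like the one above can be sketched as follows. This is a minimal example, assuming the raw LLM response is a JSON array of records that each carry the comment `id` plus the four coding dimensions; `index_by_id` is a hypothetical helper, not part of the tool itself, and the two records are abbreviated from the batch above.

```python
import json

# Abbreviated two-record batch in the raw-response format shown above.
raw_response = '''[
  {"id": "ytc_UgxqmiN8JSgZOpjSgux4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx7VaJ0f4fciCd9UkV4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and key each coding record by comment id."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_Ugx7VaJ0f4fciCd9UkV4AaABAg"]["emotion"])  # outrage
```

In practice the model does not always echo IDs back verbatim, so a real lookup would also want to validate that every returned `id` matches a comment in the submitted batch before trusting the coding.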