Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
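The same lookup can be scripted. A minimal sketch, assuming each batch's raw model output is stored as a JSON array of row objects under a raw_responses/ directory (the path, file layout, and function name are illustrative assumptions, not the tool's actual code):

```python
import json
from pathlib import Path

# Assumed layout: one file per coding batch, each holding a JSON array of
# row objects like the Raw LLM Response shown at the bottom of this page.
RAW_DIR = Path("raw_responses")  # hypothetical location

def lookup_raw_row(comment_id: str) -> dict | None:
    """Return the coded row for a comment ID from whichever batch contains it."""
    for batch_file in sorted(RAW_DIR.glob("*.json")):
        rows = json.loads(batch_file.read_text(encoding="utf-8"))
        for row in rows:
            if row.get("id") == comment_id:
                return row
    return None

print(lookup_raw_row("ytc_Ugxfe8sR2whVY9uI4Jd4AaABAg"))
```

Scanning sorted files keeps the lookup deterministic; for large runs, an index keyed by comment ID would be the obvious next step.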
Random samples — click to inspect
- 1. The „AI” is not doing anything for inspiration, because it is not capable of … (ytr_UgzZu3hmL…)
- Artificial intelligence was made the day the computer was invented…. As long as … (ytc_UgzP6PQrg…)
- I really dont like "ai" it makes me. My friend actually tried to defend it ai ar… (ytc_Ugw1RYrIi…)
- 2:25 actually that doesn't work here either. Inspiration is one thing but AI ar… (ytc_UgzAwR3-H…)
- My recent interactions with artificial intelligences has convinced me that they … (ytc_Ugx5UrvNN…)
- Recommended reading: The Economist, February 7, 2026, "Artificial intelligence … (ytc_Ugwse5W5b…)
- We should absolutely never settle for "UBI" in the face of existential threats l… (ytc_UgyYumB7G…)
- The government prints whatever money it needs it doesn't need our money, taxes a… (ytr_Ugx_KPG1N…)
Comment
This boils down to humanity’s fear that they won’t be the most intelligent entities on earth. This is a deep seated fear supported by thousands of years of evolution. Max and Bengio have a burden of proof that such an intelligence presents a sufficient risk to the safety of humanity to offset the opportunity cost associated with stopping AI research, where that research could massively benefit the world. I don’t think they successfully carried that burden of proof in this debate.
youtube · AI Governance · 2023-06-27T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwvFU5BKq0WWt53Omp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyVKsTelwk9yglYzrV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxPb-0iY4pi7iqejet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHCTk_4zxgU8wyWEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzzypf_v-asdoe7_Nh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIX-TJyadxzvqSmI94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxfe8sR2whVY9uI4Jd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwETgxNS3mPOdvrjtV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
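Each raw response is a JSON array with one row per comment in the batch; the Coding Result table above is simply the row whose id matches the selected comment. A parsing sketch, treating the value sets observed in these samples as the allowed code sets (an assumption: the actual codebook may define more values, and ALLOWED/parse_batch are illustrative names, not the tool's own code):

```python
import json
from datetime import datetime, timezone

# Dimension values seen in the rows above, assumed here to be the full code sets.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "resignation", "indifference"},
}

def parse_batch(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of rows) into a map keyed by
    comment ID, rejecting rows with unexpected dimension values."""
    coded = {}
    for row in json.loads(raw_response):
        comment_id = row["id"]
        result = {}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
            result[dim] = value
        # Timestamp recorded at parse time, mirroring the "Coded at" field
        # above (an assumption about how that field is produced).
        result["coded_at"] = datetime.now(timezone.utc).isoformat()
        coded[comment_id] = result
    return coded
```

Validating against a closed value set at parse time catches malformed or off-codebook rows before they reach the per-comment view.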