Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I am always polite to AI...but I never thought about why that might be. 🤔…" (ytc_Ugy3BmlsG…)
- "Not entirely true. If it's an "at will" state, like most are, then there's no …" (rdc_ky16aqq)
- "I just hope the cryogenically frozen (morons) will be ok in the event of an ener…" (ytr_Ugwmrb9Qi…)
- "You are all insaine and dumb that it has no end. Ya all allow this? Ya all suppo…" (ytc_UgxkJV7B2…)
- "the mention about clever algorithms not being AI is a bit wrong. It comes from a…" (ytc_Ugy6rM1S-…)
- "@ballom29 well yeah, people don't want garbage spamming their feed or clogging u…" (ytr_UgzZFE-Xo…)
- "If a humanoid robot is built that can weld at 100% efficiency. Then you would in…" (ytr_Ugw6Vsbb7…)
- "Shoshana is brilliant in bringing forth the Reality of Surveillance Capitalism, …" (ytc_UgwbaDP7t…)
Comment
Kurzgesagt, this is another masterpiece! The way you visualize the jump from narrow AI to the concept of an Intelligence Explosion and ASI [12:28] is both fascinating and deeply unnerving. It really makes you pause and think about the speed of progress and the profound risks involved. The comparison of ASI to a 'God in a box' [14:14] perfectly encapsulates the magnitude of this challenge.
Thank you for consistently delivering such high-quality, thought-provoking content that pushes humanity to confront its future. I love the historical context you provided about human intelligence too. To the community: Do you think AGI will arrive in just a few years, or take many decades? Let me know below! 🤯🔬🚀
Source: youtube · Posted: 2025-11-16T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0jhJJqDnWC-lzoQd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzaBmZXcsbea_lTrXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzdaMkAmQP0UmC2xN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5yPvbayTrEKtw0Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxH9C_5NME0A_Qz8dp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy4QE9eeY8YdLeK2ml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4LCn__Tmd_IMOB4R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1J9sSDfkjfpF51cd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwm3R7aVVzO0UDCg54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyp6ZXk7UwhtpKd2np4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
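Each record in the raw response carries the same per-comment schema: an `id` plus one label per coding dimension (responsibility, reasoning, policy, emotion). A minimal Python sketch for parsing and sanity-checking such a response follows; the `CODEBOOK` value sets are inferred only from the labels visible in this batch, not from an official codebook, so the full scheme may permit additional values.

```python
import json

# Allowed labels per dimension, as observed in this batch (assumption:
# the real codebook may define more values than appear here).
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"unclear", "virtue", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it has an "id" and every dimension holds a
    label from CODEBOOK; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Example with one valid and one malformed record (hypothetical IDs):
raw = (
    '[{"id":"ytc_example1","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_example2","responsibility":"bogus","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"}]'
)
print(parse_coding_response(raw))  # only the first record survives
```

Validating against the closed label sets before storing results catches the most common LLM coding failure, an out-of-vocabulary label, without having to trust the model's output blindly.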