Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgwNh4QHL…: "I will never say that the speaker, who is an AI researcher, is going to destroy …"
- ytc_Ugzh8JsEn…: "In my opinion the way i see AI bros, the artistic part on their side would rathe…"
- ytc_Ugw5d_4oU…: "AI is the agent that will force a painful evaluation of labor and capital and we…"
- ytr_Ugxzifk4B…: "Of course it is exaggerated. Some companies are still struggling with leaving be…"
- ytc_UgxdRYyIN…: "I've spent so much time getting those channels off my algorithm that I can no lo…"
- ytc_UgxABn5VP…: "Thanks for making video! It is very frustrating trying to reason with AI users, …"
- ytc_UgzyNw7fQ…: "The thing I don’t understand about ai is why replace artists with robots? Until …"
- ytc_UgyvxQTET…: "I bet all the human jobs are being taken over by AI. The only AI i believe in i…"
Comment
People don't realize how far away from AGI we are. Current AI projects are hyper specialized with LLMs being the only real breakout star. Predictive AI applications generally suck really bad in all areas they've been tested. AI for generating anything actually mechanical or 3D is non-existent. AI for controlling machines in the real world i.e. general purpose robotics motion planning - literally doesn't exist. I'm sure I'll eat my words on many of these functionalities, but many of them are divergent, not convergent, as in the case of AI for predictive applications. We still have a long way to go. Remember how self driving cars have been promised for 20 years now?
youtube · AI Governance · 2023-12-30T23:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzDyfwk9HibtxrCzIx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzAyproHIrSdizcfER4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHSN-hxaeG5t9RlZV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVXzl_QNFa5V0fmNJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-vK8R4T_XfigaeTp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1fSxXCyw9zwMBLZ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfwbP9YkYtoEX0FV14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhGshfD0JlhVBNaRp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgweuiImBq4PQVwaXXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVcc6qv62fXgPjK814AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
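A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible in this sample (the real codebook may include additional categories), and the `ytc_`/`ytr_` ID prefixes are an assumption based on the IDs shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "approval"},
}

def parse_coding_response(raw):
    """Parse a raw LLM coding response (a JSON array) and validate each record."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        # Comment IDs in this dataset appear to start with ytc_ or ytr_ (assumption).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzDyfwk9HibtxrCzIx4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # fear
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook, so malformed batches can be re-queued rather than silently stored.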