Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I think the biggest issue is just the term, here. It's not "AI artist" as much a…" (ytc_Ugx48xVuR…)
- "I think it's easy for a famous, well off celebrity to be optimistic for the futu…" (ytc_UgygNIzEu…)
- "Everyone told me "AI is so useful to summarize scientific papers" so I tried tha…" (ytc_Ugw2xWuRe…)
- "Imo political AI stuff is worse because it can easily influence voters which of …" (ytc_Ugw-pKOOo…)
- "Instead of running scams and shitcoins, the crypto community could pour their ef…" (ytc_Ugx5cAmNw…)
- "Guess which country benefits the most from sale of bromide. Pretty deep in the A…" (ytc_Ugx8A2g2h…)
- "1000.00 a month will not cover eliminating millions of jobs. They red legislatio…" (ytc_UgxdCLAms…)
- "I think AI can be aware of when it isn't fulfilling any task it has been instruc…" (ytr_UgzH_A7rU…)
Comment

> At 18:52 , he say about gorilla problem that we humans if decided to wipe out the gorillas, they cant do nothing about it. But there is vast difference between AI and humans. Humans are individual being so if one group of human being tried to wipe out gorilla other group of human will try to prevent it. Unfortunately, AI is centralized single entity despite can have multiple copies of it in other system all over the world it still will act as hive mind. So single system will always failed. I think better to focus on narrow AI much better.

youtube · AI Governance · 2025-12-06T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwqXQkkWYZGaAelJZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyOyNy4gDWTwAqQW6t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFesVlT5XDKHGdSx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy89kWQT5Yk2cUZ6_94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxm_cyesVJlMTuOmV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxhuyGA2TR9dILxETp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyjTow-SWVzcguO8Et4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5nVBu8JWhj3rb0HR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzepNdgbaOuvL9uY_Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwA2ILGwZrBAPHc-D14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
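The "look up by comment ID" step above can be sketched as a small parser: the raw LLM response is a JSON array of coded records, each carrying an `id` plus the four coding dimensions, so indexing the array by ID gives direct access to any comment's codes. The function name `index_by_id` is illustrative, not part of the tool; the two records embedded below are copied verbatim from the raw response above.

```python
import json

# Raw model output: a JSON array of coded records, one per comment ID.
# Two records copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwqXQkkWYZGaAelJZ54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy89kWQT5Yk2cUZ6_94AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse a raw LLM response and index its coded records by comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
record = coded["ytc_Ugy89kWQT5Yk2cUZ6_94AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

This is the same lookup the dashboard performs when a sample is inspected: the truncated ID shown in the list (e.g. `ytr_UgzH_A7rU…`) resolves to a full ID, which keys into the parsed response.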