Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- `ytc_UgxVVCD_t…`: AI should help people not replace them. Why there might be some advantages it t…
- `ytc_Ugx1kf1Jh…`: Ngl not everyone is skilled enough to learn or has enough money to buy art, crea…
- `ytr_UgzwO-bwk…`: That doesn't bring in money, AI isn't being done to progress society but to cut …
- `ytc_UgwBpdj0G…`: I need the background of this lady too, undergrad in MECHANICAL engineering work…
- `ytc_UgzP8SJp4…`: How is this related to algorithms? No software capable of homicide should be lef…
- `ytc_Ugzo6WG_A…`: 5:27 "AI is lagging in terms of physical dexterity" anyone here after the Chines…
- `ytc_UgwhDQiNP…`: AI is going to be used to ruin many men's lives, especially with false assault c…
- `ytc_UgwSQxu_b…`: Easy solution. Invest into those data centers and the AI companies, then sell yo…
Comment
Not sure whether I like videos like this is the right question. I’m following AI and robotics and defence across a number of sources. I don’t like it. Not at all. But yes Steven please continue bringing us these people who are sounding the alarm. On the interview, then. I get the point that we can’t pull the plug. But the consciousness point, surely, is that an AGI would have to have its own ‘will’ or at least some form of incentive, before it would cause existential harm? I remember another guest talking about “alignment” and how researchers are persuading the models to ‘like’ us. I’d like to hear people explaining that some more.
youtube
AI Governance
2025-12-04T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzMpJTR6IN0nzAZTSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8yeuVZziq0udddF14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxV-zuLDYOlr0cq4M14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz1edtxsYJVDqyPYfd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzzfAY8FVmreCImGbJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwKwAhmmFU6hzIVILZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzo3TJuB8QQ2pduegp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwtirOv9ynks6-3qAB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwx_Y9VhcXPVAT7-yJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgyFmxl94q1SzyzN7MB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
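The raw response above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and validated before it feeds a coding table; the allowed value sets below are inferred from the samples shown on this page, not an exhaustive codebook, and `parse_codes` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the samples above
# (an assumption, not the tool's real schema; adjust as needed).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "sadness", "mixed"},
}


def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    that lack an id, miss a dimension, or use an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # no comment ID to key the lookup on
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded


raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(parse_codes(raw)["ytc_example"]["emotion"])  # fear
```

Keying the result by comment ID matches the "Look up by comment ID" flow above: once parsed, a single dictionary access retrieves the codes for any inspected comment.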