Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Video games has its own problems. There is a life cycle to games and then come t…" (ytr_UgxFnvqKr…)
- "An automated truck can’t fix a blowout on the side of a freeway, there is still …" (ytc_Ugw2R83ft…)
- "Ah, well. AI hasn't replaced illustrators either. I still get jobs, because I am…" (ytc_UgysLejW7…)
- "only reason i like ai image generators is that i can come up with anything i thi…" (ytc_UgzRqRgQV…)
- "We all hate AI but none of you people seem to realize that this channel is AI sl…" (ytc_UgxL8UrjU…)
- "@cedricdelsol9320 yep, those videos are great. Unfortunatelly, there are so few …" (ytr_UgwfnOOG0…)
- "AI is such an impressive technology I am wondering Bruno knowing you as a "heavy…" (ytc_UgwEhu5cr…)
- "Essentially it sounds like what is happening under these copyright laws is that …" (ytc_UgyCjdFqP…)
Comment
Towards the end of the video. If you put 10 people in a room, they couldn't agree on what a good or ideal future for humanity would be, hence an AGI would also fail at the task. Different cultures will have different views on the future of humans, add in individual peoples wants/desire/needs and you'd never get anything close to a consensus.
I suspect humans displaced by robots and AGI and therefore without purpose would produce a world filled with war, and civil unrest. For a frame of reference, look at what happens to the youth when too many of them are unemployed, they riot.
I think it's naive to think that China only has altruistic motives when it comes to AI, they may say thing, I strongly suspect they doing the opposite.
youtube · AI Governance · 2025-12-05T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugw2gwt2wtIDpWAdYmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzlNicjP9oLwrQcufV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwgokjebpqTJ3LbgZB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyi4u9DZVz67FKtxM54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxlQh0gxU9UQWqGFGR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxMhR0PaTWCXwUmJUN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxL8c_NHmXKZnDfRSp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxzIsjX4J3eI1BtgJh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz1XAfFy6YgrjXknOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxofVkH1qe_IlAqEiN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}]
```
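The raw response above is a JSON array with one record per coded comment, carrying the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID — this is an illustration under assumed names (`index_by_id`, the shortened two-record sample), not the tool's actual implementation:

```python
import json

# Two records copied from the raw response above; the real payload
# contains one object per coded comment.
raw_response = """[
  {"id": "ytc_Ugw2gwt2wtIDpWAdYmh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxL8c_NHmXKZnDfRSp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the JSON array and index each coded record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
rec = codes["ytc_UgxL8c_NHmXKZnDfRSp4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # -> ai_itself resignation
```

Indexing by ID mirrors the "Look up by comment ID" workflow: once the array is parsed, any coded comment's dimensions can be retrieved in constant time.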