Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “I’ve so many of these anti-ai art videos and they never (ever) seem to grasp wha…” (ytc_Ugxpk4k2D…)
- “If people think humans are capable of achieving such a high level of ai you have…” (ytc_UgxVrR65m…)
- “American army / navy ect. Is far past this AI lol other armys are starting to ge…” (ytc_UgyZVqFlf…)
- “wow, just the intro clip is insane. "if we give these people too much power". We…” (ytc_UgynPymqA…)
- “The irony of an Ai slop channel giving us news on the Failing Ai industry…” (ytc_UgzWwofBe…)
- “AI is concerning me... an AI company said this is doubling in power every 3.3 mo…” (ytc_UgxwUpXWL…)
- “A.I has no feelings or moral compass and it will ALWAYS become evil. Its also pa…” (ytc_UgysfBdvk…)
- “I think the key to a utopian society (in which AI safely exists), is to teach ev…” (ytc_Ugzq8l3DB…)
Comment
It will lead to a lot of violence and complete destruction of society. Of course those who own the machines (or better: those who control the machines, ownership in the legal sense will be irrelevant) will not equally share the benefits with the rest of humanity. That's not how humans, especially this type of humans work. It will lead to a level of inequality never seen before. It's not only about control of workforce, but also complete control of information. But in the end, physics (i.e. violence) always rules, even against AI (the machine owners can only hope to develop their robots to protect them and their machines fast enough). In the end with all the bases of society destroyed humanity will enter a new dark age
youtube · AI Governance · 2025-12-25T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxhPPKZY2KTp88jVPN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy8sfFtHfTKqK-gO4h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzs1TP6JhqHwmMW4gN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNobJa9Q2YPODre954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxKSwSo7KesN6UouIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzYzV7CHoJr0Vq34YR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy10c9rFGpEIWyZn994AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJZoZKzRL647WxmZB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyQkMsvgOPuFd9HUR14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyxwntR-cQyxP8esnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
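The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of the "look up by comment ID" workflow, assuming only that shape (the two sample IDs and the `raw_response` variable here are taken from the response above, not from any real library or tool API):

```python
import json

# Raw LLM response in the shape shown above: a JSON array where each
# object carries an "id" plus the four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgxhPPKZY2KTp88jVPN4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzYzV7CHoJr0Vq34YR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

codings = json.loads(raw_response)

# Index the array by comment ID so a single coding can be inspected directly.
by_id = {row["id"]: row for row in codings}

record = by_id["ytc_UgzYzV7CHoJr0Vq34YR4AaABAg"]
print(record["policy"], record["emotion"])  # regulate fear
```

Building the `by_id` dict once turns every subsequent ID lookup into a constant-time operation, which matches how the "Look up by comment ID" box behaves on this page.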