Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- The problem is to trust an artificial intelligence made by an imperfect human be… (ytc_Ugznw35WD…)
- by the way, you do realize that dr hinton is like 75 years old, and is only reti… (ytc_UgypuuWi7…)
- This is only FakeNews! AI Intelligence exists since 1952. In 1972 the AI was a l… (ytc_Ugx6hjVY0…)
- This is fake. There's another guy on the other side with voice filters sounding … (ytc_Ugxok8qc-…)
- Personally, I find Ai art fun to fiddle with. It's neat to throw prompt after pr… (ytc_UgwDHvSTD…)
- This was fantastic. I'm a bona fide, card-carrying technotroglodyte. Tech stuff… (ytc_Ugy7xdaP7…)
- The "Generative" AIs should be named "Searching AI" because they "search" for co… (ytc_UgwUD_4-r…)
- 2:09 - I will admit that I don't exactly trust Stable Diffusion models for uniqu… (ytc_UgyTZuwI0…)
Comment
Most people are very chill or oblivious about this but the dangers are immense, and I'm not talking about Terminator scenarios. For all of human history we had this symbiotic hate relationship between the employers and employees or capital and labor. Basically they could not stand each other but they could not make it without each other. Now, with AI superintelligence, the employee is no longer needed. Does anyone here really think that the rich elites will just keep us around for... what? Additional expense, so that they can give us utopia where we are free to multiply and not do a damn thing? I'm thinking that it is more likely that they simply eradicate us so that they can have the planet for themselves...
youtube · AI Jobs · 2025-11-18T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxGqBn9fB-EdOapcdd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1nlEFxXRscGwoHHR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxJbd7eLFiDwd7sITh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJZTlhXxynRvzILMR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzvGXEYcW_XSPVKmqx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzW8ui9XRCNoN-KsaR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy_-kWSwTrGRLUdJM94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwtiGMPOsRTQWgCEEF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzAOV2028zjmoaMrmt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyywMMzUFwk0IOfZJt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
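The raw response above is a JSON array of per-comment codings, one record per comment ID, with one value for each of the four dimensions shown in the coding-result table. A minimal sketch of parsing and sanity-checking such a response is below; the allowed value sets are inferred only from the responses visible on this page (the actual codebook may define more categories), and the two-record payload is a hypothetical subset used for illustration.

```python
import json

# Hypothetical two-record subset of a raw model response, for illustration.
RAW_RESPONSE = """
[
 {"id":"ytc_UgxGqBn9fB-EdOapcdd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugx1nlEFxXRscGwoHHR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

# Allowed values per dimension, inferred from the responses shown on this
# page; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw model response and index codings by comment ID,
    dropping any record with a missing or out-of-schema value."""
    by_id = {}
    for record in json.loads(raw):
        if all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            by_id[record["id"]] = record
    return by_id

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_Ugx1nlEFxXRscGwoHHR4AaABAg"]["emotion"])  # fear
```

Indexing by ID like this is what makes the "Look up by comment ID" feature above cheap: once parsed, retrieving any coded comment is a single dictionary lookup.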