Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I asked a Christian friend of mine about Artificial Intelligence and the impendi…" — ytc_UgyqqRdzJ…
- "This is fakeeeee he is not fighting a robot he got knocked out like that by anot…" — ytc_UgwNOMN24…
- "There's a limit to what humans can feasibly do, whether it be physically or ment…" — ytc_UgwiRUAL8…
- "ChatGPT should be arrested for blasphemy and a fatwa should be issued by the cle…" — ytc_UgyxqoYrr…
- "I know this is an old post, but this simply isn't true. Everything sentient is…" — rdc_mfgm5tm
- "lol, damn makes it seem like even chatgpt understood how corrupt the showrunners…" — ytc_Ugx4tJ7bt…
- "🦍Is this not just a shameless ripoff of the same video style @MonkeyExplains1999…" — ytc_UgxmcTn2e…
- "**sigh** if it's not one lunatic creating something that's going to wipe us out …" — ytc_UgwuIBtmX…
Comment

> I'm sorry but the idea that we are going to see 99% unemployment in a handful of years is absurd.
>
> Not even looking at all the challenges that exist in getting enterprise AI to actually be implemented, we don't even have the electronic and computing infrastructure to make what he's describing a reality. Right now, lead time for large/distribution level transformers is 2-5 years. Every year, 2/3 transformers that are being bought are to replace old ones. Now, in order to get AI to the level Yampolskiy is talking about, we would need a manufacturing, regulation, construction, and funding miracle to fall out of the sky. Because we haven't even gotten to the computing power required to actually facilitate what he's describing. There's so many assumptions here that I can't take seriously. But then again if I had a book to sell about AI Safety, I suppose making things scary is good for business.

Source: youtube · Topic: AI Governance · Posted: 2025-09-05T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwr2PqsGAasmKn4Ks14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy2ovrmifx7yflTfOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwwbK4paDGO0uWhjCF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxH3UvajTXExM6_Cnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuVHNgpdajQg3rT6h4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwHV0AeYcRmzVh_psl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxhFRMxskJmd5dzMoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxUl9gw8U2DcooC4Pt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgykRIGx_RMBB-uwiP94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBDBlYKKdLCdehYJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
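The raw response is a JSON array with one object per comment, keyed by `id`. A minimal sketch of how such a batch could be parsed and indexed to support the by-ID lookup shown above (the `codings` variable name is illustrative, and only two rows from the response are reproduced here):

```python
import json

# Two rows excerpted from the raw LLM response above.
raw = """[
  {"id":"ytc_Ugwr2PqsGAasmKn4Ks14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuVHNgpdajQg3rT6h4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"}
]"""

# Build an id -> coding dict so any comment's dimensions can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

print(codings["ytc_UgwuVHNgpdajQg3rT6h4AaABAg"]["emotion"])  # prints "fear"
```

Indexing by `id` this way also makes it easy to detect when the model dropped or duplicated a comment: compare the set of returned IDs against the batch that was sent.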