Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a specific comment by its ID, or click one of the random samples below to inspect it.
- "The worst thing to think about is that if we're succesfull in creating AI with e…" (`ytc_UgzPrVsnN…`)
- "So all 7,000 people getting laid off - who for the most part are the average wor…" (`rdc_czla3uv`)
- "You can actually do it with AI. Make your own AI work in offline mode. No filter…" (`ytc_UgzJlkn9n…`)
- "Heheh can we call AI artist ai prompters/ai generators as they didn’t make it th…" (`ytc_Ugy1z2CQK…`)
- "look where we are now, AI models becoming more and more optimized to run locally…" (`ytc_UgzajySVa…`)
- "Maybe AI will help the world getting closer to communism. Because we don't need …" (`ytc_UgxZsahZq…`)
- "Bro that is not the goat he just thought he was good but he is trash he thinks a…" (`ytc_UgxwKYe89…`)
- "The WEF with Ai will be the most dangerous thing humanly has ever been through. …" (`ytc_UgzcKrYgo…`)
Comment (youtube, AI Moral Status, 2025-10-31T19:0…)

> I think the leap from "AI will cause a lot of harm" to "AI will kill us all" is pretty foolish. It's been the fallacy of humanity to think an apocalypse is on the horizon for millennia. We've thought a collapse of society was inevitable from sin, books, the automatic loom, steam engines, electricity, drinking, nuclear weapons, the internet, pandemics, etc etc. All these have caused harm yes, but have been survived no problem. "It's never aliens" -> "it's never the end of humanity"
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwKmmPGxX0kbQuFEa54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMBNzPv0YV5wk26Ll4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQBf2-ySXEmEPDvGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyU52dzUJ0cP6uLeut4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzUj23QPCSQQm3bkJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwznKZMqydHEd20M0x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6fUSAOw28Pw25Lrx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfqT-dDAHuv22h8fl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwn4J8GVJfdW0tbAgN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlPWOP0shh9ZTXadZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
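A batch response like the one above can be turned into an ID-keyed lookup so that any comment's codes can be fetched directly. This is a minimal sketch, assuming the model returns a well-formed JSON array with the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_codes` helper name is ours, not part of any pipeline described here.

```python
import json

# Two rows copied from the raw batch response above, for illustration.
raw = '''[
{"id":"ytc_UgwKmmPGxX0kbQuFEa54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlPWOP0shh9ZTXadZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

def index_codes(raw_json: str) -> dict:
    """Parse a raw model response and key the coding rows by comment ID."""
    rows = json.loads(raw_json)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_codes(raw)
print(codes["ytc_UgzlPWOP0shh9ZTXadZ4AaABAg"]["emotion"])  # -> fear
```

In practice a model may return malformed JSON or drop a requested ID, so a production version would wrap `json.loads` in error handling and check that every submitted comment ID appears in the result.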