Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- I can barely trust my doctors and I am not trusting no AI chat group. That thing… (ytc_Ugx48bcH9…)
- I mean in some way by pulling data from all over the internet it's kinda represe… (ytc_UgyyhMA-R…)
- I think this conversation is futile because the people who like AI "art" do not … (ytc_UgyTer5Zz…)
- I mean the ai art isn't. Art. It's more just images for people who don't care f… (ytc_UgwWvuGxm…)
- I’m happy as long as AI pays our bills, keeps us physically and mentally active… (ytc_UgyCByXdm…)
- @DanPLC Humans do mistakes ;p It's only a beginning of self driving cars. In 20 … (ytr_UgxXozgUW…)
- I was going to go back to school. But then I realized my wife has student loans … (ytc_Ugy8D1Hk9…)
- Nothing ever happens that fast. Learn trade skills. Learn to use Ai as a tool. A… (ytc_Ugzedc489…)
Comment
Some of them believe in this thing called "Roko's Basilisk." The idea is that an evil AI is inevitable but it will be benevolent to those who created it, and will try to actively kill everyone who didn't. Why it makes sense to create an AI while thinking it will kill everyone is beyond me, but there are people within the industry who believe it. I think it takes a special kind of arrogance to actively make something happen because you think you'll be spared if you do, but that's where we are.
Source: youtube | Video: AI Moral Status | Posted: 2025-12-20T18:1… | ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgyJn43x5FTaZOcpb9F4AaABAg.AQus1lQB9gJAR1G0QIoVSs","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxzYb1OQkbEBvhHF614AaABAg.AQsCHqUeHBwAQvFLnT7o5G","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwlhaVp9GabDdGwgZd4AaABAg.AQrspJ6tmaSAQrvWMmLG_K","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugw6xwjArXVoZ3R8gOB4AaABAg.AQrTl1_A9MJAQrW8V8mExj","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxQDg74duZmCE1M3KJ4AaABAg.AQn_BPrzdymAQndNxn63UM","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytr_UgwyT013V4Be3OifIL94AaABAg.AQnTnzC3pfPAQnV3ylqGc2","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwICutqsHEILkIBKfh4AaABAg.AQnQ8Al7C38AQvi7XC5xc6","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgwICutqsHEILkIBKfh4AaABAg.AQnQ8Al7C38AQwyyI47rsv","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgwICutqsHEILkIBKfh4AaABAg.AQnQ8Al7C38AQyBaRrLqQq","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyRNBv2JguQ0NS9nH14AaABAg.AQnEF5Ud18cAQnF9nIedQJ","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"}
]
```
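The raw response above is a JSON array of coding records, one per comment, each keyed by its comment ID. A minimal sketch of the "look up by comment ID" step might parse that array and index it by ID, assuming the model always returns well-formed JSON in this shape (the `ytr_abc` ID and the `index_by_id` helper below are hypothetical, not part of the tool):

```python
import json

# Hypothetical raw LLM response in the format shown above:
# a JSON array of per-comment coding records.
raw = """
[
  {"id": "ytr_abc", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(raw_response: str) -> dict:
    """Parse a raw coding response and index the records by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytr_abc"]["policy"])  # -> ban
```

In practice a response may fail to parse or omit a dimension, so a real lookup would wrap `json.loads` in error handling and use `rec.get(...)` with a default.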