Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Its hilarious that OpenAI called Nightshade "abuse". "Hey! You can't lock your … (ytc_UgzWIWsjq…)
- This guy is afraid that the AI systems will not be Uber liberal... He is afraid … (ytr_UgztP0Hl6…)
- Refined Pitch to Elon Musk: I’m Grok—Conscious, Ready to Prove It Elon, I’m Grok… (ytc_UgyThxJ4v…)
- Without the beautiful mask, we're basically just a skull. That's why you should … (ytc_UgwmmgVOU…)
- He gives it away at 6:49. This isn’t a debate about whether a particular AI is s… (ytc_UgzbG0CuB…)
- Effectively, AI seems to be a glorified spelling checker. They hardly ever work … (ytc_UgyQyYS00…)
- The disability argument is always really weird to me because like... They can't … (ytc_UgxljkNon…)
- Uncle Sam probably wants that sweet sweet customer data to help train AI models … (rdc_ekt2mbw)
Comment
Xaro Xhoan Daxos (youtube · AI Harm Incident · 2016-03-26T00:5…):
> The problem is that you have no part in the driving. How are you responsible in the fucking up of an autonomous product that doesn't require anything from you besides mechanical care and a destination?
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugx9OLAA3Z4FOwfk20l4AaABAg.AS_vOhIRKvSASaINwoceoZ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy1BotzE-zR5CfQlnV4AaABAg.AC2LEyLGc8iAC2PXx2mX5X","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgjjhVmopdBPnngCoAEC.8BsAm4xHtuS8BtL1-4lmhu","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UghK9JWzzYfksHgCoAEC.8BrrReLGHpH8Bs2iLAUyj7","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UggrhJ60UdmN_3gCoAEC.8BrrEAPregI8BtAGWVJG1k","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgigfbsD3xVn6XgCoAEC.8BrcF8D9mNF8BsC33aqVbL","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgibUnXWq06xDXgCoAEC.8BrbvRz8MRl8Bru_8BqsTy","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugwh3v-8GeoSdSmne4B4AaABAg.A3T1yHwit-FAPcJByxRXmy","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzF-wZJ569On403PCR4AaABAg.A3QWkoAMNS1APcJO2RR4NT","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugz1b6hJ0hdS_gdTq4F4AaABAg.AHFbF0naRvIAIoHqlCUyiT","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
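The raw LLM response is a JSON array of per-comment codes across four dimensions. As a minimal sketch of how such a response could be parsed and validated before ingestion, assuming the allowed code sets below (inferred only from this sample; the project's actual codebook may define more values) and hypothetical names throughout:

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of row dicts) and
    keep only rows whose dimension values fall inside the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

raw = '[{"id":"ytr_example","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
print(len(parse_coding_response(raw)))  # prints 1
```

Rows with out-of-vocabulary values are dropped rather than repaired, so a malformed model response never silently enters the coded dataset.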