Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI isn't a tool. If anything, it's a service. You're requesting something to do …" — `ytc_UgwOHVfr6…`
- "Yet another manufactured crisis... "Hey that guys AI is risky! but my AI is not …" — `ytc_Ugyan1fN2…`
- "TO "I ChatGPT'd this" "I made this with ChatGPT" "I asked Grok" I WANT TO LIVE..…" — `ytc_Ugz_Xqol8…`
- "This guy's horrific. I love AI, but not when used like this. Manipulation, false…" — `ytc_UgzNJa6jB…`
- "I just wrote several paragraphs about my life, right now and how nothing is maki…" — `ytr_UgxK-3P6D…`
- "I dont understand, what a point in ART if it made by AI 🤷♂️ and people like fut…" — `ytc_UgzO2kSSU…`
- "Ai isn't human or sentient, so idk why we even consider it art. Illumination was…" — `ytc_UgxZ86uWF…`
- "@_B_E I looked it up. The copyright doesn't cover the AI generated elements of t…" — `ytr_UgzPMpf7e…`
Comment

> This is dumb. Let's assume they do use facial recognition. Ok. Fine. I don't know of a law that says that this isn't ok. So. Someone comes in with an ID with your name on it and steals stuff. Now, the evidence points to you (with or without facial recognition). The cops agree and charge you with a crime. However, innocent people are charged with crimes all the time. That's why you are innocent until proven guilty. Now you're proven innocent. Great. This is how the system works. If you believe insufficient evidence was gathered for a charge, sue the state. Apple didn't charge you with a crime, the state did. This is just some trying to milk money from a company.

| Field | Value |
|---|---|
| Platform | youtube |
| Incident | AI Harm Incident |
| Timestamp | 2019-05-07T11:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwlIWNWPzIaBuzLZwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzLhug9XWjlFMuTs0d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyykhAhtI349N5W98x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyn6nI-bXAD09R-qhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyBM-U4OeWIhxDzCvl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxX_MRvNKDR8MlmfRd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugz04Rj7cBZezraDOkV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzH7ZqqvU4e8hZHNXR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTEkNXUhvf5LiAOQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUlMyMwIESfpoEC2p4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
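The raw LLM response is a JSON array of per-comment codes along the four dimensions shown in the coding table. A minimal sketch of how such a response might be parsed and validated before loading — note the allowed values below are only those observed in the samples on this page, not necessarily the project's full codebook, and `validate_coding` is an illustrative helper, not part of the tool:

```python
import json

# Category values observed in the sample response above. The real codebook
# may define additional values; treat this mapping as an assumption.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "fear", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    kept = []
    for row in rows:
        # IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with a recognized value.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            kept.append(row)
    return kept

raw = '''[
  {"id": "ytc_abc", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_def", "responsibility": "vendor", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]'''
print(len(validate_coding(raw)))  # → 1 (second row dropped: "vendor" is unrecognized)
```

Rejecting rather than coercing unknown values keeps the coded table clean when the model drifts from the prompt's category list; dropped rows can be re-queued for recoding.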