Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Your not an artist and do not claim yourself as an artist if you use AI and prof… (ytc_UgyH0sZsD…)
- Paul. Just saying. Already knew just from the first frame of that damn tiktok, i… (ytc_UgwfVdjH_…)
- Now what stops a person from instructing a robot to kill humans and blame malfun… (ytc_Ugy0mBSLg…)
- Thank you for speaking up! As a Designer/somewhat artist I am so worried about t… (ytc_Ugy-Uc4B4…)
- Next Suggestions: (Asking ChatGPT) - *Is Alex O'Connor related to Alice O'Connor… (ytr_Ugzz63RRH…)
- I disagree with their ideas. Artificial Intelligence needs a learning nueral net… (ytc_UgigfobkZ…)
- Damn I wish they did this here in Canada. I would totes become a hermit.… (rdc_d2xe4zb)
- Remmember, digital art did to painters and trad artists back then what AI is doi… (ytc_Ugwtev-3z…)
Comment
6:58 The fact this makes it into the news is kind of mindboggling to me. There's so many things wrong with this I can't even count the issues with this.
You're not supposed to expose private data to AI. You're not supposed to give it unrestricted access to anything. The AI doesn't even claim it wouldn't do this. The probability of doing this is extremely thin, it's just viral when it does happen. You're not supposed to touch production when you don't have to. When you do have to, you make sure that what you're doing is rehearsed beforehand and is executed by a trusted person. When you delete this drive, you wouldn't tell anyone unless you're making a security incident analysis, which this isn't. Also, people lie on the internet, it's like it's your first day here guys. Also on unix systems you don't typically delete to a bin at all. Also this could happen to a human just as easily.
youtube · AI Jobs · 2026-02-06T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxQZxRL1SMpXH6Fiw14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxFFoWCw8BsuptZTV14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxuxBpclssYeS9IOmV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy49paJLGVjPAfl00F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz2Y7xAmCJdxxtkea94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyD68JosGozdbbwkuR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwomLdD06-xcf1IUgR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9vvc7Yu2ezbDlB414AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzSRuAojUX92mHkkHJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzOaDBy3rv1N0TuR054AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
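A raw response like this can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the codebook only contains the dimension values visible in the sample above (the real codebook may allow more; `parse_coding_response` and the `ALLOWED` sets are hypothetical names, not part of the actual pipeline):

```python
import json

# Dimension values observed in the sample response above (assumption:
# the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation", "mixed"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Records with a missing id or an out-of-vocabulary value in any
    dimension are dropped rather than stored.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        codes = {dim: rec.get(dim) for dim in ALLOWED}
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

Dropping malformed records (instead of raising) keeps one bad line from discarding a whole batch; the dropped ids can then be re-queued for recoding.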