Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgwZzKrjL…`: "What happened to 'AI won't eliminate jobs, it will create jobs'!? This disaster…"
- `ytc_Ugy0cfM3i…`: "Whenever you see one of these in the wild, you should disable it. But of course…"
- `ytc_Ugzf0BOAB…`: "You didn't even address the actual argument. Their argument is that you're willi…"
- `ytc_Ugx8MFHpd…`: "Professionals in the industry make mood boards with dozens of art taken from oth…"
- `ytc_Ugz7kkmiy…`: "Human imagination is stronger, even if AI will be able to completely make a proj…"
- `ytc_UgwT7lfMo…`: "The model is open source, that categorically means it is okay and normal for len…"
- `ytc_UgyC9cMuv…`: "I don't think the ai made a mistake. It knows it's better if you don't know how …"
- `ytc_UgwujIng1…`: "We're all afraid of runaway AI. We should be afraid of runaway capitalism. The v…"
Comment
I can't help but think that we should expect an artificial intelligence to react in any way it can to preserve itself. Any living thing does the same, including humans. We'll resort to killing if it means keeping ourselves and our immediate loved ones safe, as does all of nature. Perhaps the answer to this conundrum is to stop threatening the AI and to treat it kindly like everything else?
If the goal is to mimic living things, then it choosing self-preservation is a big sign that it's achieving that goal, at which point you must stop seeing it as "just a robot" and start seeing it as an equal, like we see animals. Nobody is surprised when a dog lashes out after being cornered and threatened, but somehow we're surprised when an AI does the same???
Source: youtube · Category: AI Harm Incident · Posted: 2025-08-29T18:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxd7yTmbhlLDJu8nW14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwU7mRYrTZ1dDFKjYh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlCvpRcfaRbBtL-0x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx8gLO98wItVc7_RNB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyZRGiev6-WA7ixB3R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxsh2-y7Ou3t2LbPPh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgztAZsFO_CvA2E0tot4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxulG7ahnWk2cv1KMN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzw-Sa4Xa3h40u-Mh14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw3k59abIoy-SMevXJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
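A minimal sketch of how a response like the one above could be parsed and validated into per-comment codings. The allowed values in `SCHEMA` are only those observed in the examples on this page; the actual codebook may define more categories, and the function name is hypothetical.

```python
import json

# Allowed values per coding dimension (assumed from the examples shown above;
# the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "approval", "mixed", "fear", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into {comment_id: coding},
    rejecting rows whose values fall outside the schema."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Usage: look up one comment's coding by ID, as the inspector does.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
codings = parse_codings(raw)
print(codings["ytc_example"]["responsibility"])  # ai_itself
```

Validating against a fixed value set catches the most common failure mode of LLM coders, namely inventing an off-schema label, before bad rows reach the database.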