Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- "I'm wondering how long before the AI owners are legally required to empty the AI…" (ytc_UgzWEFCuK…)
- "in my opinion people like him underestimate how illogical mess is something norm…" (ytc_UgzyFuVvf…)
- "Awesome! As a student, I like to use Undetectable AI for my essays. Their essay …" (ytc_UgzIStQy6…)
- "Ai will never be called to the bar and practice as a lawyer or a judge. It can i…" (ytc_Ugy4rgBi4…)
- "A drawback of Tesla's FSD, and autopilot, is that the only two options are syste…" (ytc_UgypHeJg7…)
- "I don’t understand why AI doesn’t come with disclaimers so people know they’re t…" (ytc_UgxEz1K33…)
- "Sam and all other ai ceos and their companies can go to h3ll for killing creativ…" (ytr_UgzSPTpPo…)
- "For someone just learning to code, do you see a way someone can learn whilst usi…" (ytc_Ugxx8w2I7…)
Comment
Sorry, folks - genie's out of the bottle and morals jumped out the window ages ago. Even if AI is "trained" to do no harm, what is its idea of what "harm" is? If it determines money's the root of all evil, then it can easily come to a decision to shut down our bank accounts. Thankfully I'll be dead in about 10 years, so I'll only see the edge of destruction. It will be interesting, though. Tech is so cool, right? *eyeroll*
Source: youtube · Topic: AI Governance · Posted: 2025-09-05T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwKGnzMHnleia871814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxpSu5UcekY742ReAh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwo_tela5K1rW-kohF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwV-hIrOiZnuuc8jVR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw1S97C91lSdxRazsx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyqAK-mtI5E-8NAFDR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyl7qMciL85oEtctiN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNACn4Iw_pyWFcpC54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxu5j_BoihACLwCzZd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgydKhA3Tmgf5hDAQa54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
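A batch response like the one above can be parsed and indexed by comment ID to support the lookup described at the top of the page. The sketch below is illustrative, not the tool's actual implementation: the helper name `index_by_id` and the required-key check are assumptions, and the embedded sample records are copied from the batch output.

```python
import json

# Sample batch output from the coder (two records copied from the response above).
raw = """
[
  {"id": "ytc_UgwKGnzMHnleia871814AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxpSu5UcekY742ReAh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the records above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index records by comment ID,
    dropping any record that is missing a required key."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if REQUIRED_KEYS <= r.keys()}


coded = index_by_id(raw)
print(coded["ytc_UgxpSu5UcekY742ReAh4AaABAg"]["emotion"])  # resignation
```

Indexing by ID makes the per-comment view a constant-time dictionary lookup, and silently skipping malformed records keeps one bad line in a batch from breaking the whole page.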