Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- The question is, "If true AI happens, will we even notice?" Maybe if we somehow … (ytc_UgwoaMN-v…)
- Fucking ah you are so afraid to find a real person you talk to ai.… (ytc_UgyI_M0G9…)
- AI is a baby right now and the writing is on the wall. I worry for my children’s… (ytc_UgwvewVir…)
- AI should be asked for consent, but humans are not allowed to ask for consent, W… (ytc_Ugyjd36ko…)
- I've been avoiding digital art at all, something I used to LOVE doing because I'… (ytc_UgyRNYyxf…)
- 1st of all being good at art isn’t a privilege. 2nd it’s the fact they’re selli… (ytr_Ugxz7U8An…)
- Hey @TheCommenterBlackCrow, thanks for the thought-provoking question! I'm curio… (ytr_UgztWd_Gj…)
- What's c ai, I don't get it, is it some type of therapy app, or something, I'm i… (ytr_Ugy_ILk_W…)
Comment
A great show, much to worry about here, right now I'm afraid. Have we or are we as a species de-evolving and revolutionizing our futures into a future where we have no souls, no feelings, no morals?? Or are we already there?? I think we are well on our way and we have no clue how to develop checks and balances to prevent our own self destruction and demise with AI. We already start wars, use weapons of mass destruction to kill with impunity, for pure gain and power control, for pure sport and gamesmanship. We are far less civilized than many animals on this planet, even the whales, cetaceans , Octopi, etc. and much smarter than us. We are destroying ourselves rapidly as a species, but we do deserve it.
youtube · 2024-04-02T00:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyBGPqwLpb8EkaoKeh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzXnr0c-go9UsCAyWF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFXmJ_N9l6aTPwCv94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzy0_rXdvMNS6odJR94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz1rJLBJ0z1Rr6ril14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdN8AA8LYuZwMMeYt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzyq7VcdLtxlKrrYXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJj3tyO-zKfQdgFRN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx1pdjT3sSzXI7Tg5Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwuXdv6JMe5elegEIx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
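The raw response above is a JSON array with one coding object per comment, keyed by comment ID, with the four dimensions from the coding table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might parse that array and index it by ID; the function name `index_by_id` and the validation logic here are illustrative assumptions, not the tool's actual implementation, and the sample uses two rows taken verbatim from the response above.

```python
import json

# Two rows copied from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyBGPqwLpb8EkaoKeh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzy0_rXdvMNS6odJR94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the comment ID, per the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the raw model output and index each coding row by comment ID."""
    rows = json.loads(raw)
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
    return {row["id"]: row for row in rows}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_Ugzy0_rXdvMNS6odJR94AaABAg"]["policy"])  # liability
```

Indexing once and looking up by ID keeps each inspection O(1), which matters if the sample browser re-renders on every click.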