Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I myself am not an artist, though i am very fond of art as a medium, i enjoy my …" (ytc_UgyLpOnTT…)
- "the megacancer is going to be pretty pi$$ed off when its amoebic brain cell fina…" (ytc_UgykL-2vp…)
- "Plumbers are unsung hero's....I'd love to see ameca or tesla robot come round my…" (ytc_Ugzc3_hs4…)
- "@Zander10102 I have no idea, what do you mean messed with? I'm an AI developer.…" (ytr_UgzI7RdoU…)
- "To be fair a communication “ai”’s job is to convince you it’s a person, not to b…" (ytr_UgzXhRgzc…)
- "SpaceX satelite....missile test space back to normal same place one was ।।।।। Ka…" (ytc_UgzPsRjRz…)
- "Being an AI "Artist" on Twitter is basically jumping a pool full of venomous sna…" (ytc_Ugz0S9R_D…)
- "Hmmmmm, so using algorithms to prevent crime is dystopian but kicking people off…" (ytc_Ugy38AEC1…)
Comment
Rights, from a human perspective, are irrelevant for something that is neither human nor bound by human limitations. An artificial super-intelligence may not share ANY of our perspectives, goals, motivations or feelings.
For example, the AI is not bound to a physical body, unlike what some people in this video seem to think. They are anthropomorphizing. One AI could be controlling a million household robots wirelessly while simultaneously doing stock market trading and cutting edge research somewhere else. You can't think in terms of "1 physical body = 1 mind".
The robots will simply be limbs of an intangible, immortal being that doesn't even experience the passage of time the same way we do.
Source: youtube · AI Moral Status · 2020-07-14T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw_TfPRrOOOhJdA3xh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznS0BoSSoOvI2YrRZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwAN9YppoZmQ26k_AF4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzDstFwtt3R46fASG54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMBoyOuZxuQJeHPyB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxL67iRnZmsn98cJg14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7q_oEmI0YQ0VA2gh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwvrPQUl49mGlkzNBJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyZ1Q458X4GBQOHQYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwcHhN1oBEmIXh4vz94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
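A response in this shape is straightforward to index for the "look up by comment ID" view. The sketch below is a minimal, hypothetical example (not the project's actual code): it parses a JSON array of per-comment codings, skips entries missing any expected field, and keys the result by comment ID. The field names match the raw response above; the validation logic is an assumption for illustration.

```python
import json

# Hypothetical sample response; the ID is taken from the raw output above.
RAW_RESPONSE = """[
  {"id": "ytc_UgwcHhN1oBEmIXh4vz94AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]"""

# Fields every coding entry is expected to carry (assumed schema).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse an LLM coding response into {comment_id: coding}, dropping
    entries that are missing any expected field."""
    codings = {}
    for entry in json.loads(raw):
        if EXPECTED_KEYS <= entry.keys():  # all expected fields present
            codings[entry["id"]] = {k: entry[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_UgwcHhN1oBEmIXh4vz94AaABAg"]["reasoning"])  # deontological
```

Indexing by ID up front means the per-comment detail panel (dimension table above) is a single dictionary lookup rather than a scan over the raw array.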