Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "the AI is just using the intel was given it's doing its job we need to make chan…" (ytc_Ugwy9Gb9k…)
- "In 2014 I was in a paid study to participate in "virtual human" therapy sessions…" (rdc_j43pi2i)
- "Robot like sophia is even more wicked, they'll do anything to win your trust, yo…" (ytc_UgzhQru9f…)
- "The call center analogy… the AI agent being “too busy” for an unnecessary call i…" (ytc_UgwWIeyfR…)
- "AI isn't plagiarizing art. It is creating new art. The original art was just use…" (ytr_UgwzHJqRb…)
- "Even if art does take time I still think that it’s way better then using AI bc i…" (ytr_Ugzt2HmtX…)
- "Well an LLM isn't something that will be conscious, but there are things in deve…" (ytc_UgzwEjdI5…)
- "I'm all of a sudden imagining myself with a robot playmate in bed with me. Pleas…" (ytc_UgyfhgG33…)
Comment
Since humans understand and feel suffering, we have a sort of ethical mandate to expand the holders of rights and liberties. This obligation should apply not just to mundane society, technology developers + consumers, and labor sectors, but also to the halls of power -- like all Superpowers, all Governments, and all militaries.
And I admit as a Black person, I would hate to see anything be enslaved/abused; the next 'African' to exploit with impunity. PS: I understand the difference between mortal consciousness and artificial intelligence, but even the most avid supporter of human exceptionalism can admit the commonalities.
youtube · AI Moral Status · 2022-09-27T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwkJ-2ebIQ0V-MihTN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxSW4TL9HJjJEGBEDR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwLiqXj8f78p72Fq054AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw47gejNAqnf87f8YZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy1nDsZfc2NMld2bs14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxAFsYUgEqFyNGpYaF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMRcoYv1Cy6wa9z3J4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyEqqJLtWHhcT9LDiF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQJTZuMUhPQE2WQx94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyIiJzXTe_PsvVdmRJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
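The lookup-by-comment-ID view can be sketched as a small parser over a raw response like the one above: the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) come from the coding-result table, while the function names, variable names, and validation logic here are illustrative assumptions, not the project's actual pipeline.

```python
import json

# A raw LLM response: a JSON array of coded records, one per comment,
# matching the dimensions in the coding-result table above.
# (Two records reproduced here for brevity.)
raw_response = '''[
  {"id": "ytc_UgwkJ-2ebIQ0V-MihTN4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw47gejNAqnf87f8YZ4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]'''

# Every record is expected to carry the comment ID plus the four dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the raw model output and index records by comment ID,
    silently skipping any record missing an expected field."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

coded = index_by_id(raw_response)
print(coded["ytc_Ugw47gejNAqnf87f8YZ4AaABAg"]["policy"])  # regulate
```

Indexing by ID turns the flat array into the constant-time lookup the "Look up by comment ID" view implies; the membership check guards against the model occasionally dropping a field.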