Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I'm from Amazon and its not trash. Unlimited usage of opus 4.1 with mcp servers …" (`rdc_n9sm8ue`)
- "The entire objection is being afraid of somebody knowing who a person is. That'…" (`ytc_UgxybIIuh…`)
- "100% agree with your description of the value of art. Just imagine if the AI br…" (`ytr_UgxOtlEpy…`)
- "The reason self driving cars aren't worth it for good drivers is a self driving …" (`ytc_Ugwnck_93…`)
- "Ido hate ai art don't get me wrong, but that first recreation does bother me, do…" (`ytc_UgxNN7Smi…`)
- "Even in bad art, every stroke has an intent behind it. The artist had an idea th…" (`ytc_Ugz9gRqYH…`)
- "Yeah but what we can do? We need AI for a more comfortable life and scientific d…" (`ytc_UgxAzaWBK…`)
- "This has the potential to be more influential to society than the entire interne…" (`ytc_UgzqXwGYO…`)
Comment
"Beings" are just biological machines.
"Consciousness" is likely just randomly evolved biological systems that favors survival through the use of memory. All we need for AI to become (flaw) like humans is to program "them" to favor survival, superiority.
We already enslave people through economics, why is it so hard to imagine "enslaving" machines?
Source: youtube · "AI Moral Status" · 2017-02-25T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh_lhwycoYkl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggdcRoBuxL9z3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Uggf3fSG7XhI2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggNUjIVjFsz73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggzsyIZT_K1A3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugh32Vghx0DeXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh7NSvs2yysSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
  {"id":"ytc_Ugiv_A9GlQMe1XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugh4RafM8_B_dXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjj2FpuF6iXDngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
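A raw response like the one above is only usable if every record parses and every dimension takes a known code. Below is a minimal validation sketch; the allowed value sets are inferred from the samples shown here, not from the full codebook, and the `validate` helper name is ours, so treat both as assumptions.

```python
import json

# Allowed codes per dimension, inferred from the visible samples only;
# the real codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "resignation"},
}

# One record from the raw response, used as a worked example.
raw = ('[{"id":"ytc_Ugh_lhwycoYkl3gCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')

def validate(batch):
    """Return (id, dimension) pairs for every out-of-codebook value."""
    errors = []
    for rec in batch:
        if "id" not in rec:
            errors.append(("<missing id>", "id"))
            continue
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec["id"], dim))
    return errors

records = json.loads(raw)
print(validate(records))  # → [] when every record is well-formed
```

A batch passes only when the returned list is empty; any unexpected value (e.g. a hallucinated emotion label) surfaces as a pair naming the offending comment ID and dimension, which makes re-coding targeted rather than wholesale.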