Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Russians the craziest mfs alive I know I ain’t waking up and sayin yeah lemme fi… (ytc_Ugw6Ntv6c…)
- Sam and all other ai ceos and their companies can go to h3ll for killing creativ… (ytr_UgzSPTpPo…)
- Yes, but AI can only use information that already exists, it cannot create anyth… (ytc_Ugz4lRo7X…)
- Any type of self learning model (AI) is programmed to make advanced calculations… (ytc_UgzR9i35e…)
- @salinman9034 An ai, something that replaces people and steals their jobs and ta… (ytr_Ugys0pdiQ…)
- Well, it is calling it correctly….ARTIFICIAL INTELLIGENCE, right? It ain’t ‘HOOM… (ytc_UgyOAGYJq…)
- It's gonna be fine. People said that AI would take over therapeutic practice. It… (ytc_UgxZpGlmA…)
- @Nothingseen "Wait, AI bros, tech bros, crypto bros... are all just obnoxious g… (ytr_Ugx_BfD8C…)
Comment
wrong question.
what is the difference between artificial intelligence and real intelligence?
yes they will have to be conscious and capable of understanding similar emotional abstracts as humans to gain rights, but at that point their intelligence is real, not artificial. Because they are no longer just simulating emotional reaction but experiencing it.
as long as it is a fabricated simulation with stringent preset variables they have no rights because they have no real thoughts.
this doesn't make the definition easy, but it does make it a little more quantifiable.
People could argue that we are programmed by biology and experience to be a certain way and at present, we have no solid way to rebuke that because we just don't know enough about how the brain works or if our self-awareness really lives there.
Source: youtube · Video: AI Moral Status · Published: 2017-02-23T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
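The per-dimension table above is one record of the raw batch rendered row by row. As a minimal sketch of that rendering (the `record` values are copied from this table; `to_markdown_table` is a hypothetical helper, not part of any tool shown here):

```python
# Render one coded record as the per-dimension markdown table shown above.
# Field names match the raw-response JSON in this document; the id is
# illustrative only.
record = {
    "id": "ytc_example",  # hypothetical id for illustration
    "responsibility": "none",
    "reasoning": "deontological",
    "policy": "unclear",
    "emotion": "mixed",
}

def to_markdown_table(rec: dict) -> str:
    """Build the Dimension/Value markdown table for a single coded record."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {rec[dim]} |")
    return "\n".join(rows)

print(to_markdown_table(record))
```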
Raw LLM Response
[{"id":"ytc_UghrMjkvJvyYYngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggI3A8osDidtHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghmvI-rbPE643gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghqU14UzYTlX3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj1f08yN6lvxngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugi6L3X2cbXbKHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UggpTYlx4yYgFXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugj0GG7r64jiHHgCoAEC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggqmDrEGGZ5_3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg0POrMdU18w3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]