Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This time unfortunately its different due to the nature of the technology. AI i…" — `ytc_Ugyza-1uO…`
- "what religion? she didnt mention any religious issues , she literally advocaing …" — `ytr_Ugx1sdUCH…`
- "The real danger is the death of human creativity. People wil pay to consume AI s…" — `ytc_UgwuPBwas…`
- "AI is a shell game to get H1-b workers to do the job for cheaper. But, you get w…" — `ytc_Ugx3tSD46…`
- "When human makes idol and fear it like God then human can very well build AI tec…" — `ytc_Ugwm16gDJ…`
- "the AI-driven dystopia we find ourselves in, is the natural end point of capital…" — `ytc_UgyKoi2Hq…`
- "People: ai is doomed ai gonna take our jobs and kill us meanwhile ai :…" — `ytc_Ugwj_P8bd…`
- "GPT can't be scared lol. Also you can adjust GPT to give you direct answers and …" — `ytc_Ugz3mFH4p…`
Comment

> The hard part isn't deciding if they deserve rights or not, the hard part is determining if they are actually conscious. If a robot actually is conscious and feels emotions and pain, then they should hold equal rights. If a robot feels pain but not emotions such as animals, then they have the right not to be tortured just like torturing animals is sick and illegal. If a robot feels emotions but not pain, then it has the right not to be emotionally tortured just like that of humans.

Source: youtube · AI Moral Status · 2017-02-24T21:4… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
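A coded record like the one above can be checked against the codebook before it is stored. A minimal sketch in Python, assuming the allowed values are exactly those that appear in the samples on this page (a hypothetical subset; the real codebook may define more categories):

```python
# Allowed values per coding dimension, inferred from the samples on this page.
# This is an assumed subset — the actual codebook may include further labels.
CODEBOOK = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes validation.
record = {"responsibility": "none", "reasoning": "deontological",
          "policy": "regulate", "emotion": "indifference"}
print(validate(record))  # []
```

Running such a check on every batch makes it easy to flag records where the model drifted outside the allowed labels.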
Raw LLM Response
```json
[
  {"id": "ytc_Ugi7kG8Ji4CkN3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjWtK98dVOiO3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgipEs5BcXU2Z3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggkudIeHsDg73gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggFU3s3bpetwXgCoAEC", "responsibility": "ai_itself", "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggW2mHw9QpLJ3gCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgijNLd-v6PQO3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugj1m65ckfcSAHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghtCdi-rbmhM3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghB59eFQ0-173gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]
```