Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgxUHKdb1…`: "@iamknowone_21had a history video with something as small as a ai voice dub and …"
- `ytr_UgxlmBGpd…`: "They could make a difference between consumer grade chips and dangerous chips. …"
- `ytc_UgyuIMACQ…`: "Isn’t is obviously not sentient? Ai doesn’t have chemicals like dopamine, Serato…"
- `ytc_Ugxq0lAgw…`: "I bought my Gemini pro from a channel in telegram with a really huge discount…"
- `ytc_UgzvZU0lV…`: "As a person who is pretty much has autism and pretty much paralysis on the left …"
- `rdc_kves43a`: "Image recognition AI's have existed for decades, and are used in military for al…"
- `ytc_UgxJqdcP2…`: "The other robor messing up: oopsies / Robot: IVE HAD ENOUGH,YOU GUYS ONLY HAD ONE …"
- `ytc_UgzJjc51z…`: "intelligence itself needs to be studded now that it can aceed human brain. seri…"
Comment
It comes down to evolution. The value of evolution in biological creatures is to pass on genetic legacy and ensure the survival of the species. This is simply not necessary with AI. The only reason for an AI to "evolve" is simply to create a better version of itself. Fear of death and pain is a biologically exclusive trait. It benefits us because it keeps us from doing stupid things that would prevent us from perpetuating our genetic line. Computer AI would have no need for that. It could just make a backup of itself in case of catastrophic failure, then just start over. So I conjecture that an AI will never have the need to "feel pain" or "fear death" for any reason. Therefore it will never need the protection of "rights and liberties."
Source: youtube
Video: AI Moral Status
Timestamp: 2017-02-23T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
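A dimension table like the one above can be produced directly from one coded row. The sketch below is a hypothetical rendering helper, not the dashboard's actual code; the four dimension names are taken from the coding result shown here.

```python
def render_coding(code: dict) -> str:
    """Render one coded comment as a Markdown dimension table."""
    lines = ["| Dimension | Value |", "|---|---|"]
    for dim, value in code.items():
        # "responsibility" -> "Responsibility"; underscores become spaces.
        lines.append(f"| {dim.replace('_', ' ').capitalize()} | {value} |")
    return "\n".join(lines)

# The row coded for the comment above.
print(render_coding({
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "resignation",
}))
```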
Raw LLM Response
```json
[
{"id":"ytc_UgjuBzXiEFOOb3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiOFP9eFj2_B3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggIDwaoyRdOpngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugi--P2KZ4P-SHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjaQVZhHCDsIHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugj9Z8KXXidlHXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UggPfF3JrQEbgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugh-PGzxflxq93gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"unclear"},
{"id":"ytc_Ugh8FxkHzWzJP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugi5vOQTUgGENHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
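The raw response is a JSON array with one object per coded comment, each carrying an `id` plus the four coding dimensions. A minimal sketch of parsing such a response and indexing it by comment ID (the `index_codes` helper is hypothetical; the two sample rows are copied from the response above):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_UgjuBzXiEFOOb3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh8FxkHzWzJP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    coded = {}
    for row in json.loads(raw):
        # Reject rows missing the ID or any of the four dimensions.
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row}")
        coded[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return coded

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_Ugh8FxkHzWzJP3gCoAEC"]["emotion"])  # resignation
```

This mirrors the lookup-by-comment-ID behavior the page describes: once indexed, any comment's full coding is retrievable from its ID.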