Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
To the title question: NO.
Since, they are not naturally born, and not more than mettallic components and some script (since there is nothing like 'AI', and never will be, only a bunch of scripts), even there is no question here.
Since you dont even give rights to a single rock or a tree leave, no base why would you give rights to another object, what is not living, never been, never will be, only made on purpose - to LOOK LIKE that.
Debate is over.
Source: youtube
Video: AI Moral Status
Published: 2020-05-15T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
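The coding result above could be represented as a simple record. A minimal sketch, assuming the field names from the table; the example values in the comments are only those observed in this sample, not a full codebook, and the comment ID shown is illustrative (taken from the raw response below, not confirmed to be this comment's ID):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above."""
    comment_id: str
    responsibility: str  # observed: "none", "developer", "ai_itself", "unclear"
    reasoning: str       # observed: "deontological", "consequentialist", "unclear"
    policy: str          # observed: "none", "regulate", "unclear"
    emotion: str         # observed: "indifference", "fear", "mixed", "approval"
    coded_at: str        # ISO 8601 timestamp

# Illustrative instance with the values from the table above.
result = CodingResult(
    comment_id="ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg",  # hypothetical pairing
    responsibility="none",
    reasoning="deontological",
    policy="none",
    emotion="indifference",
    coded_at="2026-04-27T06:24:59.937377",
)
```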
Raw LLM Response
```json
[
{"id":"ytc_UgxFwg13HIwDYvN1xzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzYCNhwxammyrS6RO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw7cMFZrGGurrR-LaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFihmfK6GnXiI18aR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwF7AYOXBXbRHE0Bx54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQB_SJFgAqADe4Bm54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKaH_8G5iEjt8UWut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzY-XCMoxUorvFlSgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRnuXxDby7aOjA9Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
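Inspecting the codes for one comment amounts to parsing the raw batch response and indexing it by `id`. A minimal sketch, assuming the response is a JSON array shaped like the one above (the two rows inlined here are copied from it):

```python
import json

# Two rows copied from the raw batch response above, for a self-contained example.
raw_response = """[
  {"id": "ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwKaH_8G5iEjt8UWut4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the parsed rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if absent."""
    return codes.get(comment_id)

print(lookup("ytc_UgwKaH_8G5iEjt8UWut4AaABAg"))
```

A real inspector would load the full response from storage rather than an inline string, but the lookup logic is the same.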