Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It's not that nobody *knows* what they are, it's that nobody *agrees* what they …
rdc_n3ol68p
Yep I realized this when I tried to test it against a router that couldnt talk t…
ytr_UgzLeAur3…
That’s cause these idiots keep letting their ai have access to the internet, im …
ytc_Ugzug1bgq…
Indeed, this AI idea will kill future animators and will not even make their job…
ytc_UgxsfHh3p…
I predict the A.I. bubble will explode in the tech company’s faces, and they scr…
ytc_UgzjCER6h…
Your so awesome. When will you come out with your pertinent version of AI Law? …
ytc_UgzGBGmFA…
Because AI steals pieces of other art and make "new art" nothing great about AI…
ytr_UgzJq0icy…
Ai is the natural progression of human technology development. Think about the t…
ytc_Ugz2hoj8g…
Comment
It is beneficial to separate ideas of what is from what should be. I’d agree with the author of the article that what *is* is that people assign moral worth according to positive personal relationships with others - people, AI, even inanimate objects like a favorite sweater. The trouble is that this is so subjective - what has moral worth to me may not to someone else, which makes it hard to form rules and laws that will govern everybody.
Metzinger and (Crane, was it?) were arguing from the standpoint of what *should* *be*. Much easier to make a law once you’ve agreed on a definition, but as pointed out, those definitions are darn tricky.
reddit
AI Responsibility
2021-03-13 (Unix 1615647392)
♥ 39
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_oi03fzo","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"rdc_gqsx5ta","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"rdc_gqxvyyx","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"rdc_gqtv6zv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_gqt3k8t","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
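The raw response is a JSON array with one object per coded comment. A minimal sketch (in Python, assuming only the field names visible above; the variable names and the inline sample are illustrative, not the tool's actual code) of parsing such a response and looking up a coding by comment ID:

```python
import json

# A two-entry excerpt of a raw LLM response, in the same shape as above.
raw_response = """
[
  {"id": "rdc_oi03fzo", "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "indifference"},
  {"id": "rdc_gqsx5ta", "responsibility": "none", "reasoning": "contractualist",
   "policy": "none", "emotion": "mixed"}
]
"""

# Parse the model output and index the codings by comment ID,
# which is what a "look up by comment ID" view needs.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

# Retrieve the coded dimensions for one comment.
coding = codings["rdc_gqsx5ta"]
print(coding["reasoning"], coding["emotion"])  # contractualist mixed
```

Because the model emits one object per comment in a batch, a dict keyed on `id` gives constant-time lookup from a comment ID to its coding.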