Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwA0pTB5…`: "AI lacks human like intelligence because it doesn't have the ability to seek res…"
- `rdc_eh5febg`: "Usually raspberry thrive in cold area... blueberry... cherry tomato... apple... …"
- `ytc_UgywsMe5y…`: "All y’all had useless jobs amd probably looked down on men who do labor thst can…"
- `rdc_ohw89dv`: "Apple Maps is nice now, but the information on it is frequently wrong or out of …"
- `ytc_UgwtcBvZv…`: "Please give me the same money I make now just to sit around, it won't impact my …"
- `ytc_UgyEJNDO4…`: "The fact that they said this keeps happening in multiple states makes me so co…"
- `ytc_Ugx3ifvOA…`: "You know that chat gpt and other ai’s have no fool proof policy. Like you can me…"
- `ytr_UgxPgbsDL…`: "That's an interesting perspective! Sophia does have a unique appearance that can…"
Comment
well we make up 'rights' for ourselves due to evolutionary reasons that shaped individuals of a species, to have mechanisms for some degree of 'self-preservation'. We 'want to live' or at least avoid suffering . Just don't put such mechanism into robots and done. No need to wonder do robots need rights. But lets assume a scenario that includes the word 'rights' :D . Since we are the evolutionary pressures of robots, we define them. So if they evolve 'needs' it would be to serve humans. If a robot is denied to serve humanity, it would be criminal, violation of its 'rights'.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-23T18:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
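The coding dimensions in the table above can be modeled as a small record type. A minimal sketch follows; the label sets are only those observed in this sample (they may be a subset of the full codebook), and the class and variable names are illustrative.

```python
from dataclasses import dataclass

# Label sets observed in this sample; the full codebook may define more values.
RESPONSIBILITY = {"none"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"none"}
EMOTION = {"approval", "indifference", "outrage"}


@dataclass
class CodedComment:
    """One coded comment: an ID plus the four coding dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        """True if every dimension carries a label seen in this sample."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```

Validating against a fixed label set catches malformed model output (e.g. a misspelled or invented label) before it enters the dataset.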
Raw LLM Response
```json
[{"id":"ytc_Ugi9JBgVFW7oangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjKu1CKzxZGdngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghXX_gELZT3ingCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgifRi41S8frQngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg9LqPx7RltpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugir2u5eiBzlx3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UggHSpuqEcj8n3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgiGsncAv-tGKHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghgvagOqcklJngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj67KJQ_UoAxHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}]
```