Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples:

- "i don't believe robots/synths are human but they are definitely people, if a rob…" (ytr_Uggiw8mjx…)
- "Honestly, at this point I'm hopeless. Hell, I'm not even an adult yet, and I've …" (ytc_UgxYYLkeE…)
- "and here I thougt they where gonna talk about AI. they just talk about Co2…" (ytc_UgxCIJp1p…)
- "As a person who enjoyed drawing landscapes I had a lot of requests for helping a…" (ytc_Ugzt8D8nh…)
- "Given a couple more minutes and he could have had ChatGPT saying that it would b…" (ytc_UgzkLRRgn…)
- "@HiddenRainbowsart You misunderstand how the technology works. I totally agree w…" (ytr_UgymmmcQ6…)
- "People leaving jobs are often those whose roles no longer align with a company’s…" (ytr_UgyK2LNrM…)
- "I didn't start the video yet and I'll probably find out, but is this about Char…" (ytc_UgxSrA1M6…)
Comment

> Robots have a physical differences from living things. Assuming we reach a point of robots/machines/AI become self aware, their needs and their version of rights will be different. When it comes to labor an manufacturing, there is no physical exhaustion for machines. AI may experience time differently from us so their rights for time will be different. Cloud based consciousness won't need to be confine to a single body so their bodily rights would be less than that of human.
>
> If AI and machines develops emotions, their inherit physical differences will cause them to experience their own existence fundamentally different from human or anything biological.

youtube · AI Moral Status · 2017-02-23T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
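
Each coding result is a flat record over the four dimensions in the table. Below is a minimal sketch of that record as a typed structure, assuming the value sets are exactly those visible on this page (the names and the vocabularies are inferred, not the pipeline's actual schema, which may define more codes):

```python
from dataclasses import dataclass
from typing import Literal

# Assumed code vocabularies, inferred from the values visible on this page.
Responsibility = Literal["none", "developer", "ai_itself", "distributed"]
Reasoning = Literal["unclear", "mixed", "deontological", "consequentialist", "contractualist"]
Policy = Literal["none", "unclear", "regulate"]
Emotion = Literal["indifference", "approval", "outrage", "fear"]


@dataclass(frozen=True)
class CodingResult:
    """One coded comment, mirroring the Coding Result table above."""
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```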
Raw LLM Response
```json
[
  {"id": "ytc_Ugi3l4d6_ZVSPngCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UggXxUS6ImDcVngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggiiBkWN73X1ngCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugg5MrhnXcA4ZHgCoAEC", "responsibility": "distributed", "reasoning": "contractualist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggRPiq5dwY9P3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg2foK25E_ACHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjaRkRwKWzpoHgCoAEC", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg8QyIAn2PW43gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugj4vFwy4jRFsngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgjrvLoOOdSbhngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
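
The raw response is the model's verbatim output for a whole batch, so the page's lookup-by-ID resolves a single comment against it. A minimal sketch of that lookup, assuming the response parses as a JSON array like the one above (the function name is hypothetical):

```python
import json


def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    """Parse a raw batch response and return the record for one comment ID."""
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return None  # model output was not valid JSON
    # Index the batch by comment ID, skipping malformed entries.
    index = {rec["id"]: rec for rec in records if isinstance(rec, dict) and "id" in rec}
    return index.get(comment_id)
```

On the batch above, `lookup_coding(raw, "ytc_Ugg5MrhnXcA4ZHgCoAEC")` returns the record summarized in the Coding Result table (distributed / contractualist / unclear / indifference).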