Raw LLM Responses
Inspect the exact model output behind any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.
What these smart AI folks do not seem to believe in, and thus account for, is th…
ytc_UgztG9tDv…
AI security / programmers had better start emphasizing ethical programming. That…
ytc_UgzzUYBms…
I will laugh so hard when those board members are replaced by ai board members…
ytc_UgyjiERZY…
I would say, it's not gonna happen nationwide. There are so many issues with dri…
ytc_UgzUPndEW…
Why these losers just make AI girlfriends if they are that pathetically desperat…
ytc_Ugz2ijxcQ…
the argument that disabled people should just use AI if they can't write is the …
ytc_UgxPt0l3j…
If AI was this good in 2019 we wouldn't have had to suffer through GOT S8.…
ytc_UgyTTN5yM…
Stupid, AI just follows prompts, not it's own agenda. There is far too much hype…
ytc_Ugw2fLG_f…
Comment
A kind of pain is already programmed into computers. If we take "living" pain as something that uncontrollably focuses our attention on a specific thing(pleasure too). The difference is one represents a resource negative and the other represents resource positive for the individual or group. This could be equated to program error management, efficiency algorithms or any other interrupt that hinders a computers efficiency to deal with something specific outside of it's immediate task. What is missing is layers of complexity and self determination.
youtube
AI Moral Status
2017-02-26T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UggxuzS4c5UU2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjYJv9T9YkFhXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UggKdvdoifxIKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugjv8_ZPZwITtHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UghqiS4AGQvTCngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggKdKSmQyWs-XgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggpGkl0EFbTangCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugi2-dOuWWAOd3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjfOOUww9Lpc3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghVOYyM5bbFNXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
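A raw response like the one above can be turned into a per-comment lookup table with a few lines of Python. The sketch below is illustrative, not the tool's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response format shown here, but the label sets are only the values *observed* in this sample; the full codebook may contain more, and the `ytc_EXAMPLE` ID in the usage line is hypothetical.

```python
import json

# Label sets observed in the sample response above. Assumption: the real
# codebook may define additional labels not seen in this batch.
OBSERVED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "unclear"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in OBSERVED.items():
            # Flag (but keep) any label outside the observed set,
            # so codebook drift is visible rather than silently dropped.
            if rec.get(dim) not in allowed:
                print(f"warning: {cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec.get(dim, "unclear") for dim in OBSERVED}
    return coded

# Hypothetical single-record batch, for illustration only:
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"developer",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
table = parse_llm_batch(raw)
print(table["ytc_EXAMPLE"]["reasoning"])  # → mixed
```

Keying by comment ID mirrors the "look up by comment ID" flow on this page: the coded dimensions for any comment are one dictionary access away once the batch is parsed.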