Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why would there be a necessity for a robot to feel pain? If a singularity happened, it would make more sense for robots to actively decide not to equip themselves with the ability to suffer. The ability to bypass mortality through uploading their consciousness to a possibly immaterial network would make it so they would have no need to experience suffering, yet any sufficiently advanced mind can understand what pain is. Pain is the result of mortality, not consciousness.
Any sufficiently advanced AI would logically choose to ignore the possibility of pain, as any immortal being would have no need to experience it.
Source: youtube
Video: AI Moral Status
Posted: 2017-02-25T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghKoK55MKjPi3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjLk4dwj6E7c3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgifpkGSnco6Q3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh3oGA9UWKfbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughlf5QYg265ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjui8lyYzSrvHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiuXt-nUv5jbngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggvBcByL6n803gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjWed3DMfpEnHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghAqkiQfyzw4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
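A response like the one above can be consumed downstream with a small shape check before the codes are stored. The sketch below is illustrative, not the tool's actual ingestion code: it assumes each record carries exactly the five keys seen in this response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and makes no attempt to enumerate the full codebook vocabularies, since this page shows only a sample of the possible values.

```python
import json

# The five fields observed in this raw response; the real codebook may
# define additional fields, so treat this set as an assumption.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify each record's shape."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for i, rec in enumerate(records):
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} is missing keys: {sorted(missing)}")
    return records


# Hypothetical single-record payload in the same format as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded[0]["emotion"])  # -> indifference
```

Failing fast on malformed records keeps a single bad LLM output from silently dropping codes for a whole batch of comments.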