Raw LLM Responses
Inspect the exact model output for any coded comment: look up a record by its comment ID, or open one of the random samples below.
Random samples (click to inspect):

- There are already driverless buses and trains. Driverless vehicles and mass tran… (ytr_Ugx8ACDk6…)
- They’re “doing it” because they’re transhumanists. They’re transhumanists becaus… (ytc_Ugws4iAEc…)
- Bc Ai can make the world Amazing 🤯🤯🤯...but they're being Programed by Humans. AN… (ytc_UgxnTWBL8…)
- Honestly the "The AI Chose murder" thing is dumb. If I remember correctly they s… (ytc_UgxQDg74d…)
- See as a non lawyer i would have just stuck my hands up and said "yeah I fudged … (ytc_UgyaqNIFL…)
- LLMs don't "choose". They output the most likely next token. That makes them a r… (rdc_kozs17t)
- I mean with the amount ppl I know personally in rpg mmo rp community that made t… (ytc_Ugwa7ViNe…)
- Geoffrey might be very intelligent, but he is also shockingly short-sighted. Lik… (ytc_UgwGrfHHX…)
Comment

> There's one thing that I don't understand. If rights are given to prevent suffering, why would we program AI to be capable of suffering? Suffering, as we understand it, is a byproduct of the evolutionary mechanism of pain. Pain was selected for because things that are painful are things that kill, and wanting to avoid pain would help an organism survive. In civilization, pain serves no purpose, and even if AI were to be programmed by other AI, I simply can't fathom a reason why they would program pain and the possibility of suffering.

Source: youtube · AI Moral Status · 2017-02-23T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
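
The four dimensions form a small closed vocabulary. Here is a minimal sketch of the record type as a validated Python dataclass; the value sets below are only those observed on this page, the real codebook may define more, and all names are illustrative:

```python
from dataclasses import dataclass

# Values observed on this page; the full coding vocabulary may be larger.
RESPONSIBILITY = {"developer", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"none", "ban"}
EMOTION = {"fear", "indifference", "mixed", "approval", "outrage"}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the observed sets."""
        for name, allowed in (("responsibility", RESPONSIBILITY),
                              ("reasoning", REASONING),
                              ("policy", POLICY),
                              ("emotion", EMOTION)):
            value = getattr(self, name)
            if value not in allowed:
                raise ValueError(f"{name}={value!r} not in {sorted(allowed)}")
```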
Raw LLM Response
[{"id":"ytc_UgiJxrcUrvuH_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiTPqoAdEgMr3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggC_jx4u5W3BXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UghhWiVkOMPmungCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghBsb6B-kdY_XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9i1U5KLMObngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghODAUsQRPifngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggq231mY4_ztHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjF3w78FAbALXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiiPSD_XyGyKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]