Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@captrodgers4273 No it wasn't, that's the point. AI learns to program itself eve…" (ytr_Ugz4ph9VY…)
- "People who hate AI are just boomers yelling at the kids in their yard. Get a re…" (ytc_UgzgutjaD…)
- "I think people are all hyped up over nothing right now. There is obviously someo…" (ytc_Ugz67EfYT…)
- "One thought: the move towards identifying and developing treatment for mental il…" (rdc_cbw6q77)
- "Well put, Adi! While AI can be very helpful, you must take steps to protect sens…" (ytc_UgwXXsFV3…)
- "All humans have to cease working. Time is the Sabbath. Welfare is free to humani…" (ytc_UgxFzb-f8…)
- "Okay, I will be honest. I don't see how failure of Ai will affect American econo…" (ytc_Ugx-wLtgR…)
- "All this complaining. I spent the entire month of July in the city, primarily re…" (ytc_UgwNpzHiy…)
Comment
A robot that doesn't know it's "bad" to be broken will end up broken and useless faster than a robot that weighs the risk of a situation against the likelihood or necessity of completing a task. And I guess from that might emerge an imperative to continue functioning; a will to live.
I mean, really, that's why we have pain; we could've evolved without pain receptors, but we wouldn't know when we'd been hurt or injured, and we'd die. So is there really a difference between our nerves and a robot's sensors, if the end goal is to avoid damage and continue functioning?
youtube · AI Moral Status · 2017-02-23T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgiJxrcUrvuH_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiTPqoAdEgMr3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggC_jx4u5W3BXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UghhWiVkOMPmungCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghBsb6B-kdY_XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9i1U5KLMObngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghODAUsQRPifngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggq231mY4_ztHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjF3w78FAbALXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiiPSD_XyGyKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
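A raw response like the one above is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table. The following is a minimal sketch of how such output could be parsed and validated before storage; the allowed value sets are inferred only from the values visible in this sample (the real codebook may include more categories), and the function name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample batch above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError if any record is missing a dimension or uses
    a value outside the known codebook.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dim!r} value {value!r}"
                )
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Validating up front means a malformed or hallucinated code in the model output fails loudly at ingestion, rather than surfacing later as a bad table cell on this page.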