Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Elon Musk is already among the first on the cusp of a best answer to risk from A…" (ytc_Ugxd1vvjn…)
- "what if in a plot twist we eventually discover COMPAS is actually using bots pol…" (ytc_UgyiWL701…)
- "People will hate Ai because it will be related to a bad layoff… it will ruin the…" (ytc_UgwSCZmuO…)
- "@johnjacobs1718You are saying woke people are super-intelligent. Something makes…" (ytr_UgyV5eY4S…)
- ">before handing the microphone to young environments from around the globe S…" (rdc_faplgcc)
- "The problem with ai right now is that the way it learns is by gathering informat…" (ytc_UgyCdkBko…)
- "Boeing Whistle blowers - all dead. Open AI whistle blower - dead. Deep State, …" (ytc_UgyJFyNc8…)
- "I'm not against AI art But , i'll hate them if : • the AI companies would STEAL…" (ytc_UgwiX_t-o…)
Comment
We will definitely programme A.I.s that interact with the real world to feel physical pain. It would be dumb not to. Pain (under the right circumstances) is an excellent teaching tool for avoiding damage and avoiding more damage if already damaged. It is also a motivator to repair any damage. In a sense we have already done this with some machines. The check engine lights in cars for example, are car speak for "I sense something is wrong with my engine, please have me checked out". The only difference between the car's pain and a human feeling sharp pains in their chest is that humans also feel fear.
Source: youtube · AI Moral Status · 2017-02-24T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Uggbtq-WGdMdsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggAjot1l7w9IngCoAEC","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEmH3Lq4V_vHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ughlh2BiQzNAdXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugg0tBq-Ha2NR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghbXQbC6Eut-HgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiaJXOE27QNsXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UggWMgkXXwlosXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggATgq0eeHyfXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UghF5eT9DDh8F3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
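A raw response like the one above has to be parsed and validated before its rows can be joined back to comments by ID. The following is a minimal sketch, not the actual pipeline code: the dimension names and allowed values are inferred from the examples in this section, and the real codebook may include values not shown here.

```python
import json

# Allowed values per coding dimension, inferred from the rows shown in this
# section (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "distributed", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "fear", "resignation"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id"):
            continue  # a row without a comment ID cannot be joined back
        # Drop rows where any dimension is missing or outside the codebook.
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugg0tBq-Ha2NR3gCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(parse_llm_response(raw))
```

Rejecting out-of-vocabulary values at this stage is what makes a "Coded at" record like the table above trustworthy: a hallucinated label never silently enters the coded dataset.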