Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If we thought that corporations and governments were manipulating us, they have …" (ytc_Ugzj5IkXh…)
- "I think it’s pretty clear that autonomous cars with no LIDAR or reliable distanc…" (ytc_UgyyLYsj9…)
- "The AI hype bubble is bursting currently. Investors know. It's time to stop be…" (ytc_Ugx-t5sV0…)
- "This is an overreaction. People won't opt out of politics. The standards of evid…" (ytc_Ugw9OFKkc…)
- "Just because we can create AI doesn’t mean we should. AI will will make billiona…" (ytc_Ugyx1lAGY…)
- "Well, AI can work at half of human quality for 2-3% of the cost. Just compare qu…" (ytc_Ugwh-QVfK…)
- "Hinton's (implied) argument doesn't really stand up to scrutiny. It directly rel…" (ytc_Ugwobb24z…)
- "Utopia can't exist because Utopia fails to understand how humans work. We need a…" (ytc_UgxtDJOWR…)
Comment
By giving a robot the ability to feel pain, you are essentially MAKING IT BE IN PAIN. Why wouldn't that be against the law to begin with?
On the other hand, if a robot's purpose was to serve humanity, a concept of pain would be crucial to completing that purpose. If the robot were destroyed, it would no longer be able to serve humanity, so therefor it would need to stay alive, which means it would need something to warn it against things damaging to that goal; something like pain.
Source: youtube · AI Moral Status · 2017-02-23T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UggBwt39ne95NHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi3tBoCXCry5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgiV96vmAVd6m3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg3Gwx2PlKLe3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UghOb-FChO3vGHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UghwSl5bL0NorHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UggIIAf5apT5mngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjXCxJaU4DN1ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ughqt-XlMSOrZngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjOC6cVNxU5N3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
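A raw response like the one above is a JSON array of per-comment codings, one object per comment ID, with the four dimensions from the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how the "look up by comment ID" step could work on such a payload (the `lookup` helper is hypothetical, not part of the tool; the two sample rows are taken verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# abbreviated here to two rows from the payload shown above.
raw_response = """
[
  {"id":"ytc_UggBwt39ne95NHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjXCxJaU4DN1ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (hypothetical helper)."""
    return codings[comment_id]

print(lookup("ytc_UgjXCxJaU4DN1ngCoAEC")["policy"])  # liability
```

The second row here matches the Coding Result table above (developer / consequentialist / liability / mixed), which is how a coded comment can be cross-checked against the exact model output.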