Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- they're worried that ai will take over because the occult corrupt wealthy will b… (ytc_UgyBWM6Mw…)
- What are the regulations Elon and other tech experts wish to be implemented for … (ytc_Ugw4ipSE7…)
- Hey Kurzgesagt, could you do a video about the dangers of superintelligent AI? B… (ytc_UggqumG_A…)
- The right question is: Why you don’t create a reference system to control AI not… (ytc_Ugw7xfEzv…)
- "But disabled people need AI to make art!" bro there are people with no arms who… (ytc_UgzDaYYc2…)
- Ilya has moral compass coz he was born in Russia u all hate so much. Westerns do… (ytc_UgzByd18W…)
- It seems like societies are focused on dangerous technology (AI, nuclear weapons… (ytc_Ugy5W7ois…)
- I'm not sure that's a bad thing considering the quality of the average driver. T… (rdc_fcs1wrn)
Comment
Interesting take on the subject, but what about the concept of emotion? If a robot is emotionless, does pain even affect it even if it can feel it? Without the reaction of anger or the feeling of injustice, then there's still no reason or want to demand rights. Is it even possible for a robot to have an emotion?
Dang, philosophers have a bit of a race to beat robots to the punch of answering these questions.
Source: youtube · AI Moral Status · 2017-02-24T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiHxUzYsGI4e3gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgijvnN8rxT23XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgiVw7y25qwdLXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjXq74qnn4w1HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggQpIKTMpgtzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghqIPRZeJxYD3gCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggHKar-b2b8k3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghY0ZAiPP5dD3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghbRTz2I1HUJHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghmIkWSpY9XpXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```