Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Once we give birth to a truly sentient AI, odds are we are done for. People kind…" (ytc_UgwsbGlsb…)
- "What is a human programmed to do? Each of us has a choice. I can tell you if we …" (ytc_Ugx-gkcUY…)
- "I was trying to hire a jr about 6 mos ago it was dismal. I had devs telling me t…" (ytr_Ugz7Kt2T_…)
- "Aw well lookit this little channel YT recommended me! As an artist allow me to s…" (ytc_Ugx35pDXY…)
- "I asked if I was in the right for throwing a neutron star at my dog after gettin…" (ytc_UgyotTKgi…)
- "Nah. I know coding, but still get AI to guide me through the planning stages, te…" (ytr_UgyFisiVl…)
- "It went from AI taking their jobs to ……. Race .. ….. you can just tell this pers…" (ytc_Ugxl_E-zD…)
- "Us: Can we get the type of capitalism that gives us subways and high speed rail …" (ytc_UgyqtTbrR…)
Comment
I wish you had gone further than looking at consciousness. I think the question of free will is a more interesting one. A robot might be sentient, but without cognitive capability for free will, I think there is little basis for robot rights.
Instead of the example of a robot feeling pain, I think it'd be more interesting to examine a robot who is prohibited from exercising it's free will. Furthermore, what if humans in the future forbid robots to choose for themselves because of the potentially disastrous consequence of a powerful robot feeling remorse or anxiety associated with that freedom to choose?
Platform: youtube · Video: AI Moral Status · Posted: 2017-02-23T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UggjZy8rcjGNm3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Uginb6UDLQhk_ngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgicvaZjhMX6BHgCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugh9eNuF4VTjWHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggntMP2kdIoWHgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgjEH-SjZ2pMMHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiDLalxifsbkngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgiXwA6zw6dnqHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugh2D6_lDm1Rc3gCoAEC", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UghxYPawvqCsbXgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
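The raw response is a JSON array of per-comment coding records keyed by `"id"`, so the "look up by comment ID" feature reduces to a scan of that array. A minimal sketch (the function name `lookup_coding` is hypothetical; the record IDs and fields come from the raw response above):

```python
import json

# Abbreviated copy of the raw LLM response shown above: a JSON array of
# coding records, one per comment, each keyed by "id".
raw_response = """[
  {"id": "ytc_UggjZy8rcjGNm3gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgicvaZjhMX6BHgCoAEC", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""


def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for comment_id, or None if it is absent."""
    records = json.loads(raw)
    return next((rec for rec in records if rec["id"] == comment_id), None)


# Inspect the exact model output for one coded comment.
rec = lookup_coding(raw_response, "ytc_UgicvaZjhMX6BHgCoAEC")
```

Here `rec` holds the full coding record for that comment (responsibility, reasoning, policy, emotion), and an unknown ID simply yields `None`.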