Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I'm in charge of my own destiny... so rather control the vehicle than depend on an…
ytc_UgzecGT9C…
So If AI is smart and doesn't need us then we could have a problem with AI. Wh…
ytc_UgyThGGmM…
Response:
Hey there! It's fascinating, isn't it? AI's ability to be humorous is …
ytr_UgwuDm26X…
i used to work at H&M, i dunno where you live that it's so high but you coul…
rdc_d3s7woi
There will be an endless supply of cheap to free AI. They will replace people fo…
rdc_mrrciwr
I stand with you. I've been avoiding pictures taken by my phone and post it in t…
ytc_UgwZA0i5z…
“… We should think about the feelings of the AI…” now it’s getting personal HAL.…
ytc_UgyZWNKgK…
Actually they make u realize how important time is and how important it is to gr…
ytr_UgzhM1wQc…
Comment
Here are just some of my personal ideas about the topic, no offense to anyone; I just want to raise some questions for further discussion:
Let's just say we can give robots rights and self-consciousness, so the AI should be able to feel and behave like a normal human being. As normal human beings, we always have curiosity and an eagerness to explore what's out there; we want to interact and expand our knowledge and vision as time goes on. So how could a machine do such things when, even today, we still haven't found a way to sustain the life of a robot? Every living thing needs energy to work: just as we need to eat every day, robots need to plug themselves into a "power station" to recharge, and they also need to carry a battery strong enough to sustain them for at least 6-8 hours in order to be like a human. This, I think, leads to a problem: we can sustain a small AI like Siri in our phones for a full day now, but we still haven't been able to create a battery that is reasonably compact and economical enough for a full "human-scale" robot.
So I think the bigger question we need to solve first, if we want robots to be treated as a "next-gen human society" or as equals to humans, is not whether we should be afraid of what will happen if we give them consciousness, but how we can provide the basic needs for their survival as normal human beings, make them feel they are being treated as our equals, and let them explore themselves as well as be educated.
I strongly believe that the main thing that could cause robots to turn against us is people not treating them as persons. That is really hurtful to a conscious mind, and it also teaches them to regard themselves as a race that "should be more advanced than humans," leading to uprising behavior toward their creators.
youtube
AI Moral Status
2017-02-23T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgiJxrcUrvuH_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiTPqoAdEgMr3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggC_jx4u5W3BXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UghhWiVkOMPmungCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghBsb6B-kdY_XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9i1U5KLMObngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghODAUsQRPifngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggq231mY4_ztHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjF3w78FAbALXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiiPSD_XyGyKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]
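A raw response like the one above can be parsed and sanity-checked before its codes are written to the table. Below is a minimal sketch, assuming the response is always a JSON array whose objects carry exactly the five fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the two sample records are copied from the response above, and the validation rule is an assumption, not an official codebook.

```python
import json
from collections import Counter

# Sample input: the first two records from the raw LLM response above.
raw = """[
{"id":"ytc_UgiJxrcUrvuH_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiTPqoAdEgMr3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]"""

records = json.loads(raw)

# Every coded record must carry an id plus the four coding dimensions.
# (Assumed validation rule; the real pipeline's checks are not shown here.)
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}
for rec in records:
    missing = REQUIRED - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")

# Tally one dimension to summarize the batch.
reasoning_counts = Counter(r["reasoning"] for r in records)
print(reasoning_counts)
```

Each object then maps one-to-one onto a "Coding Result" table like the one shown for this comment, with `id` linking the codes back to the original comment.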