Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugx5JbLiC… : "we cannot do anything if its for betterment of humanity, and if its not then it …"
- ytr_Ugydd4pFL… : "For me, that's the biggest thing people don't seem to realize. For most people, …"
- ytc_UgzvTLN1F… : "AI + Nuclear arms all over the globe = human destruction / Get rid of the nukes!!…"
- ytc_UgyqNyUvS… : "Woohoo AI says a group of people look the same? LMFAO!🤣 seriously. Get rid of th…"
- ytc_UgyLdCc_p… : "No, there are 100 percent people in this world who cares that people use AI for …"
- ytc_UgwfF9cxX… : "Ai nothing good can come from this creation. But worldly humans think it's cool.…"
- ytr_Ugye_KX4E… : "It doesn't, the only ones propping it up from time to time are people making bai…"
- ytr_Ugw4aFSYF… : "@TheShinorochi you really think, they don't need data from intelligent life form…"
Comment
I thought about artificial intelligence and consciousness a lot, and I've boiled it down to one simple question: if it fears death, it's conscious.
Although it's interesting because right now with our current technology and limitations as well as our not knowing what consciousness is, it changes. Currently I feel like true A.I. with consciousness is impossible, we don't even understand our own, how would we create and understand a robot's one then, and if we ever do figure out what consciousness is, the old question arises, if you clone your exact brain and everything, is that clone still you? Because if a robot's consciousness is a bunch of 1s and 0s, that would imply we can copy it, save it, and put it in another unit. At that point the A.I. doesn't have to fear death, because if it ever gets damaged, even its main chip or whatever, we have a backup somewhere safe, we can turn the A.I. off, and restore the backup, but then again, is that a new A.I., or the old one? Does the old one die and a new one get born that behaves and has the same exact memories as the old one, or is its consciousness transferred?
It's the same question we end up with when we ask ourselves about replacing things, or cloning. If we manage to figure out how to save and upload our consciousness into a computer, and I upload mine while I'm still alive, does that break the laws of physics? Because the same "thing" is in two places at the same time, and does that mean I should be seeing through two sets of eyes, the computer's vision and mine?
I fear that consciousness is a lot more fragile than we think, a bunch of neurons firing in specific patterns to create our thoughts, emotions and memories, and as soon as that firing is cut off, it's not us anymore.
youtube
AI Moral Status
2017-08-07T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzdc5ggMEKwzkcZ07V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1YurAqjrGaCWv-MV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuhpMbmFcwc1EwNSt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugwo7z4sOrI-LDRBRTN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxy1uFJLiB-VO4r-Fp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzUoGaTpCJ_sGD8lHB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzGQ2081wUKl_7ojaV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzx0n3rUBZhenf_JPh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugja24tjkz6vPHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxkgXw-9xAY0JKPR2x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
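Before a raw response like the one above is stored as a coding result, each record can be checked against the codebook. A minimal sketch in Python, assuming the allowed values per dimension are exactly those visible in the sample response (the real codebook may contain more; `validate_codings` and `ALLOWED` are illustrative names, not part of the tool):

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Records missing an ID or using an out-of-codebook value are
    skipped rather than silently stored.
    """
    by_id = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        if all(record.get(dim) in codes for dim, codes in ALLOWED.items()):
            by_id[cid] = {dim: record[dim] for dim in ALLOWED}
    return by_id

# Hypothetical record in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
codings = validate_codings(raw)
print(codings["ytc_example"]["emotion"])  # prints: indifference
```

Indexing by comment ID also serves the "Look up by comment ID" view: once validated, a coding result row is a single dictionary lookup.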