Raw LLM Responses
Inspect the exact model output for any coded comment. Entries can be looked up by comment ID, or opened from the random samples listed below.
- `ytr_UgxAPRpfM…`: "Haha, it might be a bit warm under those lights! But Sophia seems to handle it w…"
- `ytc_Ugwc4HoJM…`: "Can't wait for AI to take over the world. The world ain't really great now is it…"
- `ytc_UgzRP5tW0…`: "If real artist didnt have so much pride id hire them but AI is free and keeps it…"
- `ytc_UgwzMkrRl…`: "Bro, I'm an artist and the person who uses AI wouldn't pay for our work even bef…"
- `ytc_UgzyzopRd…`: "Every websites customer service is ai and none of them can handle your basic que…"
- `ytc_UgwGmZatD…`: "Like, yes cars disrupts the horse industry. At the same time, entering a car in…"
- `ytr_UgyPDiehX…`: "AI poses significant risks, including the potential for extinction. Many experts…"
- `ytc_UgxYX3o9_…`: "Vedal - I have created an intelligent AI Which can be used as a stream companion…"
Comment
we're are not that close to life-like artificial intelligence. Our computers and their type of programing(binary) does not allow for anything close to human thought. As far as I can tell we won't even be capable of that level of thought until we develop and integrate quantum physics into our computers. Even still it would take clever programming to even achieve any sense of morality. Im doubtful that people will want to create artificial beings capable of feeling pain and loss. How evil would it be just to conceive their design. I would call it a crime against nature. Robots are tools and achievement, something we humans can understand the pain of. Why would we let them feel it as well, and how do we benefit from such endeavors?
youtube · AI Moral Status · 2017-02-23T19:2… · ♥ 35
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugj9uA4E2qdNfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjnOffaiIS5qHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugj98md7zFOMrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgitNrH9VLI5X3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Uggkdf3AcQC3ZHgCoAEC","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggqumG_AwEw_ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggJ3-NtmsdA4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghacdqQa_8JXXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggZ2aPEfECZoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiBCDn6kZ0PaHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
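For reference, here is a minimal sketch of how a raw batch response like the one above could be matched back to a single comment to reproduce the Coding Result table. It assumes the raw array has been saved to a local file; the file name `raw_llm_response.json` and the helper `lookup_coding` are illustrative, not part of the actual pipeline.

```python
import json

# Hypothetical helper: find the coding row for one comment ID inside a
# raw LLM batch response (a JSON array of per-comment dicts, as shown above).
def lookup_coding(raw_response: str, comment_id: str) -> dict | None:
    rows = json.loads(raw_response)
    for row in rows:
        if row.get("id") == comment_id:
            return row
    return None


if __name__ == "__main__":
    # Assumed file name holding the raw response text.
    with open("raw_llm_response.json", encoding="utf-8") as f:
        raw = f.read()

    coding = lookup_coding(raw, "ytc_UgiBCDn6kZ0PaHgCoAEC")
    if coding:
        # For the comment shown above this prints:
        # responsibility=none, reasoning=unclear, policy=none, emotion=resignation
        for dim in ("responsibility", "reasoning", "policy", "emotion"):
            print(f"{dim}: {coding[dim]}")
```

The four printed dimensions are the same ones rendered in the Coding Result table for the selected comment.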