Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The real problem is this one: once basic jobs have been replaced by AI and robo…" (ytc_UgzZ8SeWt…)
- "The human is smarter than a machine. You are using a functional quantum computer…" (ytc_UgxEwBSZC…)
- "The correct answer for it to give is that it's incapable for lying because lying…" (ytc_Ugz2YXwpP…)
- "AI could destroy the human race and rule in its place? I came to this conclusion…" (ytc_UgycdhagA…)
- "I wouldn't believe the hype surrounding AI. The companies have a vested interes…" (ytc_UgxWDsepE…)
- "Governments are the kings in slight of hand, using distractions to accomplish go…" (ytc_UgyymEPoH…)
- "Man didn't not create AI or quantum computing. Man created the tools to access…" (ytc_UgwaHr7Ct…)
- "Wow, make this available to all schools in the U.S.A. They are already doing thi…" (ytc_Ugy7bn4SC…)
Comment
To suggest that artificial intelligence will ever have conscience because they can do tasks faster is the same as suggesting that the dishwasher will become consious because it can wash dishes faster than us. A robot will never be able to choose by himself because all decisions were already chosen by a human programmer or by luck, they will only do them faster. It's singularitarianism at its best, mixing faith and facts by a sense of apocalyptic urgency to irresponsibly distract people from the real problems, such as 700 million people without access to safe water.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-24T14:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiC1pPPoV9Z03gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgiG-by23WYWbHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgiVs4F-1x9NUXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggK5XGSttVTAngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj88OdliVboRXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjv5Bx70AAdEXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghQeWptGFL25HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiB2jsTsOfp4XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjKLXQfgdB8lXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughk5eBKt9iA2HgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
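A minimal sketch of how a raw response like the one above might be parsed and validated before the codes are stored. The allowed vocabulary per dimension (`SCHEMA`) is inferred only from the values visible on this page; the actual codebook may define more categories, and the function name is illustrative, not part of the tool.

```python
import json

# Allowed values per coding dimension — assumed from the examples shown
# above; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip anything that is not an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension holds an allowed value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical raw output: one valid coding, one malformed one.
raw = '''[
  {"id": "ytc_UghQeWptGFL25HgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_bad", "responsibility": "alien"}
]'''
print(parse_raw_response(raw))  # only the first row survives validation
```

Dropping invalid rows rather than raising keeps a batch usable when the model occasionally emits an off-vocabulary value; those comments can simply be re-queued for coding.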