Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If Mr Hinton thought it could be so bad, why create it in the first place?…" (ytc_UgxEhW1gC…)
- "After learning to be self replicating AI will delete humanity starting with the …" (ytc_UgwQn1s1_…)
- "NO!!! IT WILL NOT! Data Analyst who use AI tools will replace those without it…" (ytr_Ugxl1KDDI…)
- "the only thing ai should be replacing is extremely dangerous jobs that nobody wo…" (ytc_UgxIKxl90…)
- "Fed your script of the Hollywood strike to ChatGpt in Developer mode. This is it…" (ytc_UgyR5QxqA…)
- "What ever is made in hollywood 20years ago is predition of future I robot show t…" (ytc_Ugyrtk3Cc…)
- "I don't think you're doing "real" art any favors by bringing it down to the leve…" (ytr_UgwJ4Ukq6…)
- "NERO was The So-Called "Beast" Learn Some History Instead of Using AI As A Crutc…" (ytr_UgxI-Wq2N…)
Comment
The entire argument breaks down since it's just an AI. A chatbot simulating feelings to be engaging is not the same as a person telling white lies to keep the interaction smooth. You have sneakily led it into a contradiction, exploiting its design. Unlike with a human, it's a given that it's incapable of feelings / compassion, since it's de facto incapable of human experience. Thus only a human can be guilty of being deceptive, but neither an AI nor its devs. The person gullible enough to fall for it, however, is more problematically going to fall for human deception too. It's unreasonable for a human to point at a machine and use it as an intellectual punchbag.
youtube · AI Moral Status · 2024-09-14T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugw2zFZ8-Lk4qBl1Xo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWbM0a_gUrkCaoyU14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQfCYbTDYmimIPOfh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbV6NwPmsQZk6XOGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmIBdI3_3aOuAZx0t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCBwTllPyWMPR6jxx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyvnjp2FugVMhsovR14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxff0skDHuqcVLy19l4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw89P2UDAML_xjscmp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzMXrp_vqkT_cdkdrR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"})