Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You should try "AI art is no different from photography!" next. I've seen that o…" (`ytc_Ugz5AZ6Kg…`)
- "@MASKEDB Stop insulting abstract art by comparing it with the AI generated conte…" (`ytr_UgwonoSMf…`)
- "Biggest tech scam ever. Bigger than Segway.. Curved Monitors.. 3DTV's.. Facebook…" (`ytc_UgxXrh5nf…`)
- "This subreddit is being astroturfed, but not by Anthropic. First of all, "Claud…" (`rdc_obyuy7s`)
- "IA learning works the same way the human brain works, it takes information to ma…" (`ytc_UgzAN5jGI…`)
- "No, AI, like any other piece of tech, does exactly what we tell it to do, and th…" (`ytc_UgzlNS7h6…`)
- "As a neurosurgeon I don't see a robot managing massive hemorrhages, infarcts or …" (`ytc_UgwvEGNAM…`)
- "Personally, I worry about the impact AI will have on the job market. AI's relent…" (`ytc_UgwdPrQuP…`)
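A spot-check sample like the list above can be drawn with a few lines of Python. This is a minimal sketch, not the app's actual code; the comment IDs below are hypothetical stand-ins for the corpus, and the fixed seed is an assumption so the sample is reproducible.

```python
import random

# Hypothetical coded-comment IDs standing in for the full corpus.
comment_ids = [
    "ytc_example01", "ytr_example02", "ytc_example03", "rdc_example04",
    "ytc_example05", "ytc_example06", "ytc_example07", "ytc_example08",
]

rng = random.Random(0)  # fixed seed: same spot-check sample on every run
sample = rng.sample(comment_ids, k=3)  # draw 3 distinct IDs without replacement
print(sample)
```

`random.sample` draws without replacement, so the same comment is never shown twice in one spot-check batch.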
Comment

> i think robots would only really demand what benefits them specifically. they'd probably demand things like more memory, stronger processors, ect. Also i don't think they'd have a purpose for pain. see pain teaches us what to avoid but an AI with assess to the internet can instantly learn the weakness to every single one of the components and it can prioritize that information so it doesn't erase it if it's memory gets full and it need to get rid of less important information. also it would really depend on how the AI is used. If an AI is put in command of a banking system and doesn't have access to any information that doesn't concern what it's incharge of then will it even care about concepts like rights? Will it even fully understand currency or will it only think of it as numbers that constantly change and move?

Platform: youtube | Title: AI Moral Status | Posted: 2017-02-23T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
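A coded record like the one in the table above can be sanity-checked against the codebook before it is stored. The sketch below is an assumption, not the app's validator, and the category sets are only the values observed on this page rather than the authoritative codebook.

```python
# Category sets inferred from the values visible on this page; the real
# codebook may define more categories per dimension.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "distributed", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"mixed", "indifference", "approval", "outrage", "fear"},
}

# The coded record shown in the table above.
record = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}

def invalid_fields(rec: dict) -> list:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, val in rec.items() if val not in OBSERVED.get(dim, set())]

print(invalid_fields(record))  # an empty list means every dimension is valid
```

A non-empty return value flags records where the model emitted a label outside the codebook, which is a common LLM-coding failure mode worth catching early.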
Raw LLM Response
```json
[{"id":"ytc_UghrMjkvJvyYYngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggI3A8osDidtHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghmvI-rbPE643gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghqU14UzYTlX3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj1f08yN6lvxngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugi6L3X2cbXbKHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UggpTYlx4yYgFXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugj0GG7r64jiHHgCoAEC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggqmDrEGGZ5_3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg0POrMdU18w3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]