Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
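Under the hood, lookup amounts to an index from comment ID to stored record. Here is a minimal sketch in Python, assuming each coding batch is saved to disk as a JSON array; the raw_responses/ directory and file naming are assumptions for illustration, not the tool's actual storage:

```python
import json
from pathlib import Path

# Assumed layout (not necessarily the tool's actual storage): one JSON
# array per LLM call, e.g. raw_responses/batch_0001.json, batch_0002.json, ...
RAW_DIR = Path("raw_responses")

def build_index(raw_dir: Path = RAW_DIR) -> dict[str, dict]:
    """Map each comment ID to its coded record across all stored batches."""
    index: dict[str, dict] = {}
    for path in sorted(raw_dir.glob("*.json")):
        for record in json.loads(path.read_text()):
            index[record["id"]] = record
    return index

index = build_index()
print(index.get("ytc_UgybtjBUk39J3illv054AaABAg"))  # an ID from the batch below
```

Keying on record["id"] means a later batch silently overwrites an earlier coding of the same comment; whether that is desirable depends on the pipeline.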
Random samples — click to inspect
- "I think, until we swap to photonic CPUs, AI will be too vastly limited by energy…" (ytc_UgyrvXPfB…)
- "1) First Robotics Automation take over automobile, Mobile manufacturing etc 2) A…" (ytc_Ugwc4JX2e…)
- "Parents, talk to your kids. It's embarrassing but it's much better than this out…" (ytc_UgxEKE6se…)
- "It sounds like you’re pointing out how small and seemingly insignificant robots …" (ytr_UgznqdufQ…)
- "he'd better talk how AI will change formatting in MS Word which has been disgust…" (ytc_UgzLgPsVv…)
- "There are multiple AI pioneers on this list, including the dude who invented dee…" (rdc_je3s8wj)
- "Some of the original DAN prompt was blurred -- did anything in that prompt direc…" (ytc_UgxtxFTRd…)
- "As an artist from birth (yes I was drawing since the womb) I find art as a chall…" (ytc_Ugz7uhruv…)
Comment
The premise of this book is a seriously anthropocentric, myopic take. A super-intelligent AI wouldn't need to turn Earth into a chip; it doesn't require the same environment we do to "survive". Chances are, they wouldn't see the point in staying here and competing with us. Truth is, we don't know what they'll do, but getting rid of us is litterally a waste of energy. Now I'm not saying there are 0 risks, but they are much lesser logically IF they are truly super intelligent. Dumb AI is objectively much more dangerous. And by the way we already have a paperclip maximiser AIs, it's called capitalism and corporations and it runs on people instead of silicon.
youtube · AI Moral Status · 2025-10-30T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
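Each dimension takes one value from a small closed vocabulary. As a rough sketch of how a coded record could be validated, using only the code values visible on this page (the project's full codebook may define more):

```python
# Allowed values per dimension, taken from what is visible on this page;
# the real codebook may include additional codes.
CODEBOOK = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    return [
        f"{dim}={record.get(dim)!r} not in codebook"
        for dim, allowed in CODEBOOK.items()
        if record.get(dim) not in allowed
    ]

# The coding shown in the table above passes cleanly.
assert validate({"responsibility": "none", "reasoning": "consequentialist",
                 "policy": "none", "emotion": "indifference"}) == []
```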
Raw LLM Response
[
{"id":"ytc_UgxdWB2GvyUuqIVlCi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgybtjBUk39J3illv054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6W6lYH1D8Uj9Bwxl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzLmQDK4VS0RkkLAUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw83iGH3FmGlHOpS314AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw5gyINpG8jmJV9s6V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzelWm4EbPVk114lMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyk7e-1BrjucVChMBR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBdApmyz7dTqviZ154AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz590g8tnUELebYGlN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
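A batch response like the one above is a plain JSON array, one record per comment. A defensive way to turn it back into per-comment codings might look like the sketch below; the file name is hypothetical, and real model output can include fences or prose around the array:

```python
import json

def parse_raw_response(text: str) -> dict[str, dict]:
    """Decode one batch response into a {comment_id: record} mapping."""
    # Models occasionally wrap the JSON in markdown fences or stray prose;
    # slicing from the first '[' to the last ']' tolerates both.
    start, end = text.index("["), text.rindex("]") + 1
    records = json.loads(text[start:end])
    return {r["id"]: r for r in records}

raw = open("raw_response.txt").read()  # hypothetical file holding the text above
coded = parse_raw_response(raw)
print(coded["ytc_UgzelWm4EbPVk114lMd4AaABAg"]["emotion"])  # prints: indifference
```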