# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Comment
> @ 9:30 If Moravec's Paradox is real, and I suspect it is, then the robots, the harder side of the paradox, comes later, but do not be surprised if this harder end, was solved by the "simplest robot" (described in a university lecture by Scott Kuindersma of Boston Dynamics).
> On March 13th, 1997, on the day of the "Phoenix Lights", the Table of Contents of the machine, describing the "simplest robot" was entered into a computer on the 2nd floor of a student's apartment on McClintock and Don Carlos (called "Arizona Sun and You") where David Parker observed a "large craft" fly over his head, about 6 miles due south, on the 60 freeway and McClintock, where he could " throw a rock at".
Platform: youtube · Video: AI Moral Status · Posted: 2026-04-03T19:3… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
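The four coded dimensions above take categorical values. As a minimal sketch of how a coding like this could be checked against a codebook, the following uses only the values observed in this batch (the actual codebook may define more; the `validate` helper is illustrative, not part of the pipeline):

```python
# Categorical values observed in this batch; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "fear"},
}

def validate(row: dict) -> list:
    """Return the dimensions whose value falls outside the known codebook."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "indifference"}
print(validate(coding))  # -> []
```

An empty list means every dimension holds a recognized value; any unexpected value (e.g. a model hallucinating a new category) is surfaced by name.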
## Raw LLM Response

```json
[{"id":"ytc_Ugxhff7zUkGgWaSvjY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVBzh62BZY7L9Fb7p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz3zO_mb0KhFASF-XF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLLWKpNzBbKwgI_Sp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzSk3HS_qdWGJjhWXR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzXEol_aCvFPj5QMxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwI5guYDqqw_VgXaw14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwE-6nNrg0tnTUKo8F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwH-dfWFh6F7nzcv6t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy3ZLbge_WUngoFxRV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```