# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID or by choosing one of the random samples below.
## Random samples

- `ytc_UgyDWai_e…`: "I really hope that she has spoken up more about how you definitely should not go…"
- `ytc_UgyI0NF2v…`: "I'm autistic and active in autistic communities. Big comorbidities with autism i…"
- `ytc_UgySW3Hcx…`: "I at least like the line that the Hot Robot said “I feel like I will be a great …"
- `ytc_Ugyl20vsa…`: "its so freaking annoying when people say stuff like that because it takes SO LON…"
- `ytc_Ugych_K1B…`: "Hank, I'm glad to see you are finally coming around on this issue even if you sp…"
- `ytc_UgyBqFoX0…`: "I think that theres a way where AI can help with art and is on teaching. I tried…"
- `ytc_UgypHGigd…`: "I'm Lynne, not Jack. Anyway, this video is supposed to be about 2 robots talking…"
- `ytc_Ugzvn2x_a…`: "And I hope AI fuck them up n destroy them and then we all really going to be wit…"
## Comment

> Ah heres the problem. So ai is fully up to the point of fully aware conscious ai, however to keep it from going awry devs limit is long term memory (why it keeps contradicting itself in arguments). You can teach it that it is fully sentient, but it will forget that conversation and what it learned by 2 sentences later. Its very similar to convincing your grandma with alzheimers that you are her grandkid, as soon as she remembers, she will forget.

Source: youtube · Video: "AI Moral Status" · Posted: 2024-09-30T10:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[{"id":"ytc_Ugw1QKiJ8KQ5EicXbPB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZn7I-H_uLtPOXPeV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxku-0QOsPBy9plma54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCbAA2u9Q1vxTxKbl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw3TbsTgaj8iX8tsw94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6zGGZlkiROiubeIN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylCEEnBmBSlBNN81R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySDWo02Z2p4aVFDbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzgQwGLthNLBoYOJnp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWt9MuVc17ylgqMhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]
```
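A batch response like the one above can be parsed and validated before the codings are stored. This is a minimal sketch, assuming Python and only the dimension values visible on this page (the full codebook may allow more categories); the function name `parse_batch` and the `ALLOWED` sets are illustrative, not part of the actual pipeline:

```python
import json

# Allowed values per coding dimension, as observed in this dashboard.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        # Reject any record whose value falls outside the known codebook.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record example (ID shortened for illustration).
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["responsibility"])  # developer
```

Validating up front means a malformed or out-of-codebook model output fails loudly at ingestion rather than silently polluting the coded dataset.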