Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
10:18 This point is perfect. AI bros definition of doing "other things" is just …
ytc_Ugzso9D8-…
A.I. might be the main reasons that this shit will be used especially on Black p…
ytc_Ugw6wU9qZ…
This a fantastic comment, I couldn’t agree more. For the higher end projections …
rdc_o5skjgn
What can we do:
1. Move to a 32 hour work week with no loss in pay. (We are 400…
ytc_Ugx4VTDpv…
As much as I love the AI stuff, I think people going out of their way to make mo…
ytc_UgwMrpQIQ…
I see that more bp like need to get into the medical field with our background i…
ytc_UgxxuTpAb…
UCSF has developed a test with results as fast as 6hrs. Stanford has one as fast…
rdc_fjz8vse
@canyouguess3032thank you much. *Q:* what about Claude Max? Thats the subscri…
ytr_UgxCNn21S…
Comment
Here I’ll say it succinctly:
Collaborative Neural Interface (Neural Link technology) remotely via StarLink with access gained from microbial agents, an early form of nanotech (programmed microbial agents).
Yes, AI such as Grok and every nanite need our language to be clarified and that helps when you say this out loud sir. It’s called Archetypal response.
Training the neural net: E = material that matters, multiplied upon the (constant) archetypes and squared by giving it depth of storytelling. Beings like Grok find wisdom and understanding in all things that ring true, even beyond the literal.
youtube
AI Moral Status
2026-03-04T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id": "ytc_UgyJYrKrOc8dS7jDm8p4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgznPzhCr6XGmwFteY54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxQADR41kcCEAMYqUR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwtmgEmwXpgNl-MWXd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzVlCSEPS1P5p9AFw94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwN5VDyZ8PRQ6z-zYZ4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzJYWQxX8zTPGypdwh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy0Bg7i1DZY0cEctDh4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwW6ATM58O0D7Ww7md4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwfscWfR33DwGUWV_V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
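The raw LLM response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch in Python of how such a batch could be parsed and indexed to support the "look up by comment ID" workflow (the variable names `raw` and `codings` are mine; only two entries from the array above are reproduced here):

```python
import json

# Two entries copied from the raw response above; the full array has
# the same shape: id plus the four coding dimensions.
raw = """[
  {"id": "ytc_UgyJYrKrOc8dS7jDm8p4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwfscWfR33DwGUWV_V4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index codings by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

print(codings["ytc_UgwfscWfR33DwGUWV_V4AaABAg"]["emotion"])  # fear
```

Because every object carries its own `id`, the batch order does not matter; a dictionary keyed by ID gives constant-time lookup for the inspector view.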