Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below (a scripted version of the same by-ID lookup is sketched at the end of this section).
Random samples (click to inspect):
- Who needs the real deal when you can have, well err the near real deal, err or i… (ytc_UgzCFX_RD…)
- you don't need robots to kill the trades. The knowledge os whats valuable, you w… (ytc_UgwHU_W62…)
- Wow wow wow. This is freaking scary ass crap. Put these makers in prison now. Th… (ytc_UgzmNrN_G…)
- Ya know, it's sad none of these "scholars" understand the true nature of AI and… (ytc_UgxTsiiW_…)
- I'm a disabled artist. I have severe fatigue problems, and I can't draw for long… (ytc_UgzpULZg0…)
- "Brain-Jacking risks in 2026 are real, but defense has evolved. We just validate… (ytc_Ugz2Uv_w1…)
- what asmon said its quite right, im an artist, and it will happen, because once … (ytc_UgzGFADWW…)
- Laughable material. Drama queens. Can be reduced to a few phrases. “Lots of DC a… (ytc_UgwJ1VS2t…)
Comment
4:55 This is getting close to The Big Question(s), in my opinion. Today we have LLMs. LLMs are not intelligent or self aware. So two things that come to mind for me are:
1. Will today's AI actually be the technology that leads to actual "Intelligent" machines (in other words, "will what got us here, get us there?")
2. How long will that take?
I think about electronics, for example. Many of today's electronics were first discovered/theorized/implemented in the 1800s to early 1900s, but weren't properly produced and deployed into the world until the 1960s to 1970s, I believe mostly because of the solid-state transistor. That was the quantum leap that led to a whole new way of doing things. It was a proper technological revolution. So the transistor of the AI world... we may not hit it for 100 more years. Or ever. Or we may hit it in 5 years. I wonder if actual AI experts have more insight on this topic.
youtube · AI Moral Status · 2025-10-30T21:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response (the full batch output that includes this comment)
[
{"id":"ytc_UgzHCH_7D3Io1A9ZfUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydK4YU0WvkkXDhLZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyLW75ItQyohqOU8-x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyi3pryPPZ16W5-jrN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAcSPetC-PdFpwvhx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyCbY8TYZcio_FCw7B4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxqV2VekvkpMAdPBXd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwR5aqfElxaSpKXGOl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyOXNQrSMo9rDaxXcJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz35HnxfBiL56aUr4J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
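The same lookup can be scripted against an exported response. Below is a minimal Python sketch, assuming the raw response shown above has been saved to a local file named raw_llm_response.json (the file name is an assumption for illustration; the id, responsibility, reasoning, policy, and emotion fields are exactly those in the response above).

```python
import json

# Load one raw LLM response: a JSON array of coded comments,
# shaped like the batch shown above.
# NOTE: "raw_llm_response.json" is an assumed file name for this sketch.
with open("raw_llm_response.json", "r", encoding="utf-8") as f:
    coded_comments = json.load(f)

# Index the batch by comment ID so any coded comment can be looked up directly.
by_id = {entry["id"]: entry for entry in coded_comments}

# Look up a single comment by its ID (an ID taken from the response above).
comment_id = "ytc_UgzHCH_7D3Io1A9ZfUt4AaABAg"
coding = by_id[comment_id]

# Print the four coded dimensions, mirroring the "Coding Result" table.
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coding[dimension]}")
```

Run against the batch above, this prints responsibility: ai_itself, reasoning: consequentialist, policy: none, emotion: indifference for that ID.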