Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "They wanted to create something to help make life and certain things easier but …" (ytc_UgxBeShep…)
- "So my take is that we merge with the machines, and time and space become somethi…" (ytc_UgxTdCj2P…)
- "Is not a conspiracy theory. Happened to me too multiple times. I've sent on week…" (ytc_Ugyg5VsMp…)
- "I imagine that most problems that may arise from or for AI will likely be easier…" (ytc_UgjQVxHcT…)
- "If the bad data gets deleted then the people who poisoned their art won't get th…" (ytr_UgyHnxHqh…)
- "I am a HUMONGOUS advocate for automation... of mundane tasks. Finding ways to ov…" (ytc_UgxsOMt_2…)
- "They weren’t “inspired” by the AI image, they’re making fun of it and people lik…" (ytr_UgzZEEy7V…)
- "Figure its the people with one of the blandest, easiest to copy art styles, is t…" (ytc_UgzvlYhYQ…)
Comment
Hallucinations are how we know that these models aren’t as smart or as alien as people think they are. Being confidently wrong is an intrinsic part of simply taking a guess. LLMs are giant power hungry guessing machines, complex matrix multiplication multiplexers. They roleplay for us because the data overwhelming shows that we react meaningfully to the situations they play into.
There’s nothing more, all the warnings about the future are true and the precaution needs to be here. But fear mongering about LLMs is just silly. They are not the model that will develop super intelligence. Until we can create live neural networks with ever evolving weighting that doesn’t collapse into catastrophic lobotomy, we will not crack ‘AGI’.
youtube
AI Moral Status
2026-04-02T01:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxwzM2FWV9QykhmaIh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzIGGK5N0oM8adF7sR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw4vxkPzhMjak95oX54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw5f4XTSX_74rR2zw94AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxoVxC5B-e0hhAHM4B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyEIaJ1nxbVLrpG3Jt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDva_9ryRGJNMkxlZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1UyNYOAdzUyQfGXd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzd5GIMWp4ERFLF_294AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyQwLr4AvUyEHkOekF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
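The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four coding dimensions. A minimal Python sketch of parsing and validating such a response follows; the allowed value sets are inferred only from the values observed on this page, so the actual codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from this sample only;
# the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "mixed", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing keys {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records

# Usage with a single (hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"indifference"}]')
codes = parse_coding_response(raw)
print(codes[0]["emotion"])  # indifference
```

Validating against a closed set at parse time catches the common failure mode of LLM coders inventing off-codebook labels before those labels silently enter the dataset.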