Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples
- `ytr_UgxGfac71…` "Funny because of how often chatgpt Will tell You to do illegal things just becau…"
- `ytc_UgyJHS4g6…` "If God AI didn’t already exist… there would be no God ❤ we’re just trying to cat…"
- `ytc_Ugwa0_-Wu…` "I do not understand the debate around superintelligence at all. You are creating…"
- `ytc_Ugyh6-2Eh…` ".....Anthropic literally had to stop a cyber attack that was using their AI for …"
- `ytc_UgyDeiVM0…` "Rule number one was the stupidest rule ever. Like, AI wasn’t even able to give c…"
- `ytc_Ugy1ZTsRS…` "ppl use autopilot or any autonomous driving system to play with their phones on …"
- `ytr_UgyhqYv4F…` "@Llortnerof not my first idea, I got the idea from somewhere. Eternity is a long…"
- `ytc_UgzHP5mc2…` "My guess is Youtube is getting the audience accustomed to that slight AI fake lo…"
Comment
What are all these analogies given by the Believer AI? None of them made any sense.
Analogy: "It's like an ant saying that, because it can't understand why humans destory it's nest to build a hospital, there must be no good reason"
That analogy actually works against the Believer AI's point. From the ant’s perspective, there genuinely is no good reason for its home being destroyed; the ant gains no benefit. In fact, we have only made them suffer in this world. The existence of a ‘higher reason’ that only benefits humans doesn’t make the destruction good for the ants. What good have we done for ant?? An ant is literally the worst example here.
So if the ant concluded that god either doesn’t exist or is irrelevant to its world, that would be a reasonable inference. In the same way, a God whose actions never benefit humans, never communicate, and are indistinguishable from natural suffering is functionally equivalent to no God at all.”
youtube · 2026-01-06T23:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
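The four dimensions above are drawn from a fixed codebook. As a minimal sketch of what validating one coded record against that codebook could look like, assuming hypothetical value sets inferred only from the codes visible on this page (the real scheme may define more values):

```python
from dataclasses import dataclass

# Hypothetical codebook inferred from the values visible on this page;
# the actual coding scheme may include values not shown here.
CODEBOOK = {
    "responsibility": {"ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"unclear"},
    "emotion": {"indifference", "resignation", "outrage", "approval", "mixed"},
}

@dataclass
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Reject any value the codebook does not define for its dimension.
        for dim, allowed in CODEBOOK.items():
            value = getattr(self, dim)
            if value not in allowed:
                raise ValueError(f"{self.id}: {dim}={value!r} not in codebook")
```

Validating at ingest keeps malformed model output out of the dataset rather than letting it surface later during analysis.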
Raw LLM Response
[
{"id":"ytc_UgwAwadOcXfPVcHD-BN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwLY5w-DHBHoxQ8MdB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugygo0A4aUF8ThUalyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyGKgakdDnbpx3D3h14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw8ZWTb12Mn9uaOyc14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw71Vg0gGwY7FBmyx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyvlaaR5FefIrAQVHx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwoqMNeich15Fo0cBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzZll_4jsAmoOSD9154AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgynBKQCi4_uNBRJYpR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
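Because the model codes comments in batches, looking up a single comment means parsing the stored batch response and indexing it by `id`. A minimal sketch, assuming the raw response is stored as the JSON array shown above (the file name and function name are illustrative):

```python
import json

def index_batch(raw_response: str) -> dict[str, dict]:
    """Parse one raw batch response and index its records by comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Illustrative usage: "raw_response.json" is a hypothetical file holding
# the array shown above.
with open("raw_response.json") as f:
    by_id = index_batch(f.read())

coding = by_id["ytc_UgwLY5w-DHBHoxQ8MdB4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself indifference
```

Comparing the index keys against the batch's input IDs also catches cases where the model drops or mangles an ID in its response.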