Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- "Vote for laws that restrict A.I images, So it remains as assistant text, and not…" (ytc_UgzH2AcWT…)
- "And psychopaths like to tell themselves they make more logical decisions then th…" (ytr_UgyUK8ayV…)
- "@johnsmithy7918 All of the authors involved in this lawsuit have their books for…" (ytr_Ugy8mrgxS…)
- "We don't need AI. It's going against the most powerful ruler of all, God. He's t…" (ytc_UgwDxHEgf…)
- "@Stiwardehamstrongue5134, thank you for your comment! I'm truly impressed that y…" (ytr_Ugz276ivm…)
- "Another issue or concept to consider is what impact it would have on humanity in…" (ytc_UgwBxm56o…)
- "Congratulations Elon Musk The Man The Myth The living Legend absolutely 🚀🇺🇲🚀💕💖💕 …" (ytc_UgxApTgo3…)
- "Ai slop is like watching millions of people try Powerpoint and use tons of trans…" (ytc_UgyOUXDOi…)
Comment
Sabine hits the nail on the head regarding AI’s 'black box' issue. The fundamental reason behind these 'unfixable' problems is that we are still relying on probabilistic language simulation rather than structural logic. I've been exploring a framework designed to move AI from mere 'next-token prediction' to actual 'logical construction.' If we don't fix the underlying architecture, we are just building on sand. Anyone else thinking about the shift from Prompt Engineering to Logic Governance?
Platform: youtube
Timestamp: 2026-04-02T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz6OOdMfn6y-cyq4aN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwClhTmjvC4_RCcNaF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzC82t813B0PADVMal4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyC_035_l494Y8DZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz4Vv7kpGlkANwLfsl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzdCrNVtyGe7ZyGsXx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzA91hdgQqC00T-qsN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugze9WD8OEAk165fGvp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwC6Z0d1-MPT3lg2GV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxfCip9JXNkjxBgol54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
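The lookup-by-comment-ID view above can be reproduced offline: the raw LLM response is a JSON array of coding objects, one per comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch, assuming the response is held in a string (the variable names are illustrative, not part of the tool; the two records are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
  {"id":"ytc_Ugz6OOdMfn6y-cyq4aN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugze9WD8OEAk165fGvp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {c["id"]: c for c in json.loads(raw_response)}

# Look up one coded comment, as the dashboard does.
coding = codings["ytc_Ugze9WD8OEAk165fGvp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer approval
```

Parsing the full array and indexing by `id` is also a cheap validity check: a malformed model response fails in `json.loads` rather than silently producing a partial table.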