Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
IMO, it's not about fooling an AI, it's more so the humans that need to be fooled. The humans will use AI detectors, but the devs of the software will always have blind spots, and the sheer slop rate means that the blind spots might be too fast for any to catch.
youtube 2025-11-19T18:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx1n9ps634Lcd30M-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwrjC4pODznXyr2zfF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxSjbtoItTh12BD9uB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz42x4NOPi83vYKyuV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQwuXX7EgIofuNAK14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyzl7-nIM4p0538BVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8e6eOGKiAgnXd2bh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyonAHvnsIsrGz1OWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzosFRiemlOFkoLKeZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy0MnG7tYFEa30h4MZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
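A minimal sketch of how a raw response like the one above could be parsed into per-comment coding results, assuming the model always returns a JSON array of objects carrying an `id` field plus the four coding dimensions shown in the result table (the helper name `parse_codings` is hypothetical, not part of the tool):

```python
import json

# One record from the raw LLM response above, used as sample input.
raw = ('[{"id":"ytc_UgwrjC4pODznXyr2zfF4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')

def parse_codings(raw_response: str) -> dict:
    """Index coded dimensions by comment id for quick lookup."""
    records = json.loads(raw_response)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

codings = parse_codings(raw)
print(codings["ytc_UgwrjC4pODznXyr2zfF4AaABAg"]["emotion"])  # fear
```

In practice a validation step would also check that every dimension takes a value from the coding scheme (e.g. `emotion` in {fear, approval, indifference, mixed, ...}) before the result is stored.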