Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@casetheacetoo Stop commenting like you have any idea what you're talking about. You say "f you’re not able to realize it’s just code and we can program code however we want to, then I think you’re probably not a software engineer, or you’re just so caught up in the hype believing you’re part of an industry building a new god." But it was pointed out many times in this conversation that it absolutely is not code. It's not even software in the traditional sense. The code we write is just the algorithm for training the AI. Once the training happens, it's no longer running on code we wrote. It's running on a grown neural net black box. How do you insert code into that? But that's besides the point (it just indicates you have no understanding of the architecture of these things). The main reason we can't put guardrails in is that this is just a known optimization impossibility. There are mathematical theorems about this. Any optimization system will trade-off underspecified variables, and we can't program in every variable that's important to human life. There are just millions. It will find one of those and deplete it to optimize for its goals.
YouTube · AI Governance · 2025-10-17T15:3… · ♥ 14
Coding Result
Dimension        Value
Responsibility   user
Reasoning        mixed
Policy           unclear
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytr_UgyOjzmTIoLZXJQ8_614AaABAg.AOLq_Q-3U1jAOMOxQavCEm","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgyOjzmTIoLZXJQ8_614AaABAg.AOLq_Q-3U1jAOMZQfG2nIu","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytr_UgxHMLXRr5iWJK8TPC14AaABAg.AOKcxEEl8x_AOL8M15bXkM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytr_UgxBIFZaYUxl9uHUwgR4AaABAg.AOKakxbuc3SAOMoJcYULyg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"}, {"id":"ytr_UgxBIFZaYUxl9uHUwgR4AaABAg.AOKakxbuc3SAOO6NOe7PSi","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"outrage"}, {"id":"ytr_UgxBIFZaYUxl9uHUwgR4AaABAg.AOKakxbuc3SAOOyD59Zh8i","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytr_UgwnFgN90LTWyWMpQ1x4AaABAg.AOJlHYQMm7EAOLpvvvwYb0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytr_Ugx0eO84iCVdGa-cKip4AaABAg.AOJau-ynNbIAOKRp7QKg86","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugx0eO84iCVdGa-cKip4AaABAg.AOJau-ynNbIAOR8w2Bjz6b","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_Ugzt0860SlM1kjlnlQV4AaABAg.AOJZLkvQuUxAOLpkVsaMkt","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"} ]