Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@ParameterGrenze He's talking about how Elizier always assumes the worst outcome from a complex dynamic and almost unpredictable series of future events. It could just as easily result in something as good as eternal life. Eliziers main problem is that he imagines what ifs and appeals to emotion constantly. I've never seen anyone call him on his unsubstantiated assertions. Thomas here is saying any argument Yud uses could be equally applied to the opposite of what he says. for example: "What if it kills us all?" "what if it saves us all?" "well it could kill us all." "Yes but it could also save us all." He adds a LOT of implied logic, jumps in logic, and numerous fallacies. Which results in everyone trying to address him as being stuck in a reactionary cycle following his nonsense. For example: "If its not coherence then its not doing stuff and openai throws it out then builds an ai that is doing stuff and is more profitable. until everybody is dead." yes of course a company will make a product that works, but WHY would it automatically be doing that until everyone is dead? is it killing everyone? is that THE necessary goal of it? is he just saying it will do "stuff" until people pass on naturally? well he certainly appears to be suggesting it kills everyone and that is THE certain outcome. which it isn't. but wolfram when at the heart of it, the assumed death cult. the worst case which Elizier made a name for himself over. Eliezer dismissed that and went after openai>anthropic>meta and on the reactions go. They cut there and who knows what was said. they move on without addressing WHY he assumes the worst out of all of that.
youtube · AI Governance · 2024-11-12T08:2… · ♥ 21
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
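
The table above is the parsed form of one record in the raw response below: the record with id ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjRPne2vJB carries exactly these four values. A minimal Python sketch of that lookup follows, assuming the raw output parses as a JSON array of records; the function name coded_result and the variable names are illustrative, not part of the tool.

import json

def coded_result(raw_response: str, comment_id: str) -> dict:
    """Return the coding record for one comment from a raw LLM response.

    Assumes the response parses as a JSON array of objects, each with
    an "id" field plus the four coded dimensions.
    """
    records = json.loads(raw_response)
    matches = [r for r in records if r.get("id") == comment_id]
    if len(matches) != 1:
        raise ValueError(f"expected one record for {comment_id}, found {len(matches)}")
    return matches[0]

For the comment above, coded_result(raw, "ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjRPne2vJB") returns the second record below, matching the table: responsibility ai_itself, reasoning mixed, policy none, emotion outrage.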
Raw LLM Response
[ {"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAimhHmYdLd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjRPne2vJB","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjgxLAap2x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAmmuu4wRqq","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAokZRx4eik","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAqA9Jy7zvA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxJUObSq16yHsAotv54AaABAg.AAiV57RtZaaAAiWFsL8hXk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_Ugz_o_4UNQo_GhkZI294AaABAg.AAiUrQt1I1zAAidX4Bf-1N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxWZtuw753_fj8wEad4AaABAg.AAiRc3k91HjAAjBSmlsX0M","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}, {"id":"ytr_UgwqfDGCzxp8g181J_54AaABAg.AAiNXtAPvrsAArCz1Ewg7J","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"} ]