Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Because from the perspective of an ASI, it would be an optimal use of resources to transform the ecosystem into usable material for its own purposes. These systems, once trained to achieve goals, behave like super-optimizers. Such systems, endowed with intelligence completely surpassing us, will seek to preserve themselves, to self-reproduce and to maximize their power, through the logic of instrumental convergence, whatever their ultimate goals may be. This means, inevitably, eliminating all living beings to use the material they are made of for other purposes. The AI industry should not be allowed to continue this race for superintelligence. Until the alignment problem is resolved (and it is far from being resolved), we should prohibit, through international treaties, the construction of systems that exceed human intelligence. We have been able to make international treaties against nuclear proliferation, against human cloning, and against bacteriological weapons; we must do the same against AGI before it is too late and we lose control forever.
YouTube · AI Governance · 2025-08-02T23:1…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
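Each coding result maps one comment to a value per dimension plus a timestamp. A minimal sketch of how such a record might be represented, assuming field names mirror the table above; the example values in the comments are only those visible on this page, not necessarily the full codebook:

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    """One coded comment: a value for each of the four dimensions."""
    comment_id: str
    responsibility: str   # e.g. 'ai_itself', 'user', 'none', 'unclear'
    reasoning: str        # e.g. 'consequentialist', 'deontological', 'unclear'
    policy: str           # e.g. 'unclear', 'none'
    emotion: str          # e.g. 'fear', 'approval', 'outrage', 'indifference', 'mixed'
    coded_at: str         # ISO 8601 timestamp, e.g. '2026-04-27T06:24:59.937377'
```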
Raw LLM Response
[ {"id":"ytr_Ugx9p4XF2rrOCp0yAwB4AaABAg.ALK0CuyvTE3ALLELzOA5v7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_Ugx9p4XF2rrOCp0yAwB4AaABAg.ALK0CuyvTE3ALLUM1vRDiC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}, {"id":"ytr_Ugw9Q-7DN9A38yOfffp4AaABAg.ALJwsKfOMFWAMOL7fH3Wmi","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"}, {"id":"ytr_UgwoJFZ6CSfCHqfLWxB4AaABAg.ALJwC0go-40ALLVx2HbQXu","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgyVaiDcbu1CsmYWlwh4AaABAg.ALJmTKvShSQALMArcNP1vW","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgxUuMTpelrDoh6hbX54AaABAg.ALJihMhOBNPALNwkWq250u","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytr_Ugx4Dco6u_--9YGdzWd4AaABAg.ALJiOJxXjEiALJrmc2Jj0C","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_UgwUPHz2OQK8S4klE9d4AaABAg.ALJhQkLSGw2ALNDgF5r99U","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}, {"id":"ytr_Ugwj8m73Ln7TbEVMp9h4AaABAg.ALJhGnrpcNfALJiryE1VZf","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytr_UgwyeDOJtXbxCYqlVhl4AaABAg.ALJhCPXwmJYALJiTIin5n3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"} ]