Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Is that true though? The major considerations regarding AI aren't all that technical. The major questions are this kind of nature: If we create something far smarter than us, can we control it? Can we avoid doomsday? Should we build a machine god anyway? It would be a higher life form after all, maybe that is the purpose of humanity? If there is large uncertainty over whether it will remain benign, do we take the risk? If we can control it, and all jobs are automated, how can we ensure society doesn't collapse? Will the ones in charge of AI even want to avoid it? That seems dubious, they are investing heavily into doomsday bunkers... Will they have any incentive to provide any goods for the rest of us? Current AI companies are already clearly not aligned. These are more political and philosophical in nature.
youtube AI Jobs 2025-06-13T08:5… ♥ 4
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzSu1HcZk6p67hBdb54AaABAg.AJIdOFUna9lAJIq0GuuF5J","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwroUIfAmKaOwiHMtV4AaABAg.AJIXX91ziLyAJJPggqu_Qn","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgycFpCROyoKVhBC9E14AaABAg.AJIRoZ6hE4oAJJOewo0AaC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgycFpCROyoKVhBC9E14AaABAg.AJIRoZ6hE4oAJJlHAUJS0I","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgycFpCROyoKVhBC9E14AaABAg.AJIRoZ6hE4oAJJvSHdzojG","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgzU65YUZGLOWMPEyO94AaABAg.AJIR2iJMVofAJIxdbFzabe","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugyn5ps2Uq02zycHfOl4AaABAg.AJHy6Aax32AJIryefAGlA","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugy4Ow0AGzWAHF7Jr_d4AaABAg.AJHfLfZBe_FAJJOclqdcRz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugw13-szwsZNSx992H54AaABAg.AJHW95ClHJFAJIGllL4rGF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugw13-szwsZNSx992H54AaABAg.AJHW95ClHJFAJIrp1J8wWc","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"}
]
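The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) plus an `id`. A minimal sketch of how such a payload could be parsed, validated, and tallied is below; the shortened IDs and the `parse_codes` helper are illustrative assumptions, not part of the tool, though the field names match the response above.

```python
import json
from collections import Counter

# Example payload in the shape of the raw LLM response above (IDs shortened for brevity).
raw = '''[
  {"id":"ytr_a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_b","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_c","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]'''

# Assumed schema: every record must carry all four coding dimensions.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(payload: str) -> list[dict]:
    """Parse model output, keeping only records with every dimension present."""
    records = json.loads(payload)
    return [r for r in records if all(d in r for d in DIMENSIONS)]

codes = parse_codes(raw)
policy_counts = Counter(r["policy"] for r in codes)
print(policy_counts["regulate"])  # 2 of the 3 example records code policy as "regulate"
```

Dropping malformed records rather than raising keeps a batch run going when the model occasionally omits a field; a stricter pipeline might log or retry those IDs instead.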