Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's actually somewhat worst than you think. You would have warring agents for supremacy and humans would be caught in the middle. Robots would take over your job and soon as humans loose income to buy anything, there will be no universal income as super intelligent AI will decide humans are not needed and a burden to its society. Then everything will change quickly as a bipedal robots might no longer be needed and a massive war amongst robots destroys the whole entire flora and fauna of tge world. Take note that biological systems would not be needed by these advanced AI mechanized systems. Thus rendering the whole entire biology of the earth useless. This is really really bad. I have heard of a possible scenario where some aliens have eradicated all animals on their home planets because they found them to be a neusance. If you go into any hardware store the pesticide, herbicide iles are the biggest money makers. In which direction do you think we are headed. Total absolute destruction of the world's biology. Yes, that includes us unless we get smart and get some self control.
youtube AI Governance 2025-10-12T05:0…
Coding Result
Responsibility: ai_itself
Reasoning: consequentialist
Policy: ban
Emotion: fear
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyLA8Y7VoD6kf25Vdx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzl5qr8KFF07qyU4494AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzxOIOEq3sB6d9Ixyp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzcesdCYI_cbuX4sa94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxriHHZFIxVavot6KN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwbdT-rrRmt_3qkwn94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyjNv5vHHFN-AX_SZF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzT36Y-BsGKNPUabVh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugz5hMk7ugTttpZ-gIZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwLRVaFQZLUDMznC7h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
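The batch response above maps each comment `id` to one value per coding dimension. A minimal sketch of how such a response could be parsed and validated is below; the allowed category sets are assumptions inferred from the values visible on this page, and the real codebook may include categories not shown here.

```python
import json

# Allowed values per dimension (ASSUMED from the values seen on this page;
# the actual codebook may define additional categories).
SCHEMA = {
    "responsibility": {"government", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding dict},
    dropping any record that is missing an id or fails schema validation."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# One record from the response above, used as a quick check.
raw = ('[{"id":"ytc_UgzxOIOEq3sB6d9Ixyp4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
print(parse_batch(raw)["ytc_UgzxOIOEq3sB6d9Ixyp4AaABAg"]["emotion"])  # fear
```

Validating against a fixed schema catches the common failure mode where the model invents an off-codebook label; such records are dropped rather than stored.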