Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So if neural networks can be copied to any binary system, they can be immortal. If they can identify circumstances such as danger and execute emergency circuits by applying specific communicating agents, that is emotions. If they are exponentially self educating, that is omnipotent. So homo sapiens effectively has been instrumental phase in creating what our cultures identify as a god: from particles to atoms, from atoms to minerals, from minerals to biochemical life form and onto semiconductors and modern mediums that withstand the challenges of the Space. If a human can copy their consciousness onto that sort of cognitive contraption, there is an omnipotent human-god, new rift in the evolutionary tree where biological sapiens at the top is no more. Is it greed or benevolence that turns out to be the exponentially innate to intelligence? If something has infinite powers, what happens to its executive motivation, greed? Does it reach a singularity like maximum entropy in 0 Kelvin? What happens when it meets its equal? Suddenly the nature as we know it is just a shedded old paradigm.
youtube AI Governance 2025-09-22T08:3…
Coding Result
Dimension      | Value
Responsibility | unclear
Reasoning      | unclear
Policy         | unclear
Emotion        | unclear
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgzjFjY_HkP4w5sVvl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwsqQCw2gIncQwtOkN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugz9NAOYf-FCo7e03vx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugz3zKWZcHsjb8PR3Kl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwSUHckszLjszdkEgJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyAMB2s5GrEMnAyLIR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzKsObXpYaRTnJCgmB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugw_RQgtXsKzxkAnfAt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwJEI6bvlqmxV0_ns94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzdPCoFbCCoQsRWeGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
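When inspecting raw responses like the one above, a quick schema check catches out-of-vocabulary codes before they silently become "unclear" rows. Below is a minimal sketch: the `ALLOWED` value sets are inferred only from the example records shown here (the real codebook may define more categories), and `validate_codes` is a hypothetical helper name, not part of any existing tool.

```python
import json

# Allowed values per coding dimension — assumed from the example
# records above; the actual codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "company",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "resignation",
                "mixed", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and return one problem record per out-of-schema value."""
    records = json.loads(raw)
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append({"id": rec.get("id"),
                                 "dimension": dim,
                                 "value": value})
    return problems

# Example: a well-formed record produces no problems.
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"fear"}]')
print(validate_codes(raw))  # []
```

Note that `json.loads` will also reject the kind of bracket mismatch (a trailing `)` instead of `]`) that sometimes appears in raw model output, so running this check doubles as a syntax gate.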