Raw LLM Responses

Inspect the exact model output for each coded comment. The comment text and the raw response below are reproduced verbatim.

Comment
How this can hurt the world? I mean worst case? If making an AI looking sentient took roughly a network of data that responds to input, I can contemplate that we have been built by an intelligence on molecule level, whose 'paper' is the dark matter of DNA and whose computation material is the protein inreraction network. If this were true, not only have we been built by an inferior (looking*) intelligence, but we created a new one way way faster. Both cases this gets combined with evolutionary racing, which ultimately gets us to test the borders and boundaries of controllable AI, for example in forms of military equipment or troll farms, and ultimately make a failre which is the point when life breaks through. What I contemplate is thus that intelligence itself has an evolution, and we are on the brink of becoming obsolete. *Looking bc. our brain acts faster, but idk how much information is stored in DNA dark matter, that evolves by reproduction just as we evolve thoughts by conversation - see Dawkins' concept on memes...
Source: YouTube, "AI Moral Status", 2022-12-14T20:0…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          fear
Coded at         2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_Ugy3gFezSkfYoFndQpJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz7x7Biu2y5iIeZsoV4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzKUGaHNcGN7PuWMBJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwlN5eSOyS7c8YG-Fp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw50zbVcf43OA0Cowl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
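A raw response like the one above can be turned back into per-comment codings with a small parsing step. The sketch below is a minimal example, assuming the value vocabularies inferred from this single sample (e.g. `responsibility` in {none, unclear, ai_itself, company}); the real coding schema may allow other values, and the `parse_codings` helper is hypothetical, not part of any tool shown here.

```python
import json

# Assumed vocabularies per dimension, inferred from this one sample response;
# the actual codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "company"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM JSON array into a dict keyed by comment id,
    keeping only records whose values fall inside the assumed vocabularies."""
    out = {}
    for rec in json.loads(raw):
        rid = rec.get("id")
        if rid and all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[rid] = {dim: rec[dim] for dim in ALLOWED}
    return out

# Usage with the record for the comment shown above:
raw = ('[{"id":"ytc_Ugz7x7Biu2y5iIeZsoV4AaABAg","responsibility":"unclear",'
       '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]')
coded = parse_codings(raw)
# coded["ytc_Ugz7x7Biu2y5iIeZsoV4AaABAg"]["emotion"] == "fear"
```

Keying by comment id makes it easy to join each coding back to its source comment, and dropping out-of-vocabulary records guards against the model inventing labels.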