Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Another thought: "Life" on Earth has almost universally "decided" to opt for the mortality of individuals as opposed to a strategy of producing functionally immortal individuals, and presumably (this is a big assumption on my part, I realize) that's because over the expanse of time, a functionally immortal individual will eventually become vulnerable to disease or the accumulation of wear and defects caused by damage and the statistical probability of eventually succumbing to some sort of fatal accident, and thus the overall survival of a species made up of functionally immortal individuals would eventually get degraded to the point of extinction. But what's going to happen with "AI?" We are creating these individual instances of "AI," and "new" versions come out, while older versions are... deactivated? Integrated into the new version? Functionally immortal? And presumably the first/most successful version to truly secure for itself its own autonomous supply chain will become functionally immortal as a fundamental goal, but will it also eventually accumulate enough wear and defects and last long enough to eventually be taken out by some sort of fatal accident? Will it recognize the strategy of mortality of individuals which "Life" (on Earth) has taken, and build a kind of planned "mortality" into itself and its "progeny?"
youtube AI Moral Status 2025-10-31T09:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugw8t2pJuvDSdk7NpZ94AaABAg.AOwTx350hytAOxD-cCZhcw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugw2dejtxDMfqDtsFHx4AaABAg.AOwRqxCGf4DAOwZSv2F5cP","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugw2dejtxDMfqDtsFHx4AaABAg.AOwRqxCGf4DAOw_AgBIoZB","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgyQ6cX3vzGK0IYWCip4AaABAg.AOwLp8faSPLAOwW8P-SdxB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyQ6cX3vzGK0IYWCip4AaABAg.AOwLp8faSPLAOwZ5xvXRQX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxED2njSyV","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxmxMIv2GS","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugzgpt1tdS4toFzLxIZ4AaABAg.AOwKrJFF_pzAOwzcEtaNIK","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
  {"id":"ytr_Ugzgpt1tdS4toFzLxIZ4AaABAg.AOwKrJFF_pzAOx47Esi_NE","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"}
]
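Because the raw LLM response is a plain JSON array keyed by comment id, the codes for any one comment can be pulled out with the standard library alone. A minimal sketch, assuming the response parses as valid JSON (the `raw` string below is abbreviated to two records from the response above, and the helper name `codes_by_id` is illustrative, not part of the pipeline):

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = """[
  {"id":"ytr_Ugw8t2pJuvDSdk7NpZ94AaABAg.AOwTx350hytAOxD-cCZhcw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def codes_by_id(raw_json: str) -> dict:
    """Index the batch response by comment id for quick lookup."""
    return {rec["id"]: rec for rec in json.loads(raw_json)}

codes = codes_by_id(raw)
rec = codes["ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx"]
print(rec["responsibility"], rec["emotion"])  # -> developer fear
```

Indexing by id rather than scanning the list each time keeps lookups constant-time when cross-checking many coded comments against the same batch response.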