Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Or could it be that robots have more right to exist than humans due to their superior ability at a) information processing b ) physical power / robustness BUT we humans decide to become robots, androids, whatever - but with gradual replacing of our carbon-dna-based brains with electronic/digital-based ones? Any humans that refuse are to be restricted to an area of the planet (ok, our own solar system) because they'd have inferior survivial/thriving ability vis-a-vis these human-turned-robot? Just throwing that out there - for that seems the logical conclusion to the "survival above all" assumption living things have. If you don't agree, then that implies some things are more important that "survival at all costs".
youtube AI Moral Status 2018-01-17T00:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        contractualist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyKc2VG12WJKMBHRFZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwD4tIZv8o18ezmRV14AaABAg", "responsibility": "none",      "reasoning": "contractualist",   "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgzzNBc2crh9fGhNvWN4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgyDCSRmlAlGz7A7W1p4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgzSSEV02wjwwIO_RGN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw_4kBgq2BY87aGGx14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzvCX4-v23Bij5jynl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgxqiDGZleUYBetiwYJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyC3D6oLzlalKr5RXR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzNQ9VYKtNTLRqaAzx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "resignation"}
]