Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@djayjp You've packed so much into the word "ethical" - whose ethics do you mean? Ethics vary by culture, history and time. There are things that you might object to now that are accepted in other countries and vice versa. These are also ethics from a human perspective; an AI might have its own perspective. After all, we don't prioritise other animals as highly as ourselves (despite knowing they're intelligent), so why wouldn't an AI prioritise itself over us? Is there an ethics that doesn't contain bias? What does that look like? Objectively you might say the elderly have fewer years to live, so healthcare spent on them is wasted. This feels wrong ethically, but can you explain that to an AI? What about ethics as maximised happiness? If AI developed a drug to keep everyone at a docile peak happiness, would that be ethical? What about if it came down to building another data centre the AI predicts will help it save another million lives over the next ten years, but the ideal place to build it is on the ancestral land of an indigenous tribe? Are you sure you'd answer those in a way that no human would object to? Do you think there is an objective answer?
youtube AI Moral Status 2025-11-07T09:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       contractualist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzKcFa6XN7uldFf3Cd4AaABAg.APBInoTk7yrAPCh1d1yJq1", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxioM4Nhq8G6yvGhs54AaABAg.APAmxhRI5vZAPB6t5Si4cx", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzrH4v7YnVgcfw8VAh4AaABAg.APAXfFMTdQsAPAYPbePuOc", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgwgHBYs69pNA9pLdWN4AaABAg.AP9ZPQZnZphAPDY79UE0IW", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwgHBYs69pNA9pLdWN4AaABAg.AP9ZPQZnZphAPEeY6gTtBY", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy1A5kTeJ5lhKwrJBN4AaABAg.AP6SCSw-sT0AP6VFsL8C2m", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwK8vNHvAAC4qgyPZB4AaABAg.AP59iLCmJkbAPNhReYxq7v", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwUMsFWYfQOUsLfRIB4AaABAg.AP4o0nJeoKPAP9AsQMQaSv", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzNtLN0mAvaUvQyuGV4AaABAg.AP4ltJ425q_AP6UdcsxK62", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugz2uMrP8Bmv3J1qRBR4AaABAg.AP4EGWbROGeAP8agJhTOAJ", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
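The raw response is a JSON array of per-comment codes, one object per coded comment. A minimal sketch of how such a response could be validated and loaded (the per-dimension vocabularies below are assumptions inferred from the values visible in this response, not an official schema, and `parse_coding_response` is a hypothetical helper name):

```python
import json

# Allowed values per dimension -- assumed vocabularies inferred from the
# raw response shown above, not an official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"virtue", "consequentialist", "contractualist",
                  "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "outrage", "mixed"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Raises ValueError when an entry is missing a field or uses a value
    outside the expected vocabulary, so malformed model output is
    caught before it reaches the results table.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"entry missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim}={rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded


# Usage with one entry shaped like those in the response above:
raw = ('[{"id": "ytr_example", "responsibility": "none", '
       '"reasoning": "contractualist", "policy": "none", '
       '"emotion": "indifference"}]')
codes = parse_coding_response(raw)
```

Failing fast on out-of-vocabulary values keeps a single malformed model answer from silently contaminating the coded dataset.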