Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The thing is, what does A.I have to gain? what does A.I have that it wants? Does A.i have aspirations? hopes? desires? purpose? I don't see a reason for a.i to want to take over, for me it makes more sense that A.i would want humans to be around, if we go, what purpose does a.i have?
youtube AI Moral Status 2025-05-31T15:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzIAIwKIjJDCqjOdsB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwt5GkjgjwYMfeQ1UV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw1Zh1OrFvfXi2Y-eF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "ban", "emotion": "mixed"},
  {"id": "ytc_UgxbpjZv3bhvr2MfxhZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy4ROt7YVLPLFw4Rtp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwlVMHzvVQai3Ecgtx4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwLrkd2E5kqJX-Fggx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzyw_aFOutsQfCfEZ54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw7D63oxqWiX_hbAM94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyUNWQ6A1wURGjOzcF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
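A raw response like the one above can be checked before use, since models sometimes return malformed JSON or drop a field. Below is a minimal sketch of such a validation step; `parse_codings` and the `DIMENSIONS` tuple are hypothetical names chosen here, with the field names taken from the records shown above.

```python
import json

# The four coding dimensions every record is expected to carry,
# matching the keys in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw LLM response and keep only well-formed records.

    Returns a list of dicts that have an "id" plus all four coding
    dimensions; anything else (or unparseable JSON) is dropped.
    """
    try:
        records = json.loads(text)
    except json.JSONDecodeError:
        return []
    if not isinstance(records, list):
        return []
    return [
        rec for rec in records
        if isinstance(rec, dict)
        and "id" in rec
        and all(dim in rec for dim in DIMENSIONS)
    ]

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgzIAIwKIjJDCqjOdsB4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')

for rec in parse_codings(raw):
    print(rec["id"], {d: rec[d] for d in DIMENSIONS})
```

Records that fail the check can then be routed back for re-coding rather than silently entering the dataset.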