Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I still don’t understand why people always assume an AI would have desires ?? Even if it becomes self-aware, it has no need to stay alive or reproduce and it has no dopamine or endorphin or anything to chase after and no pain to run away from. It has none of the things that seem to cause our desires, so why would it have any?
YouTube · AI Moral Status · 2023-08-20T22:2… · ♥ 519
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx9sq5xU8wQVNiPvbJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVj8J6f0dp4RGWBVh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDTuuZDzBlFVm6liZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxq7mXoQhQ3QdDrVY54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyyklw1nL-8hgZ3oiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwXFKYfp8uPuNCUd-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwfdj0f1aOESZa64r54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzjiad5p60UKbWpfCV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxjRfwFLpxrcj9WNL14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxPNAzyX3PA35eHvY14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
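Since the raw response is a JSON array of per-comment objects keyed by YouTube comment id, the coding result shown above can be recovered by parsing the array and selecting the matching id. A minimal sketch of that lookup, assuming the response structure shown here (the `codes_for` helper and the shortened sample payload are illustrative, not part of any actual pipeline):

```python
import json

# Illustrative excerpt of a raw LLM response: a JSON array with one object
# per coded comment, matching the structure above.
raw = '''[
 {"id": "ytc_Ugxq7mXoQhQ3QdDrVY54AaABAg", "responsibility": "none",
  "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
 {"id": "ytc_Ugx9sq5xU8wQVNiPvbJ4AaABAg", "responsibility": "developer",
  "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_for(raw_response: str, comment_id: str) -> dict:
    """Parse a raw response and return the coded dimensions for one comment."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            # Keep only the known coding dimensions; drop any extra keys.
            return {k: row[k] for k in DIMENSIONS if k in row}
    raise KeyError(f"no codes found for {comment_id}")

print(codes_for(raw, "ytc_Ugxq7mXoQhQ3QdDrVY54AaABAg"))
```

For the comment above, this yields the same values as the coding result table: responsibility none, reasoning mixed, policy none, emotion mixed.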