Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hell, let's go a step further, how would an ai actually be considered a slave? The reason it's wrong for humans to be slaves is because humans have many limitations and are very limited in what we can do within a 24 hour day, where we need things like food, sleep, time to unwind, etc. If we're working 24/7, we'd die, and because we're incapable of doing more than 2, maybe 3 things at a time, need time away from work just so we can have time. Robots on the other hand have no need for entertainment and are specifically programmed to do specific tasks more effectively without need for rest, food, water, or even shelter. AI can do those tasks while also doing 1000 other things at once, meaning they too aren't truly being enslaved because the ai program can do the tasks it needs and many other things it's not required to do and yet never need a break or sleep.
YouTube · AI Moral Status · 2023-01-02T11:4… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
 {"id":"ytr_UgwBelcJQkNkU8VDmwp4AaABAg.AKizI7wN9FkAKj1UZryLnX","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytr_UgzwvbdMv52wsfGCDrR4AaABAg.9SgQO4CL3kW9cehGL-gQ9f","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytr_UgzwvbdMv52wsfGCDrR4AaABAg.9SgQO4CL3kW9cepKE1HhWb","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytr_Ugx0fNPuglmAB4K2WiZ4AaABAg.9NI_DN6BJa89RnStFaiIEe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytr_Ugx0fNPuglmAB4K2WiZ4AaABAg.9NI_DN6BJa89TFfXnxmFON","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytr_UgxL67iRnZmsn98cJg14AaABAg.9BgaUGb-Gl49cXgH5MLWr6","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
 {"id":"ytr_UgwcHhN1oBEmIXh4vz94AaABAg.9B6Moarkoeq9fugCL6RapT","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytr_UgwBUyKtpakcyl4wwIl4AaABAg.9B1f6SBE-ye9Bt6OeRbHWu","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytr_UgxrNa5lwCh6-bX0VnZ4AaABAg.9B0vmq5NI0U9HVtjXNXt4Q","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytr_UgxrNa5lwCh6-bX0VnZ4AaABAg.9B0vmq5NI0U9kNqXqZwvPh","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
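The raw response is a JSON array of per-comment coding records keyed by comment `id`. A minimal sketch of how such a batch could be parsed and one comment's coding looked up — the function name and the short placeholder id are illustrative, not part of any real tool; only the record shape (an `id` plus the four coded dimensions) comes from the response above:

```python
import json

# Hypothetical example batch in the same shape as the raw LLM response shown
# above; the id "ytr_example" is a placeholder, not a real comment id.
raw_response = '''[
  {"id": "ytr_example", "responsibility": "none",
   "reasoning": "deontological", "policy": "unclear",
   "emotion": "indifference"}
]'''

def coding_for(raw, comment_id):
    """Return the coded dimensions for comment_id, or None if absent."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            # Drop the id so only the four coded dimensions remain.
            return {k: v for k, v in record.items() if k != "id"}
    return None

print(coding_for(raw_response, "ytr_example"))
# {'responsibility': 'none', 'reasoning': 'deontological', 'policy': 'unclear', 'emotion': 'indifference'}
```

Records for unknown ids return `None`, which makes it easy to spot comments the model skipped in a batch.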