Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI on the track it's on will single out the tasks that we don't want to do and devalue them to zero, and in many cases we'll just stop doing them. I think Neil's point about value is spot on. So much value we derive is because a human is performing something for us, the task itself has no inherent value. Value is a social concept
Source: YouTube · AI Moral Status · 2025-07-24T21:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugy1s6BYipIxLMj31IV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwRu0IGQC9qpucyDeN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwVY-vSZN5mW618FE14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyRFozK0p_daprxkpN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxRnxJQzziAQ_OHQr54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxvpB8dqMj9HOw-P0F4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyVtedylg92_nSqjPV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwUx_JKljYQR8m9zKx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-Aee4K22KUXtu9lB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxNcXWtJFs9XGp3TOp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
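A minimal sketch of how a raw response like the one above could be parsed and validated downstream. The `parse_codings` helper and the label sets are assumptions, not part of the tool: the sets are inferred only from values that appear in this response, and the real codebook may define additional categories.

```python
import json

# Label sets inferred from the values observed in this document
# (hypothetical; the actual codebook may allow more labels).
OBSERVED_LABELS = {
    "responsibility": {"none", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "resignation", "approval"},
}


def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array and index each coding by comment id."""
    out = {}
    for rec in json.loads(raw):
        cid = rec.pop("id")
        # Reject labels outside the expected sets rather than storing them.
        for dim, allowed in OBSERVED_LABELS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"unexpected {dim} label: {rec.get(dim)!r}")
        out[cid] = rec
    return out


raw = (
    '[{"id":"ytc_UgwRu0IGQC9qpucyDeN4AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
codings = parse_codings(raw)
```

Indexing by comment id makes it straightforward to join a coding back to the original comment text, as the "Coding Result" table above does for a single comment.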