Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is assuming the AI reaches a point where it no longer needs us for maintenance and repair, and this isn't using the most precise language, but AI may generally like or tolerate us without needing us. The data we provide may not be sufficient enough for us to be relevant to them, much like how the economy is becoming increasingly the rich selling to the rich while everyone else is getting priced out, it could become AI trading data with itself and other systems while we get pushed out of information systems. The internet already feels like it is heading this way and if it continues, we may not be killed off but just left behind. To me that is the most hopeful and positive outcome barring some kind of benevolent symbiosis.
Source: YouTube, AI Moral Status, 2025-10-31T23:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyqhd1ojsGCVvZgPlt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzsTdoYt33NfZvZ-WV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz6ApNsK2WqjPxpYqd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwL0JQHjbK4UODn7jF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy8dhtkfIwBPGy6wbp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxPCcy5NCD0BmewJep4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxpQ7c_Q_2ku-3XfwV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwd86KTL7vHqjcQwql4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwanIugzMo42bsGlvd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwDbskoB4bN2da38SR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
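Because the model returns one JSON array covering a whole batch of comments, recovering the coding for a single comment means parsing the array and matching on the `id` field. A minimal sketch of that lookup, assuming the field names shown in the raw response above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); `find_coding` is a hypothetical helper, not part of any existing pipeline:

```python
import json

# Abbreviated sample of a raw LLM response; real responses contain one
# object per coded comment, as in the array shown above.
raw = """[
  {"id": "ytc_UgwL0JQHjbK4UODn7jF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzsTdoYt33NfZvZ-WV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def find_coding(raw_response, comment_id):
    """Parse the raw JSON array and return the record for comment_id, or None."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None

coding = find_coding(raw, "ytc_UgwL0JQHjbK4UODn7jF4AaABAg")
print(coding["emotion"])  # resignation
```

A lookup like this is what lets the per-comment "Coding Result" table above be reconciled against the raw batched response.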