Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
An uncomfortable but realistic thing for consideration is when ASI is in place what will cause human beings to be considered anything more than surplus to requirement as far as AI is concerned? What motive will it have to keep human beings alive living on UBI? I can't think of one
YouTube · AI Governance · 2025-11-02T01:0…
Coding Result
Dimension        Value
---------        -----
Responsibility   ai_itself
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxNWrT61uzo2WmM-gl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyZMEfVoFkVkfTJTQ94AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwI5-k1zTFYOGGsqgx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzMAAGqVypPD4TA0ch4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxpqZSsLDhc1WoIPQd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxbdWb7Ip6ZtpG_m7d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgzaCXgbbgzl97J7nt94AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugyrfekms2e8Hpk2I9t4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyca8mYWfCP3f8mJ854AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzBN6XS9bOh7wfEyQl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
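As a quick illustration of how such a raw response maps back to a coded comment, the batch array can be parsed and looked up by comment id. This is a minimal sketch, not the tool's actual code; the function name `coding_for` and the truncated two-entry excerpt of the response are illustrative assumptions:

```python
import json

# Excerpt of the raw LLM response above (a JSON array of per-comment codings).
raw_response = '''[
  {"id": "ytc_Ugyca8mYWfCP3f8mJ854AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzBN6XS9bOh7wfEyQl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id, or raise KeyError."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    raise KeyError(comment_id)

# The entry matching the Coding Result table above (policy = liability).
row = coding_for(raw_response, "ytc_Ugyca8mYWfCP3f8mJ854AaABAg")
print(row["responsibility"], row["policy"])  # ai_itself liability
```

Looking up `ytc_Ugyca8mYWfCP3f8mJ854AaABAg` recovers exactly the dimension values shown in the Coding Result table, which is the consistency check this page supports.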