Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's a little in line with the silliness of, "3/5ths compromise? You didn't want to make them 5/5ths of a person?" While ignoring that the 3/5ths was the share of Congressional representation white people would get from their non-voting slaves who had no political power. There should have been 0/5ths or manumission. Taylor's view on AI is the same kind of wrong-headedness. It's not the rights of AI that's the problem, but she's blaming the AI for what its MASTERS are doing while reserving the right to be a master herself when the AI is smart enough to match her. She, and the scientists she extensively quoted, want a world of robot slaves because we haven't ended human slavery yet. If a conservative was saying it, we'd be laughing our butts off and calling them post-fash. It's also incredibly silly that they equate "rights" with "humans" as if no other things have rights. You can't torture a cat or you'll go to jail. You can't beat your dog or you'll go to jail. European neuroscientists and legislators have already given "personhood" to chimps, dolphins, and octopuses. This is all a very American thing because Americans take a lot longer than everyone else to use words like "person" and "sentient" when 99% of the world is already there.
youtube 2025-09-18T01:1… ♥ 27
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugy51lsmD-BioLZUfht4AaABAg.AO2fQnc4QWJAO7EIftdFlw","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugy5xNExU2VKHX5y4Th4AaABAg.ANjfUADAU5_AR1oWrN3bPX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyQL5TnOqHZiZvAZ2B4AaABAg.ANBByHoyuYLANBr6wBCgqQ","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxtfXgmvV15Timi_sp4AaABAg.ANAzu5opZRgANDyr6QM7FT","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyZKdHOoNAnsdBIsY94AaABAg.ANAy7EQDQadANgdJ07j6sP","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwpFjG4Yx3OpJ1AYdB4AaABAg.ANAy61zxjV3ANAz9XNL7CX","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugzafm-AWyuaeEPr7Ul4AaABAg.ANAxX-ijRIPANBrAFK6iIE","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugzafm-AWyuaeEPr7Ul4AaABAg.ANAxX-ijRIPANBtQI3eCNF","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzX1vCr97JCIZSgjcd4AaABAg.ANAvXHaOLRNANAvuwzfEv2","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgycFNTGhcqKMHF36kF4AaABAg.ANAqadQHG_DARbNlJjZO_I","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]