Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I was going to comment something similar, but I didn't want to detract from the conversation at the core of the video. So I'm glad to see that someone else mentioned it. :) Yes, it was the result of a logical paradox between his primary directives. Bowman and Poole didn't necessarily avoid Hal because of prejudice, but because they had real reason to suspect that he had been compromised. Sure, they didn't see him as fully on-par with humans in terms of values in the first book/film, but they were not abjectly cruel or disrespectful toward him. And by the sequel book/film, there are instances where Hal is treated with respect toward his agency. Dr. Chandra listens to him, and if I recall correctly, at the very least consults Hal on his plan to remain behind on the Discovery before its destruction. And the evolved Bowman recognizes that Hal has a sort of "soul," and integrates him into himself. It's been 15 years since I read the books, so some of the details are a little fuzzy. Anyway, great piece, Taylor! The reporting is at its core well-researched, I just have a very specific interest in the portrayal of AI and androids in 20th century science fiction haha.
youtube 2025-09-18T17:2… ♥ 12
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgypMTMj7ugYoLawuhN4AaABAg.ANAQToUJAkKANASSPO7h56","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwPTPWg7UfJJPtucwd4AaABAg.ANAPevV6eurANAZr5IAw5f","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwPTPWg7UfJJPtucwd4AaABAg.ANAPevV6eurANAa4HOXF4T","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwPTPWg7UfJJPtucwd4AaABAg.ANAPevV6eurANBoDvpG32r","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugwvs2p2UPEuZCX98fN4AaABAg.ANAPJeDmRa8ANAaGc8dGl1","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwrDwkjqBvRr528k7R4AaABAg.ANAOmCGKrMnANASKS-q01i","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwrDwkjqBvRr528k7R4AaABAg.ANAOmCGKrMnANAUshBdSyB","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzHVaBOYwodpQCXtgB4AaABAg.ANAOKU4fOwWANAVuuwkVFH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxLGHFaBlNtnK7QvF54AaABAg.ANANMILuEpoANDc81ZHbtw","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxs4s4GG7rRNC2qb4t4AaABAg.ANAM74m7xlXANAOFxhvJWw","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
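A response like the one above can be parsed and checked against the coding schema before the values are stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the allowed value sets are inferred only from the codes visible in this response (the real codebook may define more categories), and the record IDs in the sample input are hypothetical placeholders.

```python
import json

# Allowed values per dimension, inferred from the visible responses;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "outrage"},
}

# Hypothetical sample in the same shape as the raw LLM response above.
raw = """[
  {"id": "ytr_example1", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "user", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"}
]"""

def validate(records):
    """Yield (record_id, dimension, bad_value) for each out-of-schema code."""
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                yield rec.get("id"), dim, rec.get(dim)

records = json.loads(raw)
errors = list(validate(records))
print(f"{len(records)} records, {len(errors)} schema violations")
```

Validating before ingestion catches the common failure mode of LLM coders: a syntactically valid JSON array containing a value (or a missing key) that is not part of the codebook.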