Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I always default to this same cynical view. Maybe Altman had me fooled but how he portrays himself got me thinking. how would a selfless person act differently? If he is actually as afraid of the scifi AI doom as he claims, then to be the hero his best option might be to find out where to draw “the line” and position your company right there so that you soak up as much oxygen (capital) as possible with a first mover advantage. Then go do interviews 5 days a week, testifying to governments, etc to position yourself as humanity’s savior from the roko basilisk that The bad guys would create if we don’t first! He is wise not take equity in his company. In a room full of virtue signaling narcissists, he probably won a lot of people over with his shtick. If the singularity is really happening, any kind of PR that helps position him as a lightning rod for talent would be worth more than making a trillion dollars from equity in 20 years.
reddit · AI Responsibility · 1684299174.0 · ♥ 107
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       virtue
Policy          unclear
Emotion         mixed
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_jcohqyg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "rdc_jkgtj4r", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_jkigvhw", "responsibility": "company",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "outrage"},
  {"id": "rdc_jkgoiiu", "responsibility": "ai_itself", "reasoning": "virtue",           "policy": "unclear",  "emotion": "mixed"},
  {"id": "rdc_jkgr4ij", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"}
]
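Because the model returns codes for a whole batch of comments in one JSON array, recovering the codes for a single comment means parsing the array and indexing by `id`. A minimal Python sketch, using only the raw response shown above; note that matching this particular comment to id `rdc_jkgoiiu` is an inference from the agreeing dimension values in the Coding Result table, not something the export states directly:

```python
import json

# The raw LLM response, verbatim from the export above.
raw = """[
  {"id": "rdc_jcohqyg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "rdc_jkgtj4r", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_jkigvhw", "responsibility": "company",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "outrage"},
  {"id": "rdc_jkgoiiu", "responsibility": "ai_itself", "reasoning": "virtue",           "policy": "unclear",  "emotion": "mixed"},
  {"id": "rdc_jkgr4ij", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"}
]"""

# Index the batch by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Assumed id for the comment shown above (inferred from the matching
# Coding Result values: ai_itself / virtue / unclear / mixed).
row = codes["rdc_jkgoiiu"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# → ai_itself virtue unclear mixed
```

Indexing by `id` rather than by list position keeps the lookup robust if the model returns the batch in a different order than it was sent.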