Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Interesting conversation! This is Eliezer's bread and butter, and he always has (or comes up with) illuminating examples to illustrate his points. This isn't Wolfram's field, which is fine, but unfortunately he too often nitpicked in ways that showed he was getting the wrong end of the stick. For example, Eliezer described Stockfish's chess-playing as an example of how he conceives of wants, beliefs, predictions, etc. in the context of AI, whereupon Wolfram got hung up on whether the mechanisms representing those wants, beliefs, and predictions can actually be identified in the scary types of AI too; that doesn't matter for Eliezer's point, since it only matters that they're in there somewhere. David Deutsch, I think, would be a good "opponent" for Eliezer in such a debate, since he's a high-level thinker and won't get stuck at the hurdle of accepting agency in the context of AI or on claims of value relativism. He's simply an optimist in opposition to Eliezer's pessimism.
Source: YouTube · AI Governance · 2024-11-12T22:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzZjk-dccsmE4r1CbF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzN-dfsvH0_3hTj87Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyBrsbkOUjTW8bZHgt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2Qq17d-rNew-K7hJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzmT97vvYHntMl9Y5d4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxJEWyj3-VMGPf5UR14AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugw75_NQVGIiLn5jb9B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwIcUBDH-ncdjtaAw54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxkP3JTDL_ibbhpF8V4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "frustration"},
  {"id": "ytc_Ugz4NAtgI9yTWXsehN94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
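A minimal sketch of how a raw response in this shape could be parsed and validated before the labels are trusted downstream. The allowed label sets below are an assumption inferred only from the values visible in this one response; the real codebook may define more categories, and the function name `parse_coding_response` is hypothetical:

```python
import json

# Assumed label sets per coding dimension, inferred from this response alone.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "approval", "frustration"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment id.

    Raises ValueError if a record is missing a dimension or carries a
    label outside the (assumed) allowed set for that dimension.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim} label {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with the first record from the response above.
raw = ('[{"id":"ytc_UgzZjk-dccsmE4r1CbF4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
coded = parse_coding_response(raw)
```

Indexing by `id` also makes it easy to detect duplicate or missing comment ids when the response is joined back to the original comment batch.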