Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I agree with most of what you said in this video, and as largely a hobbyist, I obviously don't have any industry experience to speak on this like you do, but I disagree with your prediction. I don't disagree because it's wrong, but rather because it's way too bold for no real reason. If speedcubing, something I am much more experienced in, has taught me anything, it's to never make bold predictions about the future. It's happened A LOT of times that people predicted a world record would hold for the next 5-7 years, only for it to be broken within 12 months time and then again within another 6 months. Saying that AI will not reach a human level intelligence or develop consciousness or whatever you want to use as the benchmark of AGI in our lifetime SEEMS to be a safe bet, but we don't know it to be impossible. Certainly saying it will be longer than multiple generations of humans is a pretty wild take. I'm tempted to compare our abilities in making these types of predictions to our ability to grasp exponential growth in our heads; you can understand the concept, but that doesn't make it any easier to visualize in your mind or apply to the real world in a way that feels intuitive. I don't know if I am making sense here, but tldr is I agree with most of what you said but just not the boldness of your prediction. Either way I enjoyed the video and also really have been enjoying your content recently. I'm a relatively new viewer. I think I heard your name on a Destiny video when people were talking about PirateSoftware drama and then I checked out your channel from there. Thanks for the excellent content! - Zach
Source: YouTube · Video: AI Jobs · Posted: 2025-08-26T15:1… · 1 like
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwyPIzB7oVpIvnH7Ap4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5ssAdgSGUtwShtz54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7JfpwiqkKyatxqDx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzIvw9ho9iKcNLTix94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwArFWo_Y6Sjxa8XY14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgweI3dLOgY_nL-_f4B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxHJWLIsU4SOVUtHUt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzAueJW7sb4GI-B_up4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxAFBx9TReqzo5-EQF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxxGWzse9hjOFDdKm54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
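To cross-check a coded row against the raw model output, the JSON array can be parsed and indexed by comment id. A minimal sketch, assuming the field schema shown above; the sample array here is trimmed to one entry from the response, and the lookup id is used purely for illustration:

```python
import json

# A trimmed sample of the raw LLM response (one entry from the array above).
raw = (
    '[{"id":"ytc_UgzIvw9ho9iKcNLTix94AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"mixed"}]'
)

# Parse the response and index each coded comment by its id.
coded = {row["id"]: row for row in json.loads(raw)}

# Look up the codes assigned to a given comment.
row = coded["ytc_UgzIvw9ho9iKcNLTix94AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist mixed
```

Indexing by id makes it easy to verify that the dimension values shown in the coding result table match the entry the model actually returned for that comment.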