Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is the best cold open that was completely made for me 😂😂 I had so much fun playing along the questions with you (i also ended up going on the best deep dive on spacetime and spent hours on quanta magazine articles because of this), stelar stuff 🤩

11:37 Kary mullis *cough* iykyk 😅 on other hand, it was cute how Prof Tyson didn’t get the religion bit at first and tried to scientifically explain his logic 😂

12:56 I completely lost faith in NYT i tell you!

16:05 yeah, so one of the reasons AI is still not quite there atleast when it comes to good sci com is because it doesnt have its own agency right, like the user is still the one who has to have sth to say because the AI doesn’t have it yet. But in theory AGI will and that’s, well, interesting.

17:04 quick google on the exact meaning of “in the offing”, not in the immediate future …. mmm, i don’t know prof, the world is moving pretty fast and chaotically these days wouldn’t you agree?

22:12 um, wait wait hold up, y’all confusing mixing up robotics with LLM and ML now, this is the exact comment i had for when Sam Altman in an interview said everyone thought AI was going to replace service workers in manual labor first but what it actually came for first were creative thinking jobs … i think people were confusing robotics with what AI actually was. So, i guess AGI would actually come for service workers by giving the robots, a human-like brain?

22:26 that cackle from the live “audience” 😂😂

— Nuclear question was such a good point with a very awesome response from Prof Tyson 💯

24:55 Cue the progressive purposeful updates to kill older devices, dark patterns to keep people hooked, oh you can bet that there seems to be very few, if at all, people in positions of power who care about the actual consumer more than their investors or themselves, capitalist interests indeed

27:27 Oh my god indeed, just upload what 2-3 papers or resources on any topic into NotebookLM and generate podcast, my god, it is so unreal and spooky!

34:04 OH, we going deep here, quantum computing, let’s go!

35:59 A time lapse movie yes yes *nods vigorously with interest*

42:03 I see you plane, but wait, does that mean everyone who has no dandruff uses an anti-dandruff shampoo? What? I thought you only used it when you had dandruff, i mean *initiating tangential thought countdown timer* 😂

44:46 This right here. Everyones like oh ya, why does science need so much funding anyway in the midst of rising cost of living, good that it is being reduced, like look, we do not live in isolated systems despite what attention sustaining hyper personalised social feeds would have us believe, it is all interconnected and the field of science, of thinking, of hypothesis generation and testing is filled with critical thinkers who are actively working on things that may not have immediate “translation” but is essential and critical to making sure that our problems don’t remain problems. You cut funding to such fields, you’re just setting progress back by decades, or to be more clear, handing over your life to tech and AGI.

— Oh I have so many things to say about accuracy, precision, uncertainty, neuroscience, etc but im just gonna instead take a moment to thank Prof Tyson for this amazing discourse, it always irritated me when people would say this “evth to be discovered already has and now its just incremental knowledge” I will definitely quote this example by you the net time i hear this, this was brilliant!🤩

53:50 Elio 😆 iykyk

56:22 It’s me yes, i admit, i was one of the 70%, but not anymore. This was so well crafted. Thankyou Prof Tyson and thanks Minhaj for the interesting questions 😃🩵✨
youtube AI Moral Status 2025-07-29T16:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyS7UjmYN6rmjQPJA14AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgyFJY3igOpYXy0z1Qx4AaABAg", "responsibility": "none",    "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgypgZmGbhxqqAwylk94AaABAg", "responsibility": "none",    "reasoning": "mixed",            "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugzi-MXSkglhwXAk7xd4AaABAg", "responsibility": "none",    "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugx8WD_npdsRwwEmvIx4AaABAg", "responsibility": "none",    "reasoning": "none",             "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugz-jMyRzDhqv4WD79Z4AaABAg", "responsibility": "company", "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgzMhDLCerH036AqkGV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxJYYDlJTnde4cep794AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "regulate",  "emotion": "mixed"},
  {"id": "ytc_UgyQh5Cm05TGcMVod4V4AaABAg", "responsibility": "none",    "reasoning": "mixed",            "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgwIAkWp11Ipye44GIt4AaABAg", "responsibility": "none",    "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"}
]
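The raw response is a JSON array of per-comment codings, one object per comment id with the four dimensions from the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed back into a per-comment lookup — the function name `code_by_id` is illustrative, not part of the original pipeline, and the sample data below abbreviates the array to two entries:

```python
import json

# Abbreviated sample of the raw LLM response shown above (two of the ten
# entries), kept in the same schema: id plus four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgyS7UjmYN6rmjQPJA14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx8WD_npdsRwwEmvIx4AaABAg", "responsibility": "none",
   "reasoning": "none", "policy": "none", "emotion": "approval"}
]
"""

def code_by_id(raw: str, comment_id: str) -> dict:
    """Return the coding dimensions for one comment id, or {} if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            # Drop the id so only dimension -> value pairs remain.
            return {k: v for k, v in entry.items() if k != "id"}
    return {}

codes = code_by_id(raw_response, "ytc_Ugx8WD_npdsRwwEmvIx4AaABAg")
# codes == {"responsibility": "none", "reasoning": "none",
#           "policy": "none", "emotion": "approval"}
```

Looking up an id this way makes it easy to cross-check a displayed Coding Result against the exact values the model emitted for that comment.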