Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Okay, I have a major and very practical fear, which no one has mentioned. And whilst I haven’t listened to the entire interview yet, nor can I see this subject mentioned in the chapter headings. I’m commenting to make a specific point (a major fear), but I have to be honest, I wish we hadn’t even had the idea of getting computers to think for themselves. My friends at university in the early 80s were doing it as PhDs. And it sounded an interesting idea. (My career was in trend-spotting and branding.) So here we are now, seemingly with AI with no concept of the important human qualities of morals or conscience. I totally get the medical and scientific positives and zippy problem solving abilities. But I also see the negative potentials of that shared knowledge and understanding base. Sadly, it all just occurs to me as the modern version of humans looking for cheap labour. And we know how well historically that’s usually worked out. This time, men in suits are looking to get rich by coming up with sexy tech to sell to companies and the public to ‘make life easier’. As usual money and myopic greed. But this time, that basic human drive could get us all cooked. It’ll be bad enough when AI can turn our cars off en masse, so we can’t move, or when it starts firing rockets at nations without our instruction. Yeh, doom, gloom and cynicism, and I don’t have a solution. But bigger than those possibilities, I really do need to draw the AI experts’ attention to an area which needs to be a crucial part of a solution or at least safety valve to stop AI eliminating us at a whim. I’m almost fearful of mentioning this in case AI clocks it and finds it interesting. But if I’ve thought of it, I’m sure AI has worked this out long ago. I’ve lived in 26 cities in dozens of addresses in numerous countries, and all the people I met had one simple, vital thing in common: we’re basic meat sacks, which totally depend on water to survive. 
So, to eliminate or at least control humans, AI just needs to control the water supply. Most civilisations, societies and communities depend on vast, stored resources of dammed reservoirs for potable, safe drinking water. And they’re all computerised. So all AI needs to do is turn the taps off. I imagine AI finds water pretty useful for its own cooling needs, so even before we became a burden, irritation or rebellion against it, we’d already be competition for that precious resource anyway. Even after searching for bottled water (I’ve had to do it after a huge earthquake), and using rivers, wells and rain water capture, humans probably would have trouble lasting much beyond a week. I hope someone good with computers has thought of this.
youtube AI Governance 2025-09-06T13:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwEvVROQgICGKrI5AB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwl6vQw9-BBL9R2TVJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwSSFhP_idXulWKj5R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy57DfOQ-kfZFyghCV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzCtJZyqvEWGK3SUZl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwHFR2OtXXjjTyDbzN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw-1pNaLqhUVHk4_814AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyKBFmqN13_gGVy7h14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxaCbx-mA7hynzl25N4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzpWQKjj93gPUIvlhF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
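A raw response like the one above can be parsed into a per-comment lookup with a few lines of Python. This is a minimal sketch, not part of the coding pipeline itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) mirror the JSON shown, while the variable names and the single-record example string are illustrative.

```python
import json

# Raw LLM response: a JSON array of coded records, one object per comment.
# (Truncated here to one record for illustration.)
raw = (
    '[{"id":"ytc_UgwEvVROQgICGKrI5AB4AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"ban","emotion":"fear"}]'
)

records = json.loads(raw)

# Index the coded dimensions by comment id for quick lookup.
codes = {r["id"]: r for r in records}

target = codes["ytc_UgwEvVROQgICGKrI5AB4AaABAg"]
print(target["emotion"])  # fear
print(target["policy"])   # ban
```

In practice the response should be validated before indexing (e.g. checking that every record carries all five keys), since the model may occasionally emit malformed or incomplete JSON.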