Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This discussion is all over the place because Miotti is the wrong guest to invite. There are two separate conversations here: one concerns the impact of ANI and the other concerns the dangers of AGI. These should have been addressed in two separate episodes. KGM is showing his ignorance here, as he should've done his homework! ANI, or Artificial Narrow Intelligence, refers to the specialized algorithms currently being developed and used, such as AlphaFold, but also LLMs, including ChatGPT etc. ANI is what people are worried about when they are concerned that they will lose their jobs. This is what the discussion should be about. AGI, or Artificial General Intelligence, refers to human-level-plus intelligence. This is the stated goal of several companies, including Google DeepMind, but several prominent AI researchers, including Geoffrey Hinton, have warned against this, because as Miotti points out, it could be dangerous for humanity. However, this discussion does not belong here. This is an important concern which needs attention, so it deserves its own episode, but it's a longer-term threat than that posed by ANI, from continuing to develop AI without considering its impact on society.
youtube AI Jobs 2026-02-18T00:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx76TynBIPe16bMDad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzIRffBJsuPnG0Cvvt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxspczMD4-VG_dcE0t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy_Pe-6yltvh3oD1-54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwI-2MF16mkkeiSXt14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxngjT2uaMszD6qbl94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwrSVluFdDmmfonJJd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxms8sWsu6kvN5wORZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwkzE9idRQkFavlny54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwIeFS7tSlmQlmcdIt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
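Inspecting a single comment's codes in a raw response like the one above can be scripted. The sketch below is a hypothetical helper (the function name `codes_for` and the abbreviated sample payload are illustrative, not part of the actual pipeline), assuming the raw LLM response is a JSON array of per-comment objects keyed by `id`:

```python
import json
from typing import Optional

# Abbreviated sample of a raw LLM coding response: a JSON array where
# each object carries one comment id and its coded dimensions.
raw = '''[
  {"id": "ytc_Ugy_Pe-6yltvh3oD1-54AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"}
]'''

def codes_for(raw_response: str, comment_id: str) -> Optional[dict]:
    """Return the coding object for comment_id, or None if it is absent."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

# Look up the codes assigned to one comment.
result = codes_for(raw, "ytc_Ugy_Pe-6yltvh3oD1-54AaABAg")
print(result["emotion"])  # indifference
```

The linear scan is fine for a ten-item batch; for larger exports, building a dict keyed by `id` once would avoid repeated passes.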