Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
Most of these discussions on how AI will 'take over' never go through the step-by-step process of how this would happen. Are they saying that at some point someone will decide that they want to take orders from AI and will obey it when it orders them to annihilate their fellow humans and destroy ecosystems to create data centres? It seems more likely to me that there will be a select group who decide that they want to cull the population to ensure the smaller group has a higher and long-term sustainable standard of living, using AI to achieve this goal, which, although as primal as the impulses of tribes of fighting apes, is horrific to 21st century sensibilities. Another point worth considering is that since technology has developed we have placed human control points within the process. In many fields, the number of human control points has increased rather than decreased. There are many identifiable endeavours that would benefit from the insertion of these additional human control points, but this is simply not possible because of economics and the particular form of automation or standardisation which is the prevailing orthodoxy, perhaps this is the window of opportunity and we adopt AI as and when it appears successful, organically. All in all it seems that technology is all too often being treated as a religion which is either ascribed or not ascribed to. Empiricism needs to be applied to specific scenarios so that we are not relying on helicopter views by people who, whilst highly intelligent, are experts in a limited field, a field which does not automatically translate into the specificities of myriad other fields.
youtube AI Jobs 2026-02-18T10:2…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwhCCvdq6JMV0ogiq94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwGG4uJVF7QEeWmiUd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwYTr0BGgvhUu2D0m54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwNsU1i4npQwI2OXRd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyP1Shl0FobZD06wQB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxGUrWPsPp7VZbxFgB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyMK5wBGO2HLh2fHHJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzw-2_r86V41jdQoEx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxonJCc9XrH6VCunPB4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzKZFXFPzAbj8Y6itx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
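Because the model returns one JSON array covering a whole batch of comments, the per-comment codes above have to be recovered by indexing the array on the comment `id`. A minimal sketch of that step, assuming the response format shown here (`parse_codes` is a hypothetical helper name, not part of any real pipeline; the record shown corresponds to the coded comment above):

```python
import json

# Raw LLM response: a JSON array with one code record per comment,
# as in the "Raw LLM Response" section above (truncated to two records here).
raw_response = """[
  {"id": "ytc_UgwGG4uJVF7QEeWmiUd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwNsU1i4npQwI2OXRd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

def parse_codes(raw: str) -> dict:
    """Index code records by comment id so a single comment's
    dimensions can be looked up from the batch response."""
    return {record["id"]: record for record in json.loads(raw)}

codes = parse_codes(raw_response)
record = codes["ytc_UgwGG4uJVF7QEeWmiUd4AaABAg"]
print(record["responsibility"], record["emotion"])  # distributed mixed
```

The lookup for `ytc_UgwGG4uJVF7QEeWmiUd4AaABAg` reproduces the Coding Result table above (responsibility: distributed, emotion: mixed); a production version would also validate that each dimension value falls in the coding scheme's allowed set.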