Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is that the genie is out of the bottle and there is not just one AI, even if OpenAI currently dominates the market. Even if there should be such a memorandum, I myself have such an AI on my own computer, not nearly as powerful as GPT-4 but I am also only a small software developer with a very limited budget. And why would I, if I were a malicious character, be interested in such a memorandum? What should prevent Russia, China or Switzerland from developing their own AIs now that the technology is available? I see the dangers too, I sign every word of this open letter, where Musk is only a marginal figure by the way. My own small experiments have shown even scarier results, so much so that I destroyed the first version of my own AI. The human society of 2025 will not be the same as it is today, and whether there will still be a significant human society in 2030 or even 2050 I think is questionable, to say the least. Since 70 years authors and scientists warn about this day, similar to the climate catastrophe they were ignored and defamed as cranks. Congratulations mankind, no species is so good in putting its own existence again and again on the risk. At some point, this gamble will come to an end. This text may have been written by a human being. Maybe not...
youtube AI Governance 2023-03-30T01:4… ♥ 6
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxaCDLqoV3Blf3rHKJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy6sORpZ-8inVK8k1l4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw7_KSv6JgUTpvrMhd4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_UgxstG7pZixB4oJFGbV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxnK0khNG1q4uUQhZN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzBim0lzz1951IYIOJ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwvtUjccGFfPIV6nwZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzcE7kEUWSfQQYndD94AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwxFJhzSRmDXTdW_jl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz9bwpmPf9H9RVCMNN4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
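A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before use. The allowed values per dimension are inferred from the labels visible in this batch; the actual codebook may define others, so treat `ALLOWED` as an assumption, not the authoritative vocabulary.

```python
import json

# Allowed labels per coding dimension, inferred from the labels seen in
# this batch -- an assumption, not the project's authoritative codebook.
ALLOWED = {
    "responsibility": {"distributed", "government", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of per-comment codings)
    and keep only entries whose labels are in the allowed vocabulary."""
    entries = json.loads(raw)
    return [
        entry for entry in entries
        if all(entry.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Hypothetical single-entry response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(parse_codings(raw)))  # → 1
```

Dropping (rather than repairing) out-of-vocabulary entries keeps the pipeline simple; invalid ids can then be queued for re-coding in a follow-up request.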