Raw LLM Responses

Inspect the exact model output recorded for each coded comment.

Comment
Please, please, please for the love of god, satan, vishnu whatever you believe STOP calling it artificial intelligence. It is Machine learning algorithms. The general public conflates AI and ML so much that's why the term Artificial General Intelligence, AGI, came about recently. Same for UFO/UAP. By calling it AI you're reenforcing the idea that there's some even basic level of cognition or thinking going on. This is a pattern recognition machine, a highly advanced one but still not even approaching a child in terms of reasoning or understanding. Its relying on training models rather than creative thought. It is NOT Hal 9000 or Skynet. Not saying we should trust it with nuclear launch codes, far from it if anything knowing there's not even any reasoning behind the machine yet should make us inherently weary of it. But understand what you're afraid of. Saying it can pass these standardized tests means nothing as all previous questions and answers are online for anyone to find. Again AI/AGI inherently means some base form of sentience, or thought. Which even GPT 4 does not have. It can hallucinate and make up facts. Dont believe me? Well a lawyer recently had GPT4 generate his case and GPT cited totally made up cases. When the lawyer asked if the cases were real GPT doubled down and said "YEP TOTALLY REAL BRO" that lawyer is in a lot of trouble right now. There needs to be a higher bar for AI/AGi than a standardized test. We need a real life Voight-Kampff test
youtube AI Governance 2023-07-07T04:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwrdAhxkSZHCzHYrul4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1PHIe9gVwDVkm_OJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx7VmztVw6GnX5BDm14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzwn4GHEgBlJS3Ix5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxtjzDVy1954nSTG0h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyfnyoyJP0-aGKoclB4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyrH9JRJM1ZWy3mxvF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx07S6VM1AceMuM3CZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzWnHdfHKGROILV6Cd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwnld1lgSVe0eJNzrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
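A raw response in this shape can be validated before it is stored, so that out-of-schema labels are caught rather than silently coded. The sketch below is a minimal, hypothetical validator: the function name `parse_coded_batch` and the allowed-value sets are assumptions inferred only from the labels visible in this output, so the real codebook may contain labels not listed here.

```python
import json

# Allowed labels per coding dimension, inferred from the values observed in
# this raw response; the actual codebook may define additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only in-schema records.

    A record is kept when its id carries the ytc_ comment prefix and every
    coding dimension holds one of the allowed labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropped records could instead be queued for manual review; the filter-only behavior here is just the simplest policy to sketch.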