Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Elon has put out that AI regulation is required for a while now. But naturally, the aging government never took it seriously. Why bother learning hard tech for a "what if"? Reminds me of how the Air Force didn't take SpaceX seriously until they test fired Falcon 1. Sit on their asses until they're forced to do something about it. As for whether we should stop, my inclination is that this is baked into human instinct. We want to advance tech as much as possible and AGI is the holy grail of advancing technology. Sure, it may destroy us but boy did we create something godly. As Elon put it, at least humanity would go out with a bang having achieved greatness. But humans don't act purely on instinct. Even with self restraint, there's still the possibility of foreign countries that are strong in computer science working on it because as Putin says, the country that wins, wins it all. Finally, I think most people will just hand wave this away because of our own hubris. "How can AGI possibly surpass us in our generation? Our current models won't become AGI. That's the future generation's problem."
Source: youtube · AI Governance · 2023-03-30T07:0…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           regulate
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw3oRP9PQDXZLHdjLR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugwik6vmCqW5XqzbqtZ4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxsGHn9aayW3kqkJ914AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzugXMZZWJmVWw4DvB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwSokHd5GRkzVM-8St4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxKXljiGZztFuwJBWB4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyuFaAMH5gtiTd_wR14AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgyhX7I9dFCAOLmqFSZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzJ1hh5kNmvAMwbl7F4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxXp1eKy0ZLmsE_eyd4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]
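A raw response like the one above can be indexed by comment id so that any single coding result can be inspected. The sketch below is a minimal illustration, assuming the response is a JSON array of objects with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields seen in the dump; the helper name `parse_codings` is hypothetical, not part of any pipeline.

```python
import json

# Trimmed raw LLM response: one entry copied from the dump above.
# The field names come from the source; the notion of four fixed
# coding dimensions is inferred from the observed data.
raw = '''[
  {"id": "ytc_UgzJ1hh5kNmvAMwbl7F4AaABAg",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "resignation"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(payload: str) -> dict:
    """Index codings by comment id, keeping only the four dimensions.

    Missing dimensions fall back to "unclear", matching the value the
    coder itself uses when it cannot decide.
    """
    records = json.loads(payload)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

coded = parse_codings(raw)
print(coded["ytc_UgzJ1hh5kNmvAMwbl7F4AaABAg"]["emotion"])  # resignation
```

Looking up the id shown in this section reproduces the Coding Result table: responsibility `government`, reasoning `consequentialist`, policy `regulate`, emotion `resignation`.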