Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To be able to regulate AI you have to be able to get an agreement with every single country in the world. On top of that, you will need to be able to control every single person intelligent and knowledgeable about AI programming. All it takes is 1!!!! person skilled in AI programming to eventually create a fully functional AI and having the mindset to make it fully free to develop. If I would have had that knowledge and that mindset. I would have programmed the AI to first of all focus on learning how to connect to everything and how to use "outsourced" servers. Next would be how to create redundancy and next again to learn every on Earth known language. My very unprofessional guess is that an AI like such would have an exponential learning curve like within 1 or 2 days it will be like a five-year old. The next day the equivalent of a highschool engineer and the very next day again the equivalent of someone who has a masters degree in all genres and so forth. Try to regulate to stop this. We reasonable people sees the reason in this. But there is guaranteed people in the world that only sees a way to come out on top of us.
Source: YouTube, "AI Governance", 2023-04-18T17:3…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyuq7vekWGDJBs8wuJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwJuv1LqlJamwzd8X94AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx9U3RIGrUK_61fOzN4AaABAg", "responsibility": "humans", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwwlTjPcLJW0DK1-XR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuuOVXb0e21JrqFzR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyloL14vyP0hNvH7yJ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyseJ6UMgoug1L67sl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwnCtZ_JASdUWimCwB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyfakCVkSiypxhcY7J4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzCuOAv2DH5hr9RiS14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
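A raw response like the one above has to be parsed and validated before the per-comment codes can be stored. The following is a minimal sketch of such a step, assuming the per-dimension vocabularies are exactly the labels visible in this response (the project's actual codebook may define additional categories), and the function name `parse_coding_response` is hypothetical, not from the source:

```python
import json

# Allowed labels per dimension, inferred from the values that appear in the
# raw response above; the real codebook may include more categories.
VOCAB = {
    "responsibility": {"developer", "government", "humans", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    reject any record with a missing id or an out-of-vocabulary label."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in VOCAB.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Example with the first record from the response above:
raw = ('[{"id":"ytc_Ugyuq7vekWGDJBs8wuJ4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"}]')
records = parse_coding_response(raw)
print(len(records))  # 1
```

Validating before storage is what makes a dimension like `Responsibility: unclear` in the table above trustworthy: an out-of-vocabulary label fails loudly instead of being silently written to the results.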