Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think that it's a terrible idea. Firstly, we already have too many rules and regulations and every year hundreds more get added both at a member state level and in Brussels. Secondly, as someone who's worked in IT for tech companies all my life, I think the fear of AI is based more on Hollywood blockbusters than reality. Thirdly, the importance of AI for economies in the coming years can not be overstated. It is the next big thing. As big as personal computing was in the 80s. As big as the internet was in the 90s. As big as industrialization was in the 1800. It will automate thousands of menial tasks. It will increase productivity dramatically. It will lead to new medicines, new designs, better efficiencies and maybe in a not too distant future even eliminate the need for humans to work all together. Is it really a good idea to create obstacles in the way of innovation before we even get started? We are already falling far behind the Americans and the Asians when it comes to innovation. We can not afford to keep doing that. To me, this decision was taken because of the politicization of a misplaced fear, or worse, due to other secondary concerns (same as the push for electric cars that get charged with electricity that is 80% generated by burning fossil fuels, or the decommissioning of clean nuclear generators in favor of burning Russian gas, and many others). I'm of the opinion that we're most likely being fleeced once again, with oodles of shiny new government jobs being created in all EU member states for political cronies, increased expenses for start-up companies and decreased innovation as a result of more and more bureaucracy.
youtube AI Responsibility 2024-09-27T19:3… ♥ 3
Coding Result
Dimension        Value
Responsibility   government
Reasoning        deontological
Policy           none
Emotion          indifference

Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugyb341DNfgVodViFKd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxH2b10VYC9A5gf8Ct4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw_FykhxNm4SiiAZXt4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyCMky0sYfAjnsaytl4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwhasXwBBEbMsHpxrt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy0_hqSeXlPPss3CFV4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwbdIAAO6v-3lfrl2R4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugx-M-KsxAQQOrjikAp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzaboENEo5z9f8N_Md4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwYh_GqH1d_fscFR1h4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
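Since the raw response is a JSON array in which each object carries one comment's coding across the four dimensions, looking up a comment's coded values is a simple parse-and-match. Below is a minimal sketch of that lookup; the function name `coding_for` is hypothetical (not part of any tool shown here), and the sample array is truncated to one entry copied from the response above.

```python
import json

# Truncated sample of a raw LLM response: a JSON array of per-comment
# codings (one entry copied verbatim from the response above).
raw_response = '''
[
  {"id": "ytc_Ugw_FykhxNm4SiiAZXt4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
'''

def coding_for(raw: str, comment_id: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    for row in json.loads(raw):
        if row.get("id") == comment_id:
            # Drop the id key so only the four coded dimensions remain.
            return {k: v for k, v in row.items() if k != "id"}
    return None

print(coding_for(raw_response, "ytc_Ugw_FykhxNm4SiiAZXt4AaABAg"))
# → {'responsibility': 'government', 'reasoning': 'deontological',
#    'policy': 'none', 'emotion': 'indifference'}
```

Matching on the `id` field rather than array position guards against the model reordering or dropping entries in its response.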