Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Laszer271 I do understand your frustration, and in some ways I also often fear that Europe is being left behind by China and the US. But I also strongly believe that product safety regulation is extremely important and with a technology that allows doing everything at scales never seen before, this is even more important than for any other product. As I said in another comment: We need product safety regulations, because companies don't act in the good of the consumers without them. Motorcycle helmets wouldn't be safe without ECE. It's just cheaper to produce an unsafe product, and that's just as true for AI as it is for motorcycle helmets. If your start up ideas were really about deciding who gets access to education and employment (that's what the regulation calls high-risk), then I do very much think that they deserve to be very thoroughly tested for safety. What's your opinion on the sandbox proposal, that seems to me to directly address your concern for not being able to get a foot of the ground as a start up. It should allow you to put your product on the market provisionally, without even needing to comply with the GDPR and if it really is a viable product, there should be no issue in getting the funding to hire a lawyer or two to make sure you comply with all laws as soon as your product leaves the sandbox. To me this seems quite reasonable, but I'm happy to hear if you think differently.
YouTube · AI Governance · 2023-07-30T21:3…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        deontological
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugzhz2oN9gNF_ycVGaV4AaABAg.AQLRzL8-pDNAQLXk9LnkaC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugzhz2oN9gNF_ycVGaV4AaABAg.AQLRzL8-pDNAQMO-wsnM6f", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxBva5G-p5XIxJklNd4AaABAg.9spXMsU-XhW9spfeZHd2T8", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_UgyAb9iMvuBAUxCZWMB4AaABAg.9so0EveHom29so4J1EEtj7", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UgwHMNhRmC39-EkNEEd4AaABAg.9sna71PauBY9xs6MfhitQf", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugzz5FgIdge8DTQI0Op4AaABAg.9sn-uEPhbzz9so3ohni7jR", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugzz5FgIdge8DTQI0Op4AaABAg.9sn-uEPhbzz9spbpNO3XMl", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_UgzOfrZBmWVgAwOUG2t4AaABAg.9smjmEl59PB9snNWeHBe-a", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_UgzOfrZBmWVgAwOUG2t4AaABAg.9smjmEl59PB9snnBqQFltc", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzoA6Aq1hHggXaLvvF4AaABAg.ATPRQa_M9xjAToJPGGGEy-", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
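The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the table above. A minimal Python sketch of how such a response can be parsed and checked against a code book; note that the allowed-value sets below are assumptions inferred only from the values visible on this page, not the project's actual code book:

```python
import json

# One record from the raw response above (the one matching the coding
# result table for this comment).
raw = '''[
  {"id": "ytr_Ugzz5FgIdge8DTQI0Op4AaABAg.9sn-uEPhbzz9so3ohni7jR",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]'''

# Assumed code book: only the values that actually appear in this
# response are listed; the real scheme may include more.
ALLOWED = {
    "responsibility": {"government", "company", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside the
    allowed code book, so off-scheme LLM outputs can be flagged."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

records = json.loads(raw)
print(validate(records))  # -> [] (every coded value is in the code book)
```

A check like this is useful precisely because the response is free-form model output: a single hallucinated label (e.g. an emotion outside the scheme) would otherwise silently corrupt the coded dataset.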