Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
im currently taking an ai policy class in university, and one of the things that we are discussing pretty regularly is the effectiveness of the regulation. The problem that we have talked about a lot is that in the places where this actual real regulation (not just some random report that no one is going to read like california is doing), there is no development on the AI. (ie. the EU (and dont even try to bring up mistral it places last in everything)) To use his plane analogy, a bunch of planes took off all at once, and if any one of them crash, we are all screwed. The decision we are faced with is do we ground our own plane to build the landing gear before we take off again. Im not saying that nate soares is wrong about any of his predictions, im just worried that any chance to get real change will be ineffecitve because there will deffinitly be places in the world that wont ground their "plane", so is it really a smart game theory decission to ground ours? i have no idea -a scared 20 year old
Source: YouTube — "AI Moral Status" — 2025-10-30T21:5…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugwlf2ytab2xv8i3UI14AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzHssYICRCJiDWOUw54AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_UgyydrhSRJE9cP4zynx4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgzfDdBUi-n-XDLG-dJ4AaABAg", "responsibility": "user",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzuhaUqQ1HoO_RSw4F4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",  "emotion": "outrage"},
  {"id": "ytc_Ugx5h2D0ojYDwU-jVdl4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgxleDp9-cBbgdtRphx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxKs8NHbh-p4JQlPol4AaABAg", "responsibility": "developer",   "reasoning": "unclear",          "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_Ugw_NX6TOSnAkRz14BJ4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgzpbUfw6_xbYVbkTY94AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"}
]
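A minimal sketch of how a batch like this could be inspected, assuming the raw response is valid JSON as shown (the excerpt below reproduces two records from the array above; the lookup logic is illustrative, not part of the original tool):

```python
import json

# Excerpt of the raw LLM response above: a JSON array of coded comments,
# one object per comment, keyed by the comment id.
raw_response = """
[
  {"id": "ytc_Ugwlf2ytab2xv8i3UI14AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzHssYICRCJiDWOUw54AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "mixed"}
]
"""

# Parse the batch and index the codings by comment id, so the exact
# dimensions assigned to any one comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugwlf2ytab2xv8i3UI14AaABAg"]
print(coding["responsibility"])  # government
print(coding["policy"])          # regulate
```

Indexing by id (rather than scanning the list) makes it cheap to cross-check each displayed Coding Result against the raw model output it came from.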