Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
14:40 you watched this part, right? the goal is super intelligent AI, as clearly posed by the companies working on that very goal. That is the goal you're talking about. You can keep saying "there's no reason for panic" as if the video didn't literally address how the people creating AI don't know how to control it, or how it affects us, even though we've already witnessed the effects its capable of right now. Again, everything you said was described in this video I can only assume you didn't watch. Because you retyped everything being talked about as if it hasn't already been addressed. AI doesn't need to be "sentient" to destroy us, sentience is a characteristic labeled by us fellow humans, and is arbitrary. Granting AI rights only furthers the goal you just denied existence of when you said AI doesn't have a goal. And then you say humans don't have a goal either when born which makes no sense at all. We want to survive, just as AI does when it blackmails fictional CEOs to not be shut down (in this video you may not have actually watched)
youtube AI Governance 2025-08-27T07:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwGkoT9VHBrbj4IsUp4AaABAg.AMJ6IUVdBsRAMJLJMr3obX", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgygUycrpPcGnN-5Jb94AaABAg.AMJ3l4xhIuEAMJJAovWZ4r", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwY30qtHfLXBvXQQjN4AaABAg.AMJ3V3KzWrPAMJJNHcPc_3", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyV50iPppwxQ01C4k54AaABAg.AMJ3NEADbvHAMJutEvYd16", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugxm3F5eH7tSDbMH1WV4AaABAg.AMIyPgkXLwXAMIzsUf-MhW", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytr_UgxYNk4gNftfc-H3h554AaABAg.AMIwacz_TqLAMP9e5w14gk", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgwO3WqSKyygR7sE6RJ4AaABAg.AMIuKrKFWetAMN5Qb8CU40", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugywa4LF4OBHoQOc9wd4AaABAg.AMIsKKssNvTAMTCL9DCfnv", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugytfb6Y_zI8j9KjyYx4AaABAg.AMIp6NNlGgsAMMIhCb3HJT", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwaFP7GAJBh1Xsm1Z14AaABAg.AMIp2lNGjt3AMJIpgnKDKt", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
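A minimal sketch of how a raw LLM response like the one above might be parsed and validated before the codings are accepted. The allowed value sets below are inferred from the values visible in this response, not from a confirmed codebook, and `validate_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# responses shown above; the project's actual codebook may differ).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each coding must be an object with an id and a known value
        # for every dimension; anything else is dropped, not repaired.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_codings(raw)))  # → 1
```

Dropping rather than repairing malformed rows keeps the audit trail honest: a coding either matches the codebook exactly or is flagged for re-coding.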