Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Seems easy.
3:20 .... kill the power to the data center kills AI. Takes it off line.
10:29 yes you can have Ai do the billing, but they need to know who to bill...you can replace HR with Ai just input the company rules and Ai will decide on the out come. I can see that, look for errors in payroll time keeping. So yes even lawyers and judges could be replaced once all case law is in they can determine law.
10:50 you're saying you will automate physical jobs, building houses, collecting garbage, driving simi trucks Hilo to unload all the machines that makes all the little parts to things bolts screws... getting scrap metal to make more things. There is a lot of physical things people need to do.
15:45 so you have a self driving car. You accept that so you become part of the creating of the problem. You're too lazy to drive a car yourself? You're giving into the system people loose their jobs of drivers. Hmmm.
23:00 2030 have AI robots... who can afford them. We cant even afford a new car. Without going into debt. I'm not going into debt. For a robot to clean my house. You still havent addressed the turn off the power and that kills AI no power no AI.
23:40 bit coin is a scam virtual digital money. A push of a button i have 100k push of a button zero.... digital currency is the end for sure.
31:00 finally turned it off. Your wrong. I can kill the power to my house and now no AI will be there. Chop down a power line. Take out a data center. Once inside smash it with a rock. If you have nothing set a fire. You surly can disrupt it very easy. Who is going to rebuild it a robot that will run out of power you can fight a robot hit it with a tree limb.
youtube AI Governance 2025-09-26T12:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          industry_self
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzQ660RU-ARnkOHfZ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWMHRdaHFq6GLQOFB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyrNfq9jCeVLE70EoB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyWAmsx_33CnC8Vmjt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxEF-DA5a_FsghnQc94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxGRjHSzvRQi_3ta9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzYb-nEgysyPYKi64t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyRmKLR2X8zrlH2FnZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzmOgf5VvvAWLbf0Kp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzebUWYut-FRmNDjLZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
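The raw LLM response above is a JSON array with one coding record per comment. A minimal sketch of how such a response might be parsed and indexed by comment id (field names are taken from the array above; the array here is excerpted to two entries for brevity, and the variable names are illustrative, not part of any pipeline API):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment coding records.
raw = '''[
  {"id":"ytc_UgxGRjHSzvRQi_3ta9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzQ660RU-ARnkOHfZ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

records = json.loads(raw)

# Index the records by comment id so a single comment's coding can be looked up.
by_id = {record["id"]: record for record in records}

# The record for the comment shown above; its fields match the Coding Result table.
record = by_id["ytc_UgxGRjHSzvRQi_3ta9t4AaABAg"]
```

Looking up `record["policy"]` on this excerpt yields `"industry_self"`, matching the Policy row of the Coding Result table.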