Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI if it became a problem it would be stopped. A computer or AI will never take over 90% of jobs. They will never take over the labor jobs because there’s way too many variables. What would the price of that robot be? Millions! Think about how robots right now are not replacing 99% of jobs. The reason is they can’t and the cost. They never will! Americans wouldn’t put up with it and it would be something dealt with in a presidential election. Let’s just say 50% of jobs were taken by AI. If so you would have people losing everything and starving. These normal people would do anything to eat and even turn violent. It would be far more complicated than any military could ever handle. Not only do we not have the robots yet but not in 99% of jobs. Only in the car industry they use machines to build cars yet look at all the people that still have to be employed to build cars. I want to see AI do ALL plumbing, electrical, accounting, road work, etc yet they haven’t even begun to start taking over these jobs. AI is impressive but it’s far from human. We also wouldn’t use AI if it caused the self destruction of our life and the world we live in.
Source: YouTube · AI Governance · 2025-12-18T02:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz1rCJZ2FGf5qS0cnt4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyNxqOspJen-RoRibN4AaABAg", "responsibility": "ai_itself",  "reasoning": "deontological",    "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_UgysYh1iYhV10m_5F-Z4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgwBilyK3FlGmCLnBrN4AaABAg", "responsibility": "unclear",    "reasoning": "unclear",          "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgyL1FqaLeCg119tIHd4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgxNnrqVvgiJTp0pktl4AaABAg", "responsibility": "ai_itself",  "reasoning": "deontological",    "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_Ugytplouw8FoOJJGZul4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzOOLk6XJbCVcB-wut4AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwA25GxdlP9fsnuyKh4AaABAg", "responsibility": "user",       "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugzr8FgurZfNxLT05394AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"}
]
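The raw response is a JSON array of per-comment codings keyed by `id`. A minimal sketch of how such a response can be parsed and the coding for one comment looked up (the field names come from the response itself; treating the `id` field as the lookup key is an assumption about how the tool joins codings back to comments):

```python
import json

# A short excerpt of the raw LLM response shown above (two of the ten objects).
raw = """[
  {"id": "ytc_UgysYh1iYhV10m_5F-Z4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzOOLk6XJbCVcB-wut4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

# Index the array by comment id so each coding can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw)}

# Inspect the coding for the comment displayed on this page.
coding = codings["ytc_UgysYh1iYhV10m_5F-Z4AaABAg"]
print(coding["emotion"])    # approval
print(coding["reasoning"])  # consequentialist
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag the response for manual review rather than silently dropping it.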