Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People need to separate the whole AI and robot future thing into two separate debates. AI, sure it's gonna be the main tool we use. Robots to perform every task a human can do is whole other thing. 1. AI will replace a load of jobs, those people are going to be looking for work. 2. Robots, as impressive as they are, are a million miles away from replacing (most) people. 3. To get humanoid robotics to the level they would need to be to replace everybody's job, wife, etc. it would require an insane amount of financial outlay into research, building them and perfecting better models, the trial and error of using them in practice, legal issues etc, etc, etc, etc, Which company is gonna have such a long term perspective on getting a return? 4. You'd need robots to be so advanced that they even smell and taste like a human (as in if you were to get up close and personal with them). Otherwise most rational humans wouldn't be able to trust them enough to allow them to do things like take care of young children, cook food without poisoning them etc.
youtube AI Governance 2025-12-04T22:2… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          industry_self
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwtuPSS68n9ejX0-E94AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgxAzkoG2Gg9OZ5HTFp4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate",      "emotion": "approval"},
  {"id": "ytc_UgxuhyrR-hf1LTdS6PN4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzJDgWdlFTEtn1n6Yh4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgyOp2EoRNdiQpOCVw54AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",       "emotion": "indifference"},
  {"id": "ytc_Ugw96knYr3zb5LXFQ7h4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgwuiFxnixy2hgwgRFl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugyq31dfpC7u6Kc3WFR4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgxpWNqkA0LmCFB1sS14AaABAg", "responsibility": "company",     "reasoning": "virtue",           "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgyAqHfs1mOAb8wYV854AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"}
]
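The Coding Result table above is derived from this raw response by matching on the comment's `id`. A minimal sketch of that lookup, assuming the response is valid JSON and using a shortened two-record sample for illustration (the parsing logic is an assumption; the actual pipeline code is not shown here):

```python
import json

# Shortened sample of a raw LLM response; the second record is the one
# matching the comment shown above (id values taken from the raw dump).
raw_response = '''[
  {"id": "ytc_UgwtuPSS68n9ejX0-E94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwuiFxnixy2hgwgRFl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "resignation"}
]'''

records = json.loads(raw_response)

# Find the record coded for this comment, then drop the id to leave
# only the four coding dimensions shown in the table.
target = next(r for r in records
              if r["id"] == "ytc_UgwuiFxnixy2hgwgRFl4AaABAg")
dimensions = {k: v for k, v in target.items() if k != "id"}
print(dimensions)
```

The `next(...)` call raises `StopIteration` if the LLM omitted the comment's id, which is one reason a real pipeline would validate the response against the full batch of ids before storing any codes.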