Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI won’t directly take over most jobs. Instead, many jobs will simply stop being profitable enough for people to keep doing them. For example, in China there are still workers folding packaging by hand—even though this process was automated decades ago. Why? Because in some cases, a human worker is still cheaper than running an industrial machine. Those who lose their jobs won’t all be “jobless”—many will shift into managing and maximizing their own assets, or they’ll become AI managers in their fields: lawyers in the legal field, doctors in the medical field, construction managers in construction, and so on. There will always be a need for a human approval mechanism. A friend of mine works in rail maintenance. He is responsible for giving the final clearance on tracks. If he makes a mistake, lives are at stake. An AI could prepare the safety protocols for him, but in the long run, society and governments will always want a human being to hold responsibility. Remember this: AI is a tool. It is not legally accountable and most likely never will be treated as such by society. If a hacker launched an attack using AI, we would never put the AI on trial—even if it acted autonomously in some way. The accountability would still rest with the human.
youtube AI Governance 2025-09-06T13:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwNDq_tsMSqMvBa8Xd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxZB9QShw8WCPCFQr54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUounSPqN01Ws6ypd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyoImCtFtbFgVPTpol4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyFeimDESzg9Ui1cz54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw7t5l3WFr_zidwj_F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwhdldj_W4elTlcjq94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyEISYygVHkuVPo6wB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwkWKMDqDZ4FyTuqdh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzhjyc7wV2DTl8hRkJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
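The raw response above is a JSON array of per-comment records that must be parsed and matched back to comments by `id` to populate the coding-result table. A minimal sketch of that step, assuming Python; the allowed-value sets below are derived only from values visible in this dump, and the real codebook may contain more labels. Two records are inlined for brevity.

```python
import json

# Two records copied from the batch response above (the full dump has ten).
raw = '''[
  {"id":"ytc_UgyEISYygVHkuVPo6wB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwNDq_tsMSqMvBa8Xd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Assumed vocabulary per dimension, inferred from this dump only.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def index_codings(raw_json: str) -> dict:
    """Parse the batch response and index records by comment id,
    dropping any record with an out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw_json):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
# The record whose values match the table shown for this comment:
print(codings["ytc_UgyEISYygVHkuVPo6wB4AaABAg"]["emotion"])  # resignation
```

Validating against an allowed-value set before indexing is a cheap guard against the model emitting labels outside the codebook, which would otherwise silently corrupt downstream counts.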