Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why Hinton keeps saying "train to be a plumber"? Physical dexterity is still a moat. Current robots struggle with cramped spaces, leaking valves and one-off parts that even experienced human plumbers find tricky. Until general-purpose humanoid robots improve dramatically, jobs that mix problem-solving with fine motor skills in unpredictable real-world settings (plumbing, electrical, HVAC, auto repair, many building trades) remain relatively insulated.

AI is racing ahead in "routine" cognition, not routine manipulation. Large language models can already draft legal briefs, triage support tickets, write solid code and even spin up entire software agents. Their learning curves are steep, and cloud scale means a single model can clone itself millions of times, one reason Hinton thinks most intellectual work will become "mundane labour" done by machines.

So where are the safer (or newly valuable) zones?

- Skilled trades & field work: plumbing, electrical, solar install, precision machining, heavy equipment maintenance, physical security, on-site inspection.
- Hands-on care: nursing, elder-care, early-childhood education, physical therapy. Robots help, but trust and bedside rapport remain human-centric.
- Systems oversight & compliance: AI assurance, model-audit, safety engineering, policy drafting. These roles blend technical literacy with governance.
- High-context creativity & live performance: concept art, brand worlds, experiential design, live entertainment. LLMs can generate options; people curate taste and negotiate business constraints.
- "Full-stack" entrepreneurs: small teams that wield AI agents to launch products fast. Founder risk is real, but the leverage is unprecedented.
youtube AI Governance 2025-07-13T11:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwI8401QxQtTVy7zYd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgypXVPlfnDTfb_cRL94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxaBNGfZmBWGR4fHXp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzC-zxMMArpFOloiAZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz5lgg4rDgNbxm5uQF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyd_8xpVdVkQLYjoMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxP8m-6IR1DRdRcoe54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgysoBVVJShlqXzrSkh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwuQlzIqwyKpaMMrXJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwq2y586MupWhvbGCh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
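A raw response like the one above can be turned back into per-comment codes with a short validating parser. This is a minimal sketch, not the pipeline's actual code: `parse_codes` and the `ALLOWED` value sets are illustrative, inferred only from the labels visible in this export (the real codebook may define more values per dimension).

```python
import json

# Allowed values per coding dimension, inferred from the labels seen in this
# export; the actual codebook may permit additional values (an assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "user", "unclear"},
    "reasoning": {"mixed", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of records) into a mapping of
    comment id -> coded dimensions, rejecting any out-of-vocabulary label."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Two records taken verbatim from the raw response above.
raw = '''[
  {"id":"ytc_UgwI8401QxQtTVy7zYd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxaBNGfZmBWGR4fHXp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''
codes = parse_codes(raw)
```

Validating against a closed label set at parse time catches the common failure mode where the model invents a label outside the codebook, rather than letting it propagate into the coded results.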