Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI is exciting, and it’s also understandable that people are nervous — because we’re dealing with something that affects work, identity, meaning, and human connection. The way forward isn’t to reject AI or worship it, but to ground it in human needs. A few thoughts:

1. Philosophy, counseling, and meaning are inherently emotional and human fields. Even if AI becomes extremely capable, some people won’t want therapy from a machine. Human presence, empathy, shared lived experience — those are not “software features.” So certain domains will always need human hearts, not just human oversight.

2. That doesn’t mean we freeze progress — it means we shape it. We can push for legislation that says: Automated systems must have human oversight. Companies can’t replace entire workforces with black-box systems. Humans stay in the loop for judgment, safety, and ethics.

3. We don’t need one perfect idea — we need millions of small, good ones. No single person can “stop” the direction of technology, but if enough of us propose safeguards, values, and practical solutions, then we can guide it. Hope doesn’t come from ignoring reality; it comes from participating in shaping it.

These are just my pieces of the puzzle — if we all contribute our own, we still have a real chance to build a future where technology supports humanity rather than erasing it.
Source: youtube · AI Governance · 2025-12-09T14:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           regulate
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
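Each coding result is one record with the four dimensions above. A minimal Python sketch of that record type, assuming the value sets observed in the raw response below are representative (the actual codebook may define additional labels; all names here are illustrative):

```python
from typing import Literal, TypedDict

# Value sets as observed in this batch; the full codebook may include others.
Responsibility = Literal["none", "ai_itself", "user", "distributed"]
Reasoning = Literal["consequentialist", "deontological", "mixed", "unclear"]
Policy = Literal["regulate", "none", "unclear"]
Emotion = Literal["approval", "fear", "mixed", "resignation", "indifference"]

class CodedComment(TypedDict):
    """One coded comment, as returned by the LLM in the raw response below."""
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```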
Raw LLM Response
[ {"id":"ytc_Ugz2aG7N3OMQ3Rntjwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgztS0q8_1H6nvUNjLd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxGTEpfVBbzTPfVp_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwQrQ96usVKwd9P00p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugzi0cA4lSkk8w_dZox4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzTsObAhGsVY2Dk_IJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxpIrXnQ63fSjI-RtJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgztULkQzdrjx6GpvZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgyPTmGp6f8gH9Md3lZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugwu44HtihKP4y8xWRN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"} ]