Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I am surprised that at no point in the entire interview was the topic of humans merging with AI or tech discussed. The fear of AI taking jobs, in my opinion, makes no sense: why would not having to clean toilets, flip burgers, or carry packages from point A to point B be a scary thing? I think the fear is mostly of change, not of AI, and of a loss of dignity over jobs that already lack any dignity in themselves. Nonsense. Humans could merge with technology to enhance their intelligence at some point and be just as smart as any superintelligent AI; we will not be some confused dog that doesn't know where its owner is but is happy just to get cuddles, food, and water. AI will just be an automated tool that does boring and repetitive tasks; sure, it will also be used for research and complex things, but we will understand it entirely if we enhance our intelligence instead of staying with our current primitive brains. I am not saying there are no risks of AI taking over and wiping us all out; it is a double-edged sword, just like nuclear power, but I do think it will not be as bad as many people fear; we just have to change in a changing world. I do think that if we end up merging with tech and fully understand things just as a superintelligent AI does, we will transcend into a world where there are no longer problems to be solved, and we will just exist, most likely in virtual realities with fully wiped memories, simply to experience things, because superintelligence will make reality pretty boring.
youtube AI Governance 2025-06-17T14:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
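
Each dimension/value pair above corresponds to one field of a single record in the raw batch response shown next. A coded record can be sanity-checked against the set of values the coder is allowed to emit. The sketch below is illustrative only: the allowed-value sets list just the values that appear in the raw response on this page, and the project's full codebook may define more.

```python
# Check a coded record against the four coding dimensions.
# NOTE: these sets are an assumption reconstructed from values observed
# in the raw response below, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "user",
                       "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "approval", "indifference", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record coded for this comment, as shown in the table above.
coded = {
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}
assert validate(coded) == []
```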
Raw LLM Response
[ {"id":"ytc_Ugwbdkr3ml7a0-zZcPx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugz7gmyXtODls9kxa6R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_UgyGKEPXNgoHFnJtAwx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugzt_2OxoDqm5l5Nsll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugyrnqu1F0WC_g8UGoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzJc9wuCumKLaL6F194AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxQfAaeUy9qBeJxhvx4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgyWfKufPgBfM4km3Md4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugwl-vdW1azCqkIYXP94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxQkltu9Du8C3vYBU54AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"} ]