Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Tech companies are foolish. With GAI, even if benevolent? 70% minimum of people lose their job in a year or two. Even labor work machines will do and machines made to fix each other make mechanics irrelevant. GAI can make machines create other machines to fix machines made by machines already made. No jobs, no money, and GAI makes money worthless. Having a trillion dollars will be meaningless. Having gold/etc will be useless. The only outcome that humans are not in danger is where income no longer means anything. Creating GAI is a terrible idea in any way you put it inside of current forms of economic and government forms. The era of social media influencers is disappearing in front of us too with people preferring shorts made by AI over long form content by people. This doubles as being a problem every year. The end goal will be nearly zero interaction between real humans. There is money in AI. There is no profit in GAI where it can solve nearly every problem on Earth, or at least in first world nations.
youtube AI Moral Status 2025-11-04T19:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwYMbnDM2VGwox_aOJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxYOycNbQ4IvjMtNMV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJTMhPNMrUXQM_uaJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxrgXxoJVinsyrkMsV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwbeUYUR7QWsH2Lz4t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxG5K3a3A25sztovYJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyhsNprkIeiJxp3n3x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwLBIjK-OVx0rpwkFF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzuvROVe4G8ECnVlfd4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyP_aecgiga2Y5SZLt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
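The raw response is a JSON array of per-comment coding records; the dimension table above is just the record matching this comment's id. A minimal Python sketch of that lookup (the record shown is copied from the raw response; the parsing code itself is an illustrative assumption, not the tool's actual pipeline):

```python
import json

# Raw LLM response: a JSON array with one coding record per comment.
# Only the record for the displayed comment is reproduced here.
raw = '''[
  {"id": "ytc_UgzJTMhPNMrUXQM_uaJ4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]'''

records = json.loads(raw)

# Index records by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgzJTMhPNMrUXQM_uaJ4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coding[dimension]}")
```

The id-keyed dictionary makes it cheap to cross-check any row of the dimension table against the model's raw output.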