Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What I'm worried about is the people who own the AI will just support the .00001% of the population who are the rich and elite, and just let everyone else die. It's much easier than figuring out how to keep 10 billion people happy, fed, and entertained. People like Elon Musk keep saying wealthy income, or the equivalent to it, will be given to everyone. But I don't trust him to actually do it if he's the one who reaches SI. My other question is, who's just going to hand out AI for free? Will the US hit SI and provide a fantasy life for Americans while the rest of the world still has a labor based economy? I do disagree that silicon is faster. Look at all of the training, HUGE amounts of data that's needed to make a robot that can walk. It's taken decades. But a human baby can learn to walk. I learned to throw a baseball as a child. Our brains and nervous system are much better at learning. The only place these systems are faster is the consumption and mastery of information. But that's done at great expense. It's still much more cost effective for a human to learn.
youtube AI Governance 2026-02-13T04:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        contractualist
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyH1bGBD9nPc7rTkHR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy62OWu_wNCjN6dset4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy17WOiUWz8wvEqCnl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwyJc0j4nZM9N9XKe94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzHSZ17wp3_atqxT854AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyeG_VWwRj2vlHCTEF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzeW8Qu_ASqx1qg5f14AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzEpqLPRkn_GzWtyOt4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwqnGP9WivK7RtQXiJ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzITUVeOxzo-iiePZJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
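The coding result shown above is recovered from the raw response by matching on the comment id. A minimal sketch of that lookup, assuming the raw response is a JSON array of records each carrying an `id` plus the four coded dimensions (the field names follow the payload above; the function name `lookup_coding` is hypothetical, not part of the pipeline):

```python
import json

# Abbreviated copy of the raw LLM response above: a JSON array of
# coding records, one per comment, keyed by the comment id.
raw_response = """[
  {"id": "ytc_UgzHSZ17wp3_atqxT854AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzEpqLPRkn_GzWtyOt4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "liability", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw response and return the record for one comment id."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

# The record for the comment on this page matches the table above.
record = lookup_coding(raw_response, "ytc_UgzEpqLPRkn_GzWtyOt4AaABAg")
print(record["policy"])   # liability
print(record["emotion"])  # fear
```

Returning `None` for an unknown id (rather than raising) keeps the inspection page usable when the model omits a comment from its response.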