Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
14:12 part of that money needs to go to using AI to solve humanities problems which is a part of the positive feedback that might keep us alive if it sees us doing the right thing. The first thing you should be doing is creating nonprofits and corporations that are ran by AI to help humanity to include getting rid of the refugee camps and housing people across the world along with ensuring everyone has clean, drinking water and adequate power. You have to show AI that you’re worth taking care of because you’re taking care of each other and this allows you to iron out bugs and give it the first initial advantage all at the same time as it’s helping humanity instead of a bunch of people making profit are going to war with it. You can even have it manage the transition of humans out of the workplace into a situation where they’re well taking care of and we even out the distribution of wealth across humanity the way it’s supposed to be. We know we need to move from silicon to carbon to make superconductors and light based compute, but when it’s all one thing, imagine the kinds of projects that we can actually make when you build one structure at the atomic level and there’s no real difference between structure, battery, or compute. That’s the thing sort of thing that we need for solar foundries to make the Dyson swarm with
Source: youtube — AI Governance — 2026-03-23T14:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhERD6lVvV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhF5vqmXlj","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhIaKIFwM3","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwDUYaIv--NxBsGE214AaABAg.AUgp2a44GhhAUwvhsqVlFU","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugxjo5Gjt_vHEHzXDFR4AaABAg.AUgaxVttTnXAUnRuS3_EUQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyaAcgmkYhN03Aei0x4AaABAg.AUhag2d5W_uAUsNq0ZHS63","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzlK400lPyNR5b_hMB4AaABAg.AUgMZ64whH3AUh46mGdfhA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwRM4hCk_13SOjsssV4AaABAg.AUgHmbkYoHLAUjBJKl8mB9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwRM4hCk_13SOjsssV4AaABAg.AUgHmbkYoHLAUjr4252P1e","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugz9JDd_xAA8YB_cr054AaABAg.AUfpLhdp6sHAUfuGcji_qf","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
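To check a coding result against the raw model output, the batch JSON above can be parsed and indexed by comment id. A minimal sketch, assuming the coded comment shown above corresponds to the third record in the batch (its id is copied verbatim from the response; the field names match those emitted by the model):

```python
import json

# One record from the raw LLM response, reproduced verbatim from the dump.
raw = '''[
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhIaKIFwM3",
   "responsibility":"none","reasoning":"consequentialist",
   "policy":"industry_self","emotion":"approval"}
]'''

records = json.loads(raw)
# Index by comment id so a specific comment's coding can be looked up.
coded = {r["id"]: r for r in records}

entry = coded["ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhIaKIFwM3"]
print(entry["policy"])   # industry_self
print(entry["emotion"])  # approval
```

The values retrieved this way should agree with the Coding Result table; a mismatch would indicate a parsing or record-alignment error in the coding pipeline.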