Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
7:15 I know what we can do with super artificial intelligence, and that is get to AGI and then isolate what will be ASI and put it on a asteroid that we hollow out so that it can go start exploring for us. That’s where it would really be needed, and it could build life based on what it finds at the distant end and make sure that it integrate with that environment appropriately so that if there is any other life there it’s complementary. Like I said, I tried to do the best I could to design what I think we should be doing, but I’m only one person so the rest of you would have to look at all of this and people could be writing thesis for their masters and doctoral degree based on the things that we have to understand. First, we have to get our science done here to communicate with everything on earth the best we can because as far as I’m concerned, the entire universe is alive and everything just transforms from one state to the next. After we get our science done here and set up an international Moonbase in parallel, we can work on building a Dyson swarm and making life on other bodies, like Venus and rec increase the solar system resilience, and our overall knowledge.
youtube AI Governance 2026-03-23T14:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhERD6lVvV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhF5vqmXlj","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxpzCnzAEDI-ghOi4p4AaABAg.AUhCSwBQzpmAUhIaKIFwM3","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwDUYaIv--NxBsGE214AaABAg.AUgp2a44GhhAUwvhsqVlFU","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugxjo5Gjt_vHEHzXDFR4AaABAg.AUgaxVttTnXAUnRuS3_EUQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyaAcgmkYhN03Aei0x4AaABAg.AUhag2d5W_uAUsNq0ZHS63","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzlK400lPyNR5b_hMB4AaABAg.AUgMZ64whH3AUh46mGdfhA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwRM4hCk_13SOjsssV4AaABAg.AUgHmbkYoHLAUjBJKl8mB9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwRM4hCk_13SOjsssV4AaABAg.AUgHmbkYoHLAUjr4252P1e","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugz9JDd_xAA8YB_cr054AaABAg.AUfpLhdp6sHAUfuGcji_qf","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
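The raw response above is a JSON array of coding records, one per comment: each record pairs a comment `id` with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and validated before use, assuming only the record shape visible above (the allowed label sets below are hypothetical and would need to match the actual codebook):

```python
import json

# Hypothetical label sets inferred from the values visible in this response;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry the id plus all four coding dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        for dim, labels in ALLOWED.items():
            if rec[dim] not in labels:
                raise ValueError(f"record {rec['id']}: bad {dim}={rec[dim]!r}")
    return records

raw = '[{"id":"ytr_example","responsibility":"none",' \
      '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
records = parse_coding_response(raw)
print(records[0]["emotion"])  # → approval
```

Validating labels up front catches the common failure mode where the model invents an out-of-codebook category, so malformed responses can be flagged for re-coding instead of silently entering the dataset.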