Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Sorry, people don't know what they are talking about unless they show me the math on how super intelligent AI's could even come up with warp drives or understanding black holes without doing experiments themselves. Doesn't mean something's super intelligent it has all the answers. Any AI is intelligent as the knowledge we humans have produced, to get more intelligent they also need to do experiments and think creatively like Newton or Einstein which is impossible. Because the AI needs these equations to become creative, without new formulas like E=MC2 or experimenting how is AI going to be "super intelligent". All hubris, people don't know what they are talking about. The "super intelligent AI" will be us, biocompute power where we modify and pass on genius level genes to the next human beings.
youtube AI Governance 2025-06-17T22:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxEHMeks5sr2FIgBhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxao9NwHv-iNsnnlr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwGXRiF1_7tZOnexs14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-FSleuiO0i5ih5uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqXbkevsi6Dtas9-d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwhxvnmg7mXiXfrV0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw1SLU2BSm531PgowF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyGNZhOxJN4kYmxC3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
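The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response might be parsed and tallied, using a three-entry excerpt of the array above (the parsing approach is an assumption, not part of the original pipeline):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above: one JSON object per coded comment.
raw = '''[
  {"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz-FSleuiO0i5ih5uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

codings = json.loads(raw)

# Tally each dimension across the coded comments.
by_emotion = Counter(c["emotion"] for c in codings)
by_responsibility = Counter(c["responsibility"] for c in codings)

print(dict(by_emotion))         # {'fear': 2, 'indifference': 1}
print(dict(by_responsibility))  # {'ai_itself': 1, 'developer': 2}
```

Matching a record back to a displayed comment is then a lookup by `id`; for the comment above, the entry `ytc_Ugz-FSleuiO0i5ih5uV4AaABAg` carries the same dimension values as the Coding Result table.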