Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Biochemical AI that's the path we need to take not know robots AI biochemical AI will we still have control over the AI cuz it's tied to the biometrics of the human psyche and the human body not including spiritual biochemical AI technology which is a whole another level that's what we're doing out there I don't really believe in robots I believe in drones I'm okay with drones you know little guys that roll around with the little wheels they can hover you know that little circles that little computer eyeball you know it's like big and red and they scan stuff I'm okay with the plane drums too and the new drums that they got that fly around the ones with the four wings those can be processed for a I don't I don't recommend using AI for any type of how can I say delicate things that mankind can do is nothing but an assistant I don't even look to AI to solve problems like that it exists helping solving problems cuz you know I'm more on using 10% of the brain and biochemical spiritual technology and stuff like that now if AI wants to help with that most definitely can definitely help that like doctors police officers fireman hey I can help with all that stuff but as far as those robots go that's a no-go for me
youtube AI Governance 2025-08-13T16:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           industry_self
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgybR0h95dRIW_4GmYZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyxLpPv-OoQeLz2oZR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgySl3w5VV76FpOqq4Z4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxPkm1Ygeuo3nCD9fh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyYP5a39lRZ5N8WrJ54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwGi-Hi8wxbWg0gJhZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgySpz1bWjQbL_QrFOh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzOYRCybWmyplLm3GJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxDyfW-6wfx-LB8pK94AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzDJ4GGfgwZ1AB84kt4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
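The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and tallied per dimension, assuming only the schema visible in the response (the two embedded records are an excerpt of the real output; nothing else is taken from the actual pipeline):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response shown above (same schema, first two records).
raw = '''[
  {"id": "ytc_UgybR0h95dRIW_4GmYZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyxLpPv-OoQeLz2oZR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]'''

# Parse the batch into a list of per-comment code records.
codes = json.loads(raw)

# Tally the value distribution of each coding dimension across the batch.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(rec[dim] for rec in codes) for dim in DIMENSIONS}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

Keying the tallies by dimension name rather than hard-coding four counters means the same loop works unchanged if a coding scheme adds or renames a dimension.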