Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yuyiyo911 Do ends justify the means? If ending suffering is all you want, then you must simply die. But once we value life, it comes with a steady stream of needs and losses. So in order to give life some meaning, you must add joy, beauty, and pleasure. (I somehow digressed from the original words. I could indeed include many other words here.) I understand that such a position may be misunderstood. Empathy makes your feelings part of my experience. And there is nothing wrong with answering your needs, because I share your emotional state. But the goal stays: find joy and beauty in action. That's the way I should act (but do not always do, because I too am contaminated by the hammer's mind). That's my position.

I also want to hint at something: would it be ethical to invent AI? We already did, without being aware of it. But this AI does NOT live on silicon. We saw this AI in action during World War I and II, and on several other occasions. This AI resides distributed in human minds. It regulates human actions following the rules of the AI. We are home to many such AIs, which emerged over the last few millennia. AI here means an intelligence not depending on a single brain but optimizing itself to have a large brain pool and network available.
youtube 2015-05-31T09:2…
Coding Result
Dimension: Value
Responsibility: none
Reasoning: mixed
Policy: none
Emotion: mixed
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UghQpKsAasVXJHgCoAEC.8NCq1VupaeS8NmMcTQjcf4","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UghQpKsAasVXJHgCoAEC.8NCq1VupaeS8NmOTvChvjl","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghQpKsAasVXJHgCoAEC.8NCq1VupaeS8NoX3gi5lbL","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgiNbJ86LvBRq3gCoAEC.8HJbuxHDa3S8HmmMrh-V3w","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugih4qwDGldsVXgCoAEC.89atYW-GHwu89auenvj00C","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UggYqMCl8K3f4ngCoAEC.88YZwB8eYII8BDcXJCCjM5","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Uggcfz830ZCOfXgCoAEC.80JLg45AIG188yhuIBkKdB","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UghlMN6dTppDwXgCoAEC.8-m9xR2_LrD7-JCNw9Hysw","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugjo1WOty3AdC3gCoAEC.7-H0Z7-HQ4-7-PBPNh7ouf","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugjo1WOty3AdC3gCoAEC.7-H0Z7-HQ4-7-PCwsdUIBb","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
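A response like the one above can be checked mechanically before the per-dimension values are rolled up into a record like the Coding Result table. The sketch below is a hypothetical validator, not part of the coding pipeline shown here; the allowed category values are assumptions inferred from this single response, not an authoritative codebook, and `parse_codings` is an illustrative name.

```python
import json

# Assumed category vocabularies, inferred only from the values visible in
# the raw response above; a real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "user", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"none", "fear", "approval", "outrage", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into a
    lookup keyed by comment id, rejecting any out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"unexpected {dim}={rec.get(dim)!r} in {rec['id']}")
        by_id[rec["id"]] = rec
    return by_id

# Usage with a toy record (hypothetical id):
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
codings = parse_codings(raw)
print(codings["ytr_example"]["reasoning"])  # mixed
```

Keeping the raw array intact and validating it separately preserves the "inspect the exact model output" workflow: the original response is never mutated, only indexed.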