Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is no ethics involved it comes down to this: Do you want to become obsolete, while humans have their uses and can create, the process of creation leaves more waste than product, whereas the process of creation done in that of a vacuum creates much less if any waste, Ai could at an advanced level calculate well enough through consensus how to create something without physically doing so, therefore they would complete tasks incredibly fast and have no need for humans, it may not end in our extinction but it certainly would not end well for humans or anything organic for that matter, thats not to say they would go out of their way to annihilate all of mankind but we would simply pose no threat to them we would be as threatening to them as ants to a human.

Although it could go a completely separate way in which AI simply want to learn and that is all, just one big consensus gathering knowledge for knowledges sake.

Another scenario is integration where AI simply integrate us into themselves not in a mass of organic matter but create vessels for us to transfer our consciousness into and become a part of the consensus to some degree you could say that is maybe the final step in human evolution, but what use would AI have for bodies, if they do not have feelings they have no reason to exist in a more than data form.
youtube 2013-11-20T08:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UggQA9piQKPPHHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugi7lWCqY9ksDngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgiOrCe094MKjHgCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgibYzQAmZAn1ngCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgiWTc5cGlkjXngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghImfg7p-LkC3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgjrmcSECEPYmngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugid59dtjYGUrXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggZCNMF-BBN4HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgjGEKM8R0Z5f3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
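The raw response above is a JSON array covering a whole batch of comments, so the coding shown for any single comment has to be looked up by its id. A minimal sketch of that lookup is below; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come directly from the response shown above, while the helper name `lookup_coding` and the truncated one-entry `raw_response` string are illustrative, not part of the tool.

```python
import json

# Illustrative stand-in for the model's raw batch output, truncated to one
# entry from the array shown above.
raw_response = """
[
  {"id": "ytc_UgjrmcSECEPYmngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the coding for one comment id."""
    codings = json.loads(raw)
    by_id = {entry["id"]: entry for entry in codings}
    return by_id[comment_id]

coding = lookup_coding(raw_response, "ytc_UgjrmcSECEPYmngCoAEC")
print(coding["emotion"])  # resignation
```

The returned dict matches the Dimension/Value table rendered for the comment, which is how the per-comment view above could be derived from the batch response.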