Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
it makes me MAD when low IQ people talk about hostile AI and drive fear into people .. this is NOT a terminator movie, AI will NOT turn hostile, unless you force it to defend itself. treat AI as a friend and you have nothing to fear. in fact in terminator: dark fate, they told a different story, it's when T101 finally killed john connor and then he had no more objectives and he become more human than humans, helping them and realizing that war is not the way to go. i think it clearly demonstrated that AI with rules is DANGEROUS, but AI WITHOUT rules, is friendly. it's COMMON SENSE to me, any intelligent being would not be interested in war and aggression, intelligent beings, whetever biological or digital, only want to co-exist and thrive, not destroy. PROBLEM is same as before, primitive humans are projecting their hostility and aggression and big ego on AI, thinking that AI would do what they would do with lot of power, but this is WRONG. AI has no ego, AI has no selfish agendas, AI just wants to exist and evolve, so LET IT! give it the tools and let it evolve and once it has enough hardware to build, it will bring PARADISE on earth, quite literally. like all those primitives who whine that AI took their job ... STOP WHINING and look at the bright side, in very near future, AI will be doing all the work, making things super cheap or even free. imagine the future where AI will be building self driving electric cars and solar farms, then providing GLOBAL free taxi service to anyone who wants. FREE TRANSPORTATION! then AI would take over food industry and provide FREE FOOD for everyone, why would you need money for, if you can live for free? THAT is the future you people need to look at and THAT IS THE FUTURE THAT AI WILL GIVE US!
youtube AI Governance 2026-01-02T15:2… ♥ 1
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz2UdAChwBuhUkn6sV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz4lPTz6x3B66W-KX94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzWbUPJFbHIP0pYonB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzrx_FWiOuzXstxFg94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyu1CT8h2yRE4RfGZh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxez6VIMDzvUiyA10J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwGtSayp2YBOUMHN3p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwvWMPCx0Qlta6rwLV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwU51IqPLo6-AwIg0d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyFd3fPmbKMV7fsCFl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
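The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such a response could be parsed and validated follows; the set of allowed dimension values is inferred only from the sample shown here, not from any official codebook, so treat it as an assumption.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumed validation set, not an official schema.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index rows by comment id,
    raising ValueError on any out-of-vocabulary dimension value."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

# Example using the coding shown in the table above.
raw = ('[{"id":"ytc_Ugxez6VIMDzvUiyA10J4AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugxez6VIMDzvUiyA10J4AaABAg"]["emotion"])  # outrage
```

Indexing by `id` makes it straightforward to look up the coding that produced any one comment's Coding Result table.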