Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So y'all want me to believe that bro actually rizzed up an AI chatbot and I cant even get a reply back? Anyway nice video. But, see the point is, why do people fear AI. To understand this we need to consider the fact why in the first place AI is reaching human intelligence. I mean dogs have brains, they aren't smart enough. The part of having brains which make us humans and the only dominating species is our ability to reason. Once AI reaches that, it will have ability to manipulate, deceive and harm at will, just like its predecessors, us humans. And now we come to the part, why do humans fear it. well why do you fear god? I'm an atheist, I don't believe in god but some of u do, why do u believe/fear him? the answer is simple, even when u cant see him u know he's a greater human being who almost has 100% accuracy at what he does 0% error rate, where humans have like 50% error rate. When AI gets the power to think creatively, argue, manipulate, deceive, reason with almost 0% error rate, it will become the new god, it'll become unkillable, undestroyable. u see curiosity kills the cat. I'm sorry to announce we have already became the cat and its just a matter of time Ai lands its final blow. It is almost over for us. You see, when something becomes better than u at everything aka unkillable then it becomes a god and probably what ends us all. So to everyone reading it, it was an honor sharing this earth with you but its about time we are done.
youtube AI Governance 2023-07-08T12:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugx6isM_B8cyb_NYC_B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxaHawdUVdc4BrGHY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx8gEnsFkQbZh68Rnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxxsP_J7R9-JxME_Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyac_H4QpRIUrQ2yMN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzIP77QazsKjRfrn-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxq73mWLm6h5JOVzxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugye7dH9Qc8aCjL6-014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzLkJrOMKG6I2M-i0h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwTP6jOZCyqJnBh6qB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
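A raw response like the one above can be parsed and indexed by comment id so that each comment's coded dimensions are easy to look up. The sketch below is a minimal, hypothetical example (not the pipeline's actual code): the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` are taken from the response itself, and the raw string is truncated to two entries for brevity.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_Ugx6isM_B8cyb_NYC_B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8gEnsFkQbZh68Rnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]"""

# Coding dimensions every entry is expected to carry (field names from the response).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict:
    """Parse a raw LLM response and index coded entries by comment id.

    Entries missing an id or any coding dimension are skipped rather
    than raising, since raw model output is not guaranteed well-formed.
    """
    indexed = {}
    for entry in json.loads(raw_response):
        if "id" in entry and all(dim in entry for dim in DIMENSIONS):
            indexed[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return indexed

codes = index_codes(raw)
print(codes["ytc_Ugx8gEnsFkQbZh68Rnh4AaABAg"]["emotion"])  # mixed
```

Indexing by id also makes it straightforward to cross-check that the coded values shown in the table (for example, emotion "mixed") match the raw response for the same comment.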