Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI should be used as a tool for humans and the working class. I don't want to speak to a robot when I'm buying my groceries. I do not want a robot to resolve a dispute I have at a store where I purchased an item. Robots are extremely expensive and I trust humans more than a robot. I don't want a shot at the Dr's office given to me by a robot. I feel like most of Trump's cabinet heads sound and act like robots, and I wonder what type of chip has been inserted into them actually. Look at Kash Patel's eyes and level of intelligence. Look at Pam Bondi's face and how she never answers any questions, she just repeats information like a recording. When listening to Peter Thiel, looking into his very pale face and how he answers questions is really difficult to follow, is he fully transhuman? I prefer to be truly all human, I want to keep my divinty and the ability to create and feel through my heart. Thiel and Musk and the rest of the technical cyber freaks can have a home full of robots, I don't care, but I prefer pure human. Thank you, Senator Bernie Sanders. I don't trust AI to rule, it's meant to assist us. We aren't all brain damaged or have disorders like some people who created AI.
youtube AI Jobs 2025-10-08T02:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx-t8VzGUJpCS8mpzJ4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxawOTYMJqmrP4HgzJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx9faX3B7dR76VbEQl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzBhlLKDuQbJckihCp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwDIV1Zugkcut5n3-x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzEIt3RQIArnMYxJEV4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzN1x0yLtRKa_qTUHZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgzPcnPCaF77PHLJYml4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzEISPUVjztXuVwH3N4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwWMrWMqXSFwihbw1l4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
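Batched responses like the one above are easy to mis-parse if the model emits an unexpected label. Below is a minimal Python sketch that parses such a response and checks every record against the four coding dimensions. The allowed values are assumed from those observed in this record only; the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension. These sets are an assumption,
# inferred from the labels visible in this one response; extend them
# to match the actual codebook before relying on this check.
SCHEMA = {
    "responsibility": {"company", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify each record fits SCHEMA.

    Raises ValueError on the first record whose label falls outside
    the allowed vocabulary, so bad codings never enter the dataset.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Example with a well-formed and a malformed record (hypothetical ids).
good = '[{"id":"ytc_example","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}]'
print(len(validate_response(good)))  # 1
```

A check like this runs in a fraction of a millisecond per batch, so it can sit directly in the ingestion path between the LLM call and the database write.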