Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Most of what you’re saying is not about George Orwell at all. It’s not in that universe. We’re talking about leading to an area where warrants are needed then I can understand that. But the rest is just trying to put in something that doesn’t match. Of course the cops don’t have a duty to protect us. First of all that be very difficult. But also that’s just not what it’s about. If the voters want to push for a state to make cops have a duty protect us, then they can do that. But the constitution did not envision that. Cops were a few and far between when it was written. They were there to enforce the law not to be personal bodyguards. So of course the spring court ruled that way and that was the correct ruling. And as far as AI evidence, that’s also just thrown into the George Orwell example to a degree I completely disagree with. There’s really no difference in that versus a witness that has lied on the witness stand. That happens since the beginning of time. We just have to be careful and not over exaggerating things. It’s easy to do. It’s easy to do when people have all stripes do that. And I know I’m probably the dead by voicing my opinion and not just either agreeing or keeping my opinion to myself.
youtube 2026-01-28T04:3…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        deontological
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgzdA5G-V4OnYzSe4Ed4AaABAg.ASI4NJCjTgDASIFqv4CGFh","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugw6Ixpwt7Qae_rL63B4AaABAg.ASHtgcbjcxZASN2AamC3D_","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugw6Ixpwt7Qae_rL63B4AaABAg.ASHtgcbjcxZASNcZplZpYN","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytr_Ugw9wWVPkxK6gWjkMi14AaABAg.ASHqnTc5Rg1AVD7cWPGFOp","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgzgqPq5P2mL4kLJp_x4AaABAg.ASHqFjNNUE3ASW7rbEwA7F","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxBuCyfJj8KMpsmVrx4AaABAg.ASHiU5vw7dRASHvx0rcU28","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwK7LTIFTgzsRPkDcV4AaABAg.ASHcJyxyO4jASHre7xQVBp","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_UgwK7LTIFTgzsRPkDcV4AaABAg.ASHcJyxyO4jASHv6ncbKqe","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgyJVzxvtoQrTTW9Bep4AaABAg.ASH_280U4f1ASPwiGY8bjj","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgxupXUwqJOZ0-2c8y14AaABAg.ASHBk6C91KzASHtPOEjzFg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
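To inspect the coding for one specific comment, the raw response can be parsed and filtered by id. A minimal sketch in Python: the `coding_for` helper is hypothetical (not part of any tool shown here), and the excerpted entry below is the one matching the Coding Result table above.

```python
import json

# Excerpt of the raw LLM response above: a JSON array of per-comment codings.
raw = '''[
  {"id": "ytr_UgzgqPq5P2mL4kLJp_x4AaABAg.ASHqFjNNUE3ASW7rbEwA7F",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "none", "emotion": "resignation"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding dict for a single comment id, or None if absent."""
    for item in json.loads(raw_response):
        if item["id"] == comment_id:
            return item
    return None

result = coding_for(raw, "ytr_UgzgqPq5P2mL4kLJp_x4AaABAg.ASHqFjNNUE3ASW7rbEwA7F")
print(result["emotion"])  # resignation
```

The dimension values returned for this id (government / deontological / none / resignation) match the Coding Result table for the displayed comment.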