Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My assistant was ***perfect*** for the last decade. Now when I say "Get me directions to the nearest [store]" or "[store] hours", it does the Siri thing and says "Here's what I found on the web" or "I don't understand". Been happening for the last couple of months. The last time it happened before that was when they shoved Gemini down everyone's throats, which *also* couldn't do half the stuff the Assistant could.
Source: reddit · Dataset: Viral AI Reaction · Timestamp: 1776946906.0 · ♥ 25
Coding Result
| Dimension      | Value                       |
| -------------- | --------------------------- |
| Responsibility | company                     |
| Reasoning      | consequentialist            |
| Policy         | none                        |
| Emotion        | unclear                     |
| Coded at       | 2026-04-25T08:33:43.502452  |
Raw LLM Response
```json
[
  {"id": "rdc_ohsx6ds", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "rdc_ohszohe", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "rdc_oht33ck", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",          "emotion": "frustration"},
  {"id": "rdc_ohvdcgs", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",           "emotion": "outrage"},
  {"id": "rdc_oht1t4q", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"}
]
```
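Because the model codes comments in batches, each record in the raw response must be matched back to its comment by `id`. A minimal sketch of that step (the variable names and the dict-indexing approach are illustrative, not the tool's actual implementation; it assumes the response is valid JSON with one object per comment):

```python
import json

# Raw model output for the batch, exactly as returned (copied from above).
raw = (
    '[ {"id":"rdc_ohsx6ds","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"},'
    ' {"id":"rdc_ohszohe","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    ' {"id":"rdc_oht33ck","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"frustration"},'
    ' {"id":"rdc_ohvdcgs","responsibility":"company","reasoning":"deontological",'
    '"policy":"ban","emotion":"outrage"},'
    ' {"id":"rdc_oht1t4q","responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"industry_self","emotion":"fear"} ]'
)

# Parse the batch and index each coded record by its comment id,
# so a single comment's codes can be looked up for display.
records = json.loads(raw)
codes_by_id = {rec["id"]: rec for rec in records}

print(len(records))                          # number of comments coded in this batch
print(codes_by_id["rdc_oht33ck"]["emotion"]) # codes for one specific comment
```

Note that a record's raw codes need not match the final table exactly: here `rdc_oht33ck` was coded `frustration` by the model, while the result above shows `unclear`, so some post-processing or validation presumably sits between the raw response and the stored result.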