Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Worth a shot but I would think humans have to have a solution in the literature available to chatgpt before chatgpt can offer a solution. I loosely follow tinnitus info and as far as I know there is no solution unless something has come up in the last few years. Sometimes it is caused by damage to nerves and we dont fix that yet. Never forget it will provide an answer when no information is available and even provide fake citations to support its false idea. So check it out. Here is a reputable source to compare to https://www.hopkinsmedicine.org/health/conditions-and-diseases/tinnitus Prompt formation could be critical "what are current recommendations" vs "extrapolate from current knowledge to predict possible fixes and the possible fix must be something available now." ...??
reddit AI Harm Incident 1744868428.0 ♥ 24
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"rdc_mnje45c","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"rdc_mnkb958","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"rdc_mnny38o","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"rdc_mnjf7fe","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"rdc_mnk4k5l","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
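The raw response is a JSON array of per-comment codes, each with an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch for parsing such a response and tallying one dimension; the field names come from the response above, while the variable names are illustrative only:

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of coded comments. The two sample
# records here are taken verbatim from the response shown above.
raw = (
    '[{"id":"rdc_mnje45c","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_mnkb958","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"resignation"}]'
)

records = json.loads(raw)

# Tally the emotion dimension across all coded comments.
emotion_counts = Counter(r["emotion"] for r in records)
print(emotion_counts)
```

Note that a malformed response (for example, a stray `)` where the closing `]` belongs) makes `json.loads` raise `json.JSONDecodeError`, which is one reason to inspect the exact model output before trusting the coded values.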