Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@andrewdunbar828 LLM's don't "think". They pattern match. It's just guessing which word is most likely to come next with some constraints. That's why they can't reason anything that hasn't already been reasoned in the training data. They are incapable of coming to new solid conclusions. Occasionally a hallucination might by roll of the dice turn out to be correct by accident. Identifying a novel variation of an established pattern isn't "thinking" either. You could write an algorithm to achieve that with zero intelligence. Remember how bad they were at maths? Now they use formal mathematical notation and got better? It's because they weren't actually doing the maths. They were just guessing what the maths might look like. Now they often use a code interpreter or calculator API to get the maths right. LLM's are great at quickly surfacing popular sentiment from the internet. They cannot determine if that sentiment is correct unless someone on the internet has already done that. That's why I'm not in the least bit scared about it totally replacing humans. It can only replace procedural algorithmic work. That frees up humanity as a resource to do things only humans can do... Like look after the aging population as global population growth stabilizes.
youtube AI Moral Status 2025-11-01T09:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugwq9ZICKsOqZQpO8oR4AaABAg.AOwhLbNpFe0AOwvkCq62nX", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugwq9ZICKsOqZQpO8oR4AaABAg.AOwhLbNpFe0AOx1qy4undB", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxUWWuaOQB0vWDcKl94AaABAg.AOwewyUsFLPAP3J_gGcjVz", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytr_Ugx7Xtp-LPIt3Su6e8h4AaABAg.AOwcKZM_NvNAOyAvz-p7UO", "responsibility": "none", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_UgwJGNY8IfdyrSHiD6N4AaABAg.AOwZ_bdlSPwAOym7lCdcaz", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwJGNY8IfdyrSHiD6N4AaABAg.AOwZ_bdlSPwAOz4qyhzb-V", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwkX1VF-lpsFItOLDt4AaABAg.AOwWAr4UJH3AOxH14xl8nj", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwkX1VF-lpsFItOLDt4AaABAg.AOwWAr4UJH3APRepLcMq2R", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgxMwUrLPPKGZc7N7gZ4AaABAg.AOwVkZm-YzoAOwXgNrwO3A", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxYTqk0c1AMEO-Cn0R4AaABAg.AOwVSD6zj7wAOwp0j6oGpx", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
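To inspect the coding for a single comment, the raw response array can be parsed and indexed by comment id. This is a minimal sketch, not part of the tool itself; the abbreviated JSON string stands in for the full array above, and only the first record is reproduced.

```python
import json

# Abbreviated stand-in for the raw LLM response array shown above;
# only the first record is included here for illustration.
raw = '''[
  {"id": "ytr_Ugwq9ZICKsOqZQpO8oR4AaABAg.AOwhLbNpFe0AOwvkCq62nX",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]'''

# Index the coded records by comment id for fast lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding dimensions for one comment.
row = codings["ytr_Ugwq9ZICKsOqZQpO8oR4AaABAg.AOwhLbNpFe0AOwvkCq62nX"]
print(row["responsibility"], row["emotion"])  # none indifference
```

The id-keyed dictionary makes it easy to cross-check the rendered "Coding Result" table against the raw model output for any given comment.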