Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've been a firmware engineer for 18 years. The day I knew that these AI tools were "real" was when I basically "pair programmed" with it on a very difficult problem in an industrial niche that was almost certainly NOT in its training. It was the MOST EFFECTIVE partner I have EVER worked with, it asked relevant questions, proposed insightful test cases, and drew meaningful inferences from the result of those tests. Ultimately I worked with it as a partner to arrive at a workable solution after many different experiments... it took several hours. I can't explain exactly what I'm talking about because for one thing it would take too much time and for another thing it is somewhat "trade secret" territory (and no I'm not worried about that getting out from my AI interaction, I believe I was vague enough to not make it "searchable" to anyone who might be looking for it). If I were still a grad student this project would be dissertation material... I could write a killer paper about this if it were publicly funded. In my experience from my entire career I don't think there is a single person on the PLANET that would have been a more effective partner when working on this task. (and no, this wasn't primarily about programming... programming was the medium, the problem was in the field of optics)
youtube AI Moral Status 2025-10-31T06:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugyu6z4Pp0svDkQdioV4AaABAg.AOvWlkghdIeAOwHPKKoVXh","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzuZRURQSeeS-QzHsR4AaABAg.AOvWYTnzRcKAOwAgj_NNJj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgxT7RhFToA3B5KS5el4AaABAg.AOvVT1lAWuUAOvX28fpa8B","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxm7-V2cw080X9sQZx4AaABAg.AOvVHzWnHuTAOwJ3QLWO2U","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOvlKIM-c07","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOwEBayQVOR","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOwFlVGWy1q","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOw5yAMjmCo","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOw9OWiySM3","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOwB90dnOe1","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
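A minimal sketch of how a raw batch response like the one above can be parsed back into per-comment dimension records, assuming the model returns a JSON array with exactly these keys (the id and values here are a hypothetical shortened sample, not taken from the real data):

```python
import json

# Hypothetical shortened sample of a raw batch coding response.
raw = '''[
  {"id": "ytr_abc", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "approval"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Index each record by comment id, keeping only the coded dimensions.
records = json.loads(raw)
coded = {r["id"]: {dim: r[dim] for dim in DIMENSIONS} for r in records}

print(coded["ytr_abc"]["emotion"])  # -> approval
```

Indexing by id makes it straightforward to look up the coding for any single comment, as the "Coding Result" table above does for one comment out of the batch.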