Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
@schmetterling4477 I don't necessarily think it's about where Calculus 2 in specific will get me, it was just an example for the purposes of providing a demonstration of a beneficial use case. I mean, I'll need to learn Calculus 3 as well later, too; but, I'll be taking Calc 2 and Calc 3 during graduate school. I don't necessarily know how saying Calculus 2 won't get me anywhere in mathematics really applies. The good faith is that I'm describing a use case for AI to get some practice with problems and learn where I go wrong in solving them, as a way to get a better handle on and understand with Calculus 2, such as a substitution by parts problem. I also believe, in regards to your explanation of Goedel's incompleteness theorem, if you can study and write down what the proof is, what it's about, and the major strategy of the proof, and actually work to developing an understanding of it, I don't necessarily see a problem in using, say, Google or generative AI in a case like that. but yes, the point I am making is that I referenced Calculus 2 as an example; I'm working on learning it early, I can't just take a Calc 2 course in my undergraduate program, I'd have to pay entirely out of pocket. I'll certainly need Calc 2 and Calc 3 later, when I'm in grad school; but when I'm in grad school I'll be taking those courses directly. Even so, getting a head start and developing familiarity isn't bad.
youtube 2025-08-06T17:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzN8bMfi4L7U1fTiLp4AaABAg.AL_SOhOg0lTALcguF8vvhV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxulZAN6Tzsw__itqx4AaABAg.ALXIB19S8YMAMC_hhWrJOX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxulZAN6Tzsw__itqx4AaABAg.ALXIB19S8YMAMC_yPZ40LH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxjKDXdRH_zbYp4sNh4AaABAg.ALQb84iV5MXALRWYe4uE_y","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzfIeLSVCRu2B4nzvl4AaABAg.ALQa0kXwV6hALRWg-oe8QY","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyWK8j2-kmJXDiCcnp4AaABAg.ALQ_6i9pccGALRX9ZqV5g7","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyWK8j2-kmJXDiCcnp4AaABAg.ALQ_6i9pccGALUsJrvoOgs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyWK8j2-kmJXDiCcnp4AaABAg.ALQ_6i9pccGALUtip35lqT","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugwu_JVuDjo3cASK8g14AaABAg.ALNJ_FdCeD6ALNqJNQ35by","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyrX-ejBFa1_tqh0kV4AaABAg.ALN2uq2wMOOALNrxRV0wtR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
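A raw response like the one above can be checked programmatically before it is accepted as a coding result. The following is a minimal sketch, assuming the response is a JSON array of objects that each carry an `id` plus the four coding dimensions; the helper name `parse_codings` and the short ids in the sample data are hypothetical.

```python
import json

def parse_codings(raw: str) -> dict:
    """Map each comment id to its coded dimensions.

    Assumes `raw` is a JSON array of objects, each with an "id" key,
    as in the raw LLM response shown above.
    """
    return {entry["id"]: entry for entry in json.loads(raw)}

# Hypothetical sample data in the same shape as the dump above.
raw_response = '''[
  {"id":"ytr_a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_b","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

codings = parse_codings(raw_response)
print(codings["ytr_b"]["emotion"])  # fear
```

Indexing by id makes it easy to look up the single entry that produced the Coding Result table for a given comment, or to flag ids in the response that do not match any known comment.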