Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
everyone in the comments here are cooked, and the big thing here is that these kids admit to it. Its fine if universities don't WANT you to use LLMs... they still will have to fight all the lawsuits for calling it plagiarism though. Yes the detectors "work" on your papers when you decide to self snitch on yourself, but that is because you can't even comprehend how cooked you are. You react to the detectors, but they are just clutching at a difference ( long sentences, CERTAIN sophisticated words like "delve", em dashes and sentences that share the same structure consistently) than what their baseline dataset showed. Remember that there were those before you, those students who had to hire ghost writers (less debatable as cheating, but only because they are human)...... These ghost writers wrote their papers at a high reading level, and stuck to formulaic English structure. Professors used to read them, YES they USED to read them, and it would be obvious the work was not the students or multiple people would receive the same essay, or even the same author. In order to avoid false accusations many honest students began avoiding these habits like the plague. This is why you can "humanize" your paper written by an LLM and trick the detectors, it proves that the professor doesn't give a shit about you or your education. They literally will not read your paper, and neither will a TA.
Source: youtube · 2025-02-11T14:0… · ♥ 1
Coding Result
Dimension       Value
Responsibility  user
Reasoning       mixed
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxiS9y3fHwDUYujtwB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwFb9gx6ujWt14Qc654AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwamM92XS74YwKtL0J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzlcplw-2dUohZscIl4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgysGx9n0zhp1CAq-e54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwjhaB039LCmOdI3nV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw2bBxk5K49bkdHX-N4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyquotIukIRYivSGO94AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxpJZCwszMylmvWlA94AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzcqeJZgmQdHbnmTO14AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
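The raw response above is a JSON array with one record per comment id, each carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and indexed by id is shown below; the field names are taken from the response itself, but the function name and the truncated two-record sample are illustrative only, not part of the actual pipeline.

```python
import json

# Two records copied from the raw response above (the full response has ten);
# the array shape and field names are exactly as the model returned them.
raw = (
    '[{"id":"ytc_UgxiS9y3fHwDUYujtwB4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"},'
    '{"id":"ytc_UgzcqeJZgmQdHbnmTO14AaABAg","responsibility":"user",'
    '"reasoning":"mixed","policy":"liability","emotion":"outrage"}]'
)

# The four coding dimensions plus the comment id, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index its records by comment id,
    dropping any record that is missing an expected field."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if EXPECTED_KEYS.issubset(r)}

codings = index_codings(raw)
# The second record matches the coding result displayed above:
# responsibility=user, reasoning=mixed, policy=liability, emotion=outrage.
print(codings["ytc_UgzcqeJZgmQdHbnmTO14AaABAg"]["emotion"])  # outrage
```

Indexing by id makes it cheap to join the model's batch output back to individual comments, and the key check guards against records where the model omitted a dimension.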