Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Suppose you find a web site that will write stuff for you, so you don't need to do it yourself; in other words, it will not be your own work, and you hand in an essay generated for you by the web site. Suppose a colleague gets a friend to do the writing for him, and gets reprimanded, as he has cheated. Now the examiner asks you point blank: did you yourself cheat? You tell him you used a web site. So he asks whether the web site used AI or a human to produce the essay. How would you even know? It may even have been a hybrid effort, where the web site used a human who could rely on assistance from AI. The plain fact is, you did not do the work yourself; you went partying that night when you were supposed to be writing the essay, and what you handed in is not your own work. THAT is what should matter. To decide that your essay was ok if written by an assistive AI would be ridiculous. YOU did NOT write it. The web site might even REFUSE to disclose how they write essays for people, and that refusal could be in their T&Cs. There is a massive difference between writing an essay yourself and using an AI (or human) to do the work for you. The difference is that the whole POINT of writing the essay is to PROVE that you have ASSIMILATED and COMPREHEND the source material of your course. Otherwise, you become the equivalent of one of those companies that bought a respected name but outsourced manufacturing to a low-cost country and became hollowed out as a 'box shifter'. And what happens to such companies, and by analogy will happen to you, is that eventually someone figures out the con and says 'why do we need YOU?'.
youtube 2025-05-29T15:5…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           liability
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxBQbNc838oa21Br3Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzpeW-bZAvC7FJ6YfV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugxzmvq8gBrhqpGC1oZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFougGRJTMTmcFJ_x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyEeH6t6fkiRa7pj2B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgzzxvDm4tc1B_a5Ag14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy-BCUL05nJ9lOXTaN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyQgd9o98o0BrIygeZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyIzmvMbpKwjchE37B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwpXIBdEXKCYDvK6ER4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
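The coding-result table above is the entry from this JSON array whose id matches the comment being inspected. As a minimal sketch (assuming the raw response parses as a plain JSON array, and using a shortened raw string for brevity), the lookup could be done like this:

```python
import json

# Raw LLM response: a JSON array of per-comment codings (abridged here;
# the full array appears above).
raw = (
    '[{"id":"ytc_UgzpeW-bZAvC7FJ6YfV4AaABAg",'
    '"responsibility":"user","reasoning":"deontological",'
    '"policy":"liability","emotion":"indifference"}]'
)

codings = json.loads(raw)

# Index the codings by comment id so any comment's coded
# dimensions can be looked up directly.
by_id = {c["id"]: c for c in codings}

coding = by_id["ytc_UgzpeW-bZAvC7FJ6YfV4AaABAg"]
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# → user deontological liability indifference
```

These four values are exactly the ones surfaced in the Dimension/Value table for this comment.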