Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Two hour learning block no wonder why we are falling behind... better off teachi…" (`ytc_UgzoM292b…`)
- "Well, the question is, if all the companies will have enough natural resources (…" (`ytc_UgyO2Bzxx…`)
- "Art has no solid definition, and trying to gatekeep it is a futile effort. Drip…" (`ytc_Ugx3uUwFA…`)
- "Prediction 15 years from now we will see AI get better but it WON’T bring the ap…" (`ytc_UgxZs3CIy…`)
- "Reminds me of those kuva rollers in Warframe...pray that robot wheel doesn't act…" (`ytc_Ugzwbuh-v…`)
- "In controlled testing and experimental simulations, some advanced AI models have…" (`ytc_UgzVZLYnT…`)
- "that was horrible AI writing though, with some human review and better prompts, …" (`ytc_UgzTPJrzj…`)
- "AI is not the real issue now, it's a cover for mass outsourcing to India.…" (`ytc_Ugyd3Eli6…`)
Comment

> Except that most professors don’t cheat, whereas today a relatively big chunk of students are cheating. The equivalences there are misleading, in my opinion. Academic fraud is there for decades and maybe even centuries, so is cheating in general. Navigating through college with generative AI is new and it’s effects are so deteriorating to the human mind that we can see them in action a couple of years after it’s inception. In short, the average AI-guided student will be so dumb and stupid that he or she won’t even be able to cheat as academic, because even this requires a relatively high level of intelligence, not say even able to be accepted in grad school.

youtube · 2025-08-20T14:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxAEZL1YfYq_4ZDaF14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxqlm6iYn-iYLPYtSt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH_Et3s2ie8lbiYZ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyea2M3j0UqaHnCRll4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw1f2-zCM5tP1U1WTh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwLU5RQ9CaFts69rXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxa6sFa9tmsGEUTB5R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8wm1woGS1aWP0YhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzYv0zag3953rStRPR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwoPuGHjT95QBpcx14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
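The comment-ID lookup described above can be sketched in a few lines of Python, assuming the raw model output is available as a JSON string like the one shown. `lookup_by_id` is a hypothetical helper for illustration, not part of the tool itself:

```python
import json

# One row copied from the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugx8wm1woGS1aWP0YhN4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

def lookup_by_id(raw, comment_id):
    """Parse a raw model response and return the coding row for one comment ID."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # the model output was not valid JSON
    return next((r for r in rows if r.get("id") == comment_id), None)

row = lookup_by_id(raw_response, "ytc_Ugx8wm1woGS1aWP0YhN4AaABAg")
print(row["policy"])  # -> regulate
```

Because a model can emit malformed JSON, the parse failure is caught and surfaced as `None` rather than allowed to raise.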