Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugw4PLRMZ… · "Any genre you try to use in simple mode will be transformed in white fucking pop…"
- ytc_UgwPLibkR… · "I'm sorry, but you're taking the whole AI thing too seriously. I for one *love* …"
- ytc_Ugzxv3PY-… · "They're arresting people for crimes they havn't done, but an A.I. has calculated…"
- ytc_UgwFbI3ud… · "What a crock of fearmongering shite. So-called "AI" is still a hypothetical co…"
- ytc_UgwQYaTgk… · "You have no idea. Only the father of the Algorithm Muhammad Ibn Musa Al- Khwariz…"
- ytc_UgxQfFEk5… · "As ai gets smarter, people will become dumber and internet by extension. Then t…"
- ytr_Ugwb632mo… · "insanely long monologue incoming So then graphic design cant be considered art …"
- ytc_UgyTdgExQ… · "Once again another rant/complaint about Capitalism and AI take over WITHOUT ACT…"
Comment
I have a CS professor in my college that I’m in good with and that I like, but last semester broke my moral code.
I don’t use AI to write papers in college, I really only use it as a search engine, but last semester this CS professor decided that all of our assignments he’d get from AI.
We had one paper a week (which wasn’t bad), all on subjects that didn’t even pertain to the class material (which was bad).
Since these assignments became the focus of the entire class, it truly felt like I was learning nothing and/or nothing useful.
Eventually around the 11th week, we found an assignment with instructions that said something along the lines of “attach the assignment here for students.”
This exact same line also happened to be on 3 other assignments.
Whelp that made me start using AI to write my papers.
Source: youtube · 2025-08-01T17:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugy9_i5J1q-clbMUcrl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6IZEQUVDKx3mKXvl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykR-61GwbBaSKeeEN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgySVXVYXsoNNSK57JJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxqsxf2nMax4IjZNjt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy2B82tHnHECCujhPN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugw2Sys4riim0cMcxUx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCYeDV3pOD1joJg594AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxTrgPKziuOsYI2Dtx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylYCwYq_JMVazG4154AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
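A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed values for each dimension are exactly those observed in the records shown here; the project's full codebook may define additional categories, and the function name is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coded records above.
# This is an assumption: the actual codebook may permit more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # a record without a comment ID cannot be stored
        # Every dimension must be present and hold a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Filtering this way means a single malformed record (a hallucinated category, a missing field) is dropped rather than silently corrupting the coded dataset, while the rest of the batch is preserved.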