Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Me too and I have zero desire to use AI. I want a normal world.…
ytr_Ugwx_Y9Vh…
If AI takes away all these jobs and you have massive unemployment whos going to …
ytc_UgxZfqQub…
He immediately jumps on the idea that it's bad if you get rid of the guy in char…
ytc_Ugyg3Dw03…
@HuhAundre yeah it’s scary that one day AI can literally copy every single thing…
ytr_UgyThZ0r0…
I've been testing/using FSD for 2 years. I've never had anything like this happ…
ytc_UgzrWJsts…
Brother these people are lazy clowns. I like AI as a Software Developer but befo…
ytc_Ugx2U5Mrh…
Honestly if anyone saw my chats with chatGPT they'd probably just uninstall it i…
ytc_UgzUqWb19…
Dawg, you'd need all of the resources in the whole world to get the computing ca…
ytr_UgzlfPVve…
Comment
If AI is unethical for learning from a book it has “read” without permission, then so is any author’s work if they ever read a book without buying it. The only difference is speed.
And just as it’s illegal if a human reads Sanderson and directly copies a paragraph, intentionally or not, so is direct copying by an AI; but learning should not / cannot be copyrightable.
For an AI it’s like saying the mathematical algorithm implicit in my work is copyrightable, which is just not true; otherwise anyone that ever borrowed a book cannot be an author.
Edit: Also, regarding the unique humanity of art: even if AI cannot have purpose itself, it can absolutely create purpose in people who see it, which is the definition of art.
youtube
2025-06-25T19:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
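The coding result records one value per dimension. A minimal sketch of representing and range-checking a coded row; the dimension names come from the table above, while the sets of allowed values are an assumption inferred from codes observed elsewhere on this page, not an authoritative codebook:

```python
# Allowed values per dimension: an assumption inferred from codes seen on
# this page, not an authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate(code: dict) -> list[str]:
    """Return the dimension names whose value falls outside SCHEMA."""
    return [dim for dim, allowed in SCHEMA.items() if code.get(dim) not in allowed]

# The row shown in the table above.
code = {"responsibility": "none", "reasoning": "contractualist",
        "policy": "liability", "emotion": "approval"}
print(validate(code))  # an empty list means every dimension is in range
```

A check like this is useful before accepting a batch of model codes, since an LLM coder can emit values outside the codebook.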
Raw LLM Response
[
{"id":"ytc_UgyKErYNqX73c7I588F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPMpjyiMyDocshP4J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzZxN7RHWwNbibgtfN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9lZZokoP0zPhyrQ14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugymj3n78rc2FPAy1GJ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzgXrN6SdxcqiKOmst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgQDS6FU90xj1mp594AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1y__2hcEmzVOiFo94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzls8MAq3R5Gyz_z4l4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxQ41SlzakgBZE8_4R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
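The raw response above is a JSON array of per-comment codes. A minimal sketch (assuming the response parses as valid JSON and each object carries the five fields shown) of indexing it by comment ID to look up one row:

```python
import json

# A one-element excerpt of the raw response above, standing in for the full array.
raw = '''[
  {"id": "ytc_Ugymj3n78rc2FPAy1GJ4AaABAg", "responsibility": "none",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
]'''

# Build an id -> code mapping so a single comment's codes can be fetched directly.
codes = {row["id"]: row for row in json.loads(raw)}
row = codes["ytc_Ugymj3n78rc2FPAy1GJ4AaABAg"]
print(row["reasoning"], row["policy"], row["emotion"])  # contractualist liability approval
```

Keying by `id` is what lets a display like the coding-result table above be joined back to the original comment.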