Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "A few points: Model collapse is not a given, there are methods to avoid it in re…" (`ytc_Ugw8n7DHN…`)
- "messing with chatgpt is cool but when it comes to serious stuff, Winston AI help…" (`ytc_UgwMMnNc9…`)
- "That's why I like when companies have tags for AI Art so people that enjoy that …" (`ytr_UgwiynEde…`)
- "Humans also unconsciously take inspiration from thousands of artworks they have …" (`ytc_UgzPOV4aX…`)
- "The problem is... it's going to see us as a parasite! a self learning pc progra…" (`ytc_UgziS1BmW…`)
- "New technology usually frees us from the mundane but AI seems to push people tow…" (`ytc_UgwWL-kou…`)
- "Can artificial intelligence ever feel is what your saying when you mean can it b…" (`ytc_UgzE1c5of…`)
- "Welp, what's the point of trying for views? Even CNBC is trying clickbait. But s…" (`ytc_Ugg4WLKka…`)
Comment

> 5:06 I'm interested how that argument goes. It certainly is not self evident. I agree that AI training and Human training happen differently. I'm interested what's the basis for this. How does speed meaningfully change the equation when the question is about needing a license to use images for training. Method sounds more plausible to me, but still I'm curious what's the underlying reasoning.

Source: youtube · Posted: 2023-02-08T19:1… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwkISsAFWDuE_W6Tp94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugz0nMCsZ5qsaSsqOLF4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwCAHQl85_8MERagNB4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugwz-OSyXvTE9zP6sQ54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxYD6kNGs1WgFXT7Ft4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw-rJdsJX8T2amk2VN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwOuE2TAKX8naynOuR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzCixYgW9X2SQXMDtV4AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxJdshDz-DIhcdIUDd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy1ISGTtEQQOWOyAbZ4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
```
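A raw response like the one above should be validated before the codes are stored, since a model can emit labels outside the codebook. The sketch below is a minimal validator, assuming codebooks inferred only from the values visible in this dump (the real coding scheme may define more or different categories); the `CODEBOOK` dict and `validate` helper are hypothetical names, not part of the tool shown.

```python
import json

# A single-row sample in the same shape as the raw model output above.
RAW = """[
  {"id": "ytc_UgwkISsAFWDuE_W6Tp94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]"""

# Assumed allowed values per dimension, inferred from this dump only.
CODEBOOK = {
    "responsibility": {"company", "government", "developer", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "ban",
               "none", "unclear"},
    "emotion": {"resignation", "mixed", "indifference", "approval",
                "fear", "outrage"},
}

def validate(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

rows = validate(RAW)
print(len(rows))  # → 1
```

Rows that fail validation can then be re-queued for a retry rather than silently coded as `unclear`.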