Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- rdc_lubprxe: "And you can also drink Russian rocket fuel. But why fly it when you can drink i…"
- ytc_UgxHezmYD…: "ITA TOO ;LATE!! A.I. IS.......#EVIL !!!....…"
- ytc_Ugx2UwYPh…: "is this real, i've never seen a robot move on two legs like this robot. they're …"
- ytc_UgyW8fx9J…: "As a professional driver (trucks); yes it is, theoretically possible to have sel…"
- ytr_UgwZ2OZNH…: "The safest roles will be the ones that use AI as a tool rather than compete with…"
- ytr_Ugz2-NFGm…: "I've been testing Copilot to see if it could actually think. So, far it only sh…"
- ytc_UgztUQgkN…: "Okay, but with your claude example, that's an LLM. It predicts the next word, it…"
- ytr_UgwrMqFot…: "Hugging Face is full of open source LLMs. You don't have to use anything the tec…"
Comment
Typing a prompt and letting "AI" make "art" based on that is basically like describing what you want to an actual artist and letting them draw it. You're not creating art, it's not YOUR art, it's a commission. The only difference is that the real artist creates something new while the "AI" only regurgitates the stolen data it was trained on. But of course, the artist would want money for their work (because it's work!), while the "AI" can give you endless re-combinations of parts of stolen images for free (or a monthly subscription depending on which one you use, but either way, you don't pay for the "artwork" itself and can make as many as you want). So it steals from artists twice: First, they steal their art for training data, then they steal commissions. And then tech bros pretend like artists are just salty that they're becoming obsolete, but they're not obsolete: Those "AI" models depend on them, they wouldn't exists without the artists whose stolen works they are built on!
Source: youtube · Video: Viral AI Reaction · Posted: 2025-03-14T18:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgygEnOZZ_mv76Pp45l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugyn7Rqu9TldERWVB5x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugzj6iPDxsP0cvjDFMt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyGQkKpzQ24R33QWBB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyaNc7vbrvW7mODrKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugyfj0dNCddLyihOeMx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxEpmNxrHSygY6Cw-x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugya7IccZTnBYOlIqUt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxFdlqt7uAfM767z9R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwcMRzaxBytmLEq-n54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]
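A coding pipeline like this has to survive malformed model output (e.g. a mismatched closing bracket such as `"]}"` instead of `"}]"`), which would otherwise leave every dimension "unclear" as in the table above. Below is a minimal parsing sketch, assuming the four dimensions from the Coding Result table; the allowed value sets are inferred from values observed in this one response and are not the project's actual codebook.

```python
import json

# One well-formed record in the format captured above (illustrative input only).
raw = ('[{"id":"ytc_UgygEnOZZ_mv76Pp45l4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')

# Assumed vocabularies, inferred from the values seen in this response.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "unclear"},
    "reasoning": {"mixed", "deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "resignation", "mixed",
                "fear", "approval", "unclear"},
}

def parse_codes(payload: str) -> dict:
    """Parse the model's JSON array and index records by comment ID.

    Any record that cannot be parsed, or any value outside the allowed
    vocabulary, falls back to "unclear" rather than raising.
    """
    try:
        records = json.loads(payload)
    except json.JSONDecodeError:
        return {}  # caller can then render every dimension as "unclear"
    coded = {}
    for rec in records:
        coded[rec["id"]] = {
            dim: (rec.get(dim) if rec.get(dim) in values else "unclear")
            for dim, values in ALLOWED.items()
        }
    return coded

codes = parse_codes(raw)
print(codes["ytc_UgygEnOZZ_mv76Pp45l4AaABAg"]["emotion"])  # indifference
```

Indexing by comment ID mirrors how the page joins a sample (e.g. `ytc_UgxHezmYD…`) to its coded dimensions; the "unclear" fallback means a single bad record degrades gracefully instead of failing the whole batch.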