Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@Floofeen-wp4pn Lisna' peep, I'm sorry, but damn, I've been trying to figure it out for a week now.. First, you visualize an idea, then try to figure out which model and LoRAs to use. You download them, struggle with your limited vocabulary to describe a complex composition with a detailed character, and blindly tweak CFG, steps, resolution, and hires fix—only to end up with either garbage, artifacts, a black screen, or a CUDA error: "Out of memory."
You start over—this time considering LoRA-model compatibility, optimal settings, and scouring forums for the best prompt structure, optimization tweaks, plugin workflows, sampler choices, and ideal resolution/CFG/steps combinations for your specific setup. You learn how to properly configure hires fix without breaking everything. And only after all that effort do you get a decent result—not the trash you can crank out in Fooocus in 30 seconds while convincing yourself that all AI artists are idiots.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | Viral AI Reaction |
| Date | 2025-05-03T17:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugwcq3zEK1ewuqwSXDp4AaABAg.AHSGJ3mFK-4AHfI2Y-8kP-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_Ugwcq3zEK1ewuqwSXDp4AaABAg.AHSGJ3mFK-4AHf_srEQjrV","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwnPGFPwkI4k8k9FTR4AaABAg.AHSAM4yFSLYAHbRF3E6Yct","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgyZq2oq_dx-IZPYy_d4AaABAg.AHJcEwk6w5dAHN3HuypPND","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyNQB_H-oHofJXP1994AaABAg.AHHH5AAzprKAHLmDC3FQZK","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyYih-Ez_Hbgz7T3Gl4AaABAg.AHFe6yQYmrjAHblu-4naTd","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugx_EuMoZ1AErg2DoLB4AaABAg.AHEojXbzRvfAHGogHGjPoy","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"disapproval"},
  {"id":"ytr_UgwXxfGlGyZqTYdupvl4AaABAg.AHEYCf3tyBTAI4RCvNH-Ol","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwXuA48CoaFMrfqxWp4AaABAg.AHDVaBFJ-XWAHDsdLhMoVY","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgzI7d5MiewoeYNkokV4AaABAg.AHCuUS7JpuJAHDdm9QL9qm","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
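Each raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of the lookup step that turns such a response into a Coding Result like the table above, using the first entry of the response shown here (variable names are illustrative; only the standard-library `json` module is assumed):

```python
import json

# The raw LLM response: a JSON array of per-comment codes.
raw = '''[
  {"id": "ytr_Ugwcq3zEK1ewuqwSXDp4AaABAg.AHSGJ3mFK-4AHfI2Y-8kP-",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "resignation"}
]'''

# Index the codes by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coded dimensions for one comment.
row = codes["ytr_Ugwcq3zEK1ewuqwSXDp4AaABAg.AHSGJ3mFK-4AHfI2Y-8kP-"]
print(row["emotion"])  # resignation
```

The same index supports rendering the dimension/value table for any coded comment by iterating over the keys other than `id`.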