Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwagDaBV…` : Easy fix dont include race when you imput a data set in an ai,or hear me out don…
- `ytc_UgwXuKmUl…` : What a fucked up time we're living in virus to kill off the masses making AI the…
- `ytc_UgyvqCRC2…` : The "it's actually taking the data" argument is braindead. People take the data …
- `ytc_UgyuWqPZ_…` : I am highly sceptical that AI will be dangerous, specifically because, in order …
- `ytc_UgwlFa82v…` : There's no such a thing as AI Art. It's not art. Let's call them AI images, that…
- `ytc_UgwVgPXBH…` : Large Language Models are not capable of logical thought of any kind. All LLMs w…
- `ytc_UgyM7tYPf…` : I have a question regarding AI generated texts, articles or stories: Why the fuc…
- `ytc_Ugz0FekmZ…` : Remember when they used all the ring cameras in the country to find a lost dog? …
Comment
"Context engineering" does not work. You can have say 50 debugged library functions and then ask AI to write main.c using those libraries (your context) from your prompt, but it can never write it. Software engineers can write it. Serving AI with context engineering and prompt engineering is just a bad unworkable time and resource wasting idea. AI should remoulded by software engineers as tools and AI should skip the hype and stick being used as tools by software engineers.
youtube · AI Jobs · 2026-02-24T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxmld8P1setEUiV15B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_MT8fUWaWP73_BRd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzE-Hh3JvScxvadqrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzGWS7YRGW-MYmvZTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzEDNrQ4kIPrzam6UN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyzUJHbNSDFJlo8l-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUyD1mD3rcZ8bqYEJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxD6JUdLcrm8QFBofV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzphHG2u01eZwccsph4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkPD8Fiwo9SLQ3eox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
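The "look up by comment ID" view above can be reproduced from the raw model output directly. The sketch below is a minimal, hypothetical example: it assumes the raw LLM response is a valid JSON array of objects with the fields `id`, `responsibility`, `reasoning`, `policy`, and `emotion` (as shown above); the `lookup` helper and the inlined single-row sample are illustrative, not part of the tool.

```python
import json

# One row copied from the raw LLM response above, inlined for illustration.
# In the real tool this string would be the full model output.
raw = """[
  {"id": "ytc_UgzphHG2u01eZwccsph4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "none",
   "emotion": "outrage"}
]"""

# Index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment ID, or {} if absent."""
    return codes.get(comment_id, {})

print(lookup("ytc_UgzphHG2u01eZwccsph4AaABAg")["emotion"])  # outrage
```

This mirrors the coding-result table for the selected comment: the same ID resolves to `responsibility=developer`, `reasoning=deontological`, `policy=none`, `emotion=outrage`.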