Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples:

- "I of course have no legal background but it dont seem much different than with p…" (ytc_UgyaX_QzR…)
- "I never thought of that scenario, what if is really is true and i am sure AI if …" (ytc_UgxZk8fKI…)
- "This is such B.S I have a genuine issue with mental health and drugs and all AI …" (ytc_UgxmltoEP…)
- "I asked same question, please view the below conversation. Hey ChatGPT, you hav…" (ytc_Ugx0_Cu9x…)
- "Do you guys know how to distinguish real art from AI slop? Ai draws backgrounds …" (ytc_UgyAnWNsp…)
- "Im careful with calling ai art ugly since that mesh of nonesense was someones st…" (ytc_Ugwxz86hj…)
- "Practical real learning??? Not just regurgitation of pop quiz facts? Dang actual…" (ytc_UgxECoGGq…)
- "You need to adjust your moral compass. You owe us a huge apology for what you di…" (ytc_UgyCl1-aZ…)
Comment

> Because the AI isn't as useful as people thought it would be. The good use cases are the same they were in 2023, the scaling argument won't work. This is a failure and it won't succeed. Time to not build more data centers and rethink the whole research branch, because this is the limit of what the LLM path will give you. My suggestion: back to the basics, why do biological systems can do stuff without separating software and hardware and how to process stuff analogically. Our analogic technology has to evolve, and not be constantly turned digital. We are clearly missing something there.

youtube · AI Jobs · 2026-02-10T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxncSomFDtuIo0-weV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyH6oQDU2Ozipx71UJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwh6_v21TXRFwCB-y94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzXJI6itbYhB6plLvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxX7y0qJE2DncfTjxd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMgCYwYJjtzws5vhx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwRY9yOdd8PMBn6VB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxlOUZizhSSCfnMm1J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzjkYxP5Y_kD0Rmxp94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
  {"id":"ytc_Ugymhxyhy7jyF2kCTnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
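Because the model codes comments in batches, showing a single comment's result means parsing the raw JSON array and selecting the row whose `id` matches. A minimal sketch of that lookup, assuming the batch format shown above (the `lookup` helper and the abbreviated two-row `raw` string are illustrative, not part of the actual tool):

```python
import json

# Abbreviated batch response in the format the model returns
# (two rows copied from the example above).
raw = """
[
 {"id":"ytc_UgzjkYxP5Y_kD0Rmxp94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
 {"id":"ytc_UgxncSomFDtuIo0-weV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def lookup(raw_json, comment_id):
    """Parse a batch response; return the coded row for one comment ID, or None."""
    for row in json.loads(raw_json):
        # Skip malformed rows instead of crashing on a partial model output.
        if not REQUIRED_KEYS.issubset(row):
            continue
        if row["id"] == comment_id:
            return row
    return None

row = lookup(raw, "ytc_UgzjkYxP5Y_kD0Rmxp94AaABAg")
print(row["policy"], row["emotion"])  # ban mixed
```

Tolerating malformed rows matters here: a model can emit a truncated or key-missing entry mid-batch, and one bad row shouldn't make every other comment in the batch unviewable.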