Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.

Random samples:
- "its your personnally choice to use Ai , so maybe dont use Ai , and that may be a…" (`ytc_Ugzw9uN7b…`)
- "I think they will find they way around by making a program that reverses Nightsh…" (`ytc_Ugw1NEtEo…`)
- "What is the point of AI, is it just to make a very small number of people extrem…" (`ytc_UgxzM7eTQ…`)
- "On the other hand, this ruling isn’t ideal for artists either. If the refusal ha…" (`ytc_Ugxouhlxr…`)
- "I may not be a artist but as someone who does Photography I couldn’t stand havin…" (`ytc_UgwYBIPO-…`)
- "I believe this guy is so smart, he's an idiot. The moment you remove the "specia…" (`ytc_UgzzRAhB-…`)
- "Humanity was created to create AI. There is other AI in other planets that may…" (`ytc_Ugx93Xqni…`)
- "the better question is What can you do to stop every single company on every sin…" (`ytc_UgxMZwh5C…`)
Comment
One comment you made caught me by surprise. You said something along the lines of "we will still have large code trees that programmers will need to navigate". I disagree there. I don't think a programmer's job (in the long run) will be to look at the code the AI made and "fix" any issues with it. Instead I think coders will become more like business analysts that understand the requirements and maybe a little bit of code, but don't actually do any coding. Once you go there, you will no longer need a programming language that's readable by humans. Things like OOP, multiple files in a tree, and even assembly itself will become unnessesary and possibly a useless complication. The AI can more easily generate some sort of byte code that is executed directly or in a runtime instead of text in a file.
Source: youtube · AI Jobs · 2024-01-17T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwR60PDnaiWnx-ljJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTsA7W6T-ATyUL9Kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6xYwhRM_1DGv0hqZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf1pd0c_Fo8acHMB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzwQZ-rD_MrTC_QgLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzj_e--UqYRk3wfj2Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxsm3ZVt_l2uCx57Jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDI2OH_Eaz3Fos8U94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaW59kXLpIWrSkCPx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxD0jkN5QdxTHZP0ep4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
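Since the raw response is a JSON array of per-comment records, looking up a code by comment ID amounts to parsing the array and indexing it by the `id` field. A minimal sketch of that lookup (the function name `index_by_comment_id` is illustrative, not part of the tool; the two records are copied from the response above):

```python
import json

# A raw LLM response in the format shown above: a JSON array where each
# element codes one comment on four dimensions.
raw_response = """[
  {"id": "ytc_UgwR60PDnaiWnx-ljJx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxaW59kXLpIWrSkCPx4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxaW59kXLpIWrSkCPx4AaABAg"]["policy"])  # prints "ban"
```

In practice the parse step would also need to handle malformed model output (e.g. the array wrapped in markdown fences or prose), which `json.loads` alone does not cover.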