Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click one to inspect):

- `ytc_Ugz-fntdk…` — "lovely Episode, Anunnaki probably had the same talk when they created simple wor…"
- `ytc_Ugy1hl-wE…` — "Unfortunately, i can't draw. I have a thing called total aphantasia - meaning i…"
- `ytc_UgzT1G3nt…` — "No no no no, this isn't chatgpt's fault. This is absolutely the student's fault.…"
- `ytc_UgghxHIY7…` — "They gonna become self aware and start a robot cult and kill us all. #skynet…"
- `ytc_UgxDJr0vN…` — "AI is a threat, there was this little documentary series called... Terminator. T…"
- `ytc_UgyAZnBfA…` — "A man has been crushed to death by a robot in South Korea after it failed to dif…"
- `ytc_UgwwAcpo5…` — "I've known for a while this was inevitable... Doctors in general are in serious …"
- `ytr_UgywiWUN2…` — "@carultch Your argument is misplaced about fair use and the nature of AI work. F…"
Comment
I have a prediction.
I think that coding will adapt to the constraints of AI, and the result will be very interesting. For example, I imagine small codebases that are very modular, so that the AI can always fully contextualize them - something like functional programming, constructing a large application out of many small ones.
I am currently theorizing about how to optimize a standard Terraform codebase to work seamlessly with AI and unit testing, and that's exactly where my mind goes: small, almost or fully independent chunks, automated to work together.
Source: youtube · 2025-03-12T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwyDCMH30kGqHFdJRF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyw39MhL7MblnXhoph4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwBQ4CCTJxaIPIGd7B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2fxktcWSc3zGfvpF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8URsZJD1xOiKplxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypeNZ8b-cvL2Bp3DN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyV4YjUP4E-Ky7TYkt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwn-Wn2cjP3ZeUj5Bd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwztBAjbt3luKApKU94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwTOdj0llFDtxHvMsJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
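A raw response in this shape can be parsed and validated with a short script. The sketch below is illustrative, not the project's actual pipeline: the dimension names come from the coding-result table above, and the allowed value sets are inferred only from the examples shown here, so the real codebooks may contain additional values.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the full codebooks may define values not shown here (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "outrage", "resignation", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    rejecting rows whose values fall outside the known codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row batch for demonstration:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
batch = parse_coded_batch(raw)
print(batch["ytc_example"]["emotion"])  # approval
```

Indexing by comment ID makes it easy to join a coding back to its source comment, which is what the "look up by comment ID" view above appears to do.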