Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I wouldn't wanna mess with AI no more our human bodies are natural but AI is a t…" (ytc_UghveoVOf…)
- "The downsides of AI include bias in algorithms, privacy concerns, job displaceme…" (ytc_Ugy4izhOe…)
- "Matches what ive learnt over 30 years of trying understand this as a hobby, but …" (ytc_UgwR9uUj0…)
- "Makes it easier to send NSFW photos….it they get spread/leaked just say it’s a A…" (ytc_UgyG-ia7W…)
- "We can feel the desperation for him to dominate and manipulate an AI, especially…" (ytc_Ugyzu0nuw…)
- "there is no man made climate change! CO2 is not a pollutant; it is the molecule…" (ytc_UgxCrZnwj…)
- "Just imagine cars having been driven automatically caused so much of an accident…" (ytc_UgwC_lq9I…)
- "I understand your point of view, theft is theft. Hopefully the issue can be reso…" (ytc_Ugz3kXrhO…)
Comment
I only use AI for:
1. Single-line autocomplete
2. Simple repetitive tasks, like translating/restructuring a large JSON or rewriting a simple template into a different templating language
3. Generating boilerplate
4. Sometimes, doc comments
That's all I've ever found it useful for. The common examples of what coding AI can do like simple algorithms and such are not something that I write for real-world codebases, and for anything specific it writes something between a suboptimal solution and complete nonsense about 99% of the time.
Also, I thought it was obvious that if you don't check the code and just assume your AI wrote it correctly, you will definitely spend more time debugging your code than it would've taken you to write it yourself.
youtube · AI Jobs · 2024-06-15T01:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxskXcdCu_2uOcu__J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXLBGJSGG5SQRgBuV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGOGKyGPg5SbkHEFt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMHlI0WF1DeQMIqZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWq-_Wo5JEzmrQ5b94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzgO6JckVaR-nCJ1bp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyFM31JXwiQc8R5eWd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgztffuPX7mtnNb1duB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx14M1hbd7Gz3yZLi14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEUZBVOGudLiOenCF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
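A raw response like the one above is a JSON array of per-comment codes, one object per comment ID. As a minimal sketch of how it might be consumed downstream — `parse_codes` and the `REQUIRED` field set are hypothetical helpers, not part of this tool; the two example rows are copied verbatim from the response shown:

```python
import json

# Raw LLM response: a JSON array of per-comment codes (two rows copied
# from the response displayed above).
raw = '''[
  {"id": "ytc_UgxskXcdCu_2uOcu__J4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzgO6JckVaR-nCJ1bp4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

# The four coded dimensions shown in the "Coding Result" table, plus the ID.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text):
    """Parse a raw response, keeping only rows with every coded dimension."""
    rows = json.loads(text)
    return [r for r in rows if REQUIRED <= r.keys()]

# Index by comment ID for the "look up by comment ID" view.
codes = parse_codes(raw)
by_id = {r["id"]: r for r in codes}
print(by_id["ytc_UgzgO6JckVaR-nCJ1bp4AaABAg"]["emotion"])  # mixed
```

Dropping rows that lack a required field (rather than raising) is one plausible policy; a stricter pipeline might instead flag malformed rows for re-coding.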