Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_Ugy6llH7y…`: If y'all scared know this self conscious A.I is impossible(stated by scientists)…
- `ytc_UgxJcwz3Z…`: Work Smarter,.not Harder. / Trucking no doubt is a hard job but if the free market…
- `ytc_Ugwh_v8jP…`: There is one major flaw with hiring AI, with less people getting paid, no one wi…
- `ytr_UgznuBOtk…`: The main problem with self driving cars isn't even technical or the fact they ar…
- `ytc_UgzUubDtK…`: "How do I lubricate my xxx-robot?" Oh my 😂 But a really good question 🤔🤣…
- `ytc_Ugzruq71z…`: A.I. reminds me a nuclear bomb - once you know it can exist, you must get there …
- `ytc_Ugz7h0g_s…`: Its Sentigenity to be kind to the AI / A state of being that encompasses awarene…
- `ytc_UgzOtXKzT…`: Similar story to the 2008 crash. Our political leaders should have learned but t…
Comment
As a developer(and by no means an amazing one) myself, this rings true. I am not surprised that people who know jack shit about programming and/or coding are skeptical of this video, but it's true. If you're not able to test and debug, you can't know if something really works or not. That doesn't mean you can't use AI to code, but you should have at least an understanding of how that code works, and how to break it down if you need to. If you don't, then you're going to have problems sooner or later. You don't have to believe this guy lol, you'll find out soon enough if you're vibe coding and don't know anything about coding or the various misc. tasks that go along with it.
Platform: youtube
Posted: 2026-04-23T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw_6zoByLYwC0ypuC94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwJVY-F0y80QYbJJrR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxVWB47VRbLpDujJmt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwJtgIzB7UPd-SwErt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyhLl7LT9KNW3kCoA14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwfB-KoYZPbRNiHqSV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzqEH6sB94yzVdWh4l4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwQlFfXIWiOz-thbLF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy7sTSjRAHMk8TUhoB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwkitdJCeJnyKGo-E14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "approval"}
]
```
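A response in this shape can be parsed and validated before the codes are stored. The sketch below is a minimal example, assuming the allowed labels per dimension are exactly those visible in this dump (the real codebook may define more categories), and using a single hypothetical record for illustration:

```python
import json

# Allowed labels per coding dimension, inferred from the values visible in
# this dump -- an assumption; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records),
    validate every label, and return a mapping from comment ID to codes."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} label {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, used as a quick check.
raw = ('[{"id":"ytc_UgyhLl7LT9KNW3kCoA14AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgyhLl7LT9KNW3kCoA14AaABAg"]["emotion"])  # approval
```

Rejecting any record with an out-of-vocabulary label (rather than silently mapping it to `unclear`) makes model drift visible: a renamed or hallucinated label fails loudly instead of polluting the coded dataset.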