Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by choosing one of the random samples below.

Random samples
- `ytc_UgzeyIZa0…` — "The problem with LLM's is that they draw exclusively from past human thinking, w…"
- `ytr_UgzC9rE-B…` — "Then why we don't see kind of issue fir airliners? Tesla design for the general…"
- `ytc_UgyMCb9dH…` — "Ai doesn't have the ability to create characters or creatures that show emotion,…"
- `ytc_UgyKePt34…` — "Try it in driving rain, mist or snow or even the dark. Those are perfect conditi…"
- `ytc_UgwSl8stZ…` — "I am a fan of SciShow, but as an AI engineer myself, this is a TERRIBLE video. N…"
- `ytc_UgwV4ndCn…` — "One real problem in US AI research is that companies are pushing deregulation to…"
- `ytc_UgyaOyuLd…` — "I mean you are using co pilot at least use something like Claude code or Windsur…"
- `ytc_UgyK4B0Fj…` — "Nah wait but didnt you do a whole collab reel with an AI art generative software…"
Comment
I worry they will be indeed utterly obsolete within 2-5 years. Tech companies have proven consistently that they have no problem pushing out crap code full of spaghetti and undectected logic holes and business-breaking bugs - which AI can generate cheaper and faster than humans. So what if nobody understands the code vomited forth? Just upsell the customer sheep to the next version ("now with more iArse™"), also written by AI. There will be a brief intermediate period where humans can serve as debugging slaves and test monkeys - aka jobs that humans most hate doing already - but that will soon also be taken over by AI.
We're staring a dark future square in the face where inane, shitty and inscrutable code generated by <20 IQ managerial class drones who will call themselves "AI wizards" or some shit will drive society. Trust the computer. The computer is your friend.
Platform: youtube · Topic: AI Jobs · Posted: 2024-01-27T19:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy286heUe3_Az-w9eJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7F1OVEGVDN2AMCH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugye24BxSj3X2RNVmlN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUrfrUrFn2SGqva854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyA-l7TtBIrtP_kfjJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5yqV2zaJV7in88FN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzY54zUsqdR5L7126t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzzT6x9AQEmlE4lukl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxdyJOwIi-JJUmr7q54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGuZzcmIhKoL1iglt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
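The lookup-by-ID view above can be sketched as a small parser that reads one raw batch response, validates each row against the codebook, and indexes the codings by comment ID. This is a minimal sketch: the allowed value sets in `DIMENSIONS` are inferred only from the responses shown on this page (the real codebook likely defines more categories), and `parse_batch` is a hypothetical helper, not the tool's actual code.

```python
import json

# Allowed values per coding dimension -- inferred from the sample responses
# on this page; the real codebook may include additional categories.
DIMENSIONS = {
    "responsibility": {"none", "company", "ai_itself", "mixed"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Reject any value the codebook does not define.
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: invalid {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

# Example: the row that produced the Coding Result table above.
raw = ('[{"id":"ytc_UgzY54zUsqdR5L7126t4AaABAg",'
      '"responsibility":"company","reasoning":"consequentialist",'
      '"policy":"regulate","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzY54zUsqdR5L7126t4AaABAg"]["emotion"])  # -> fear
```

Validating before indexing means a malformed batch fails loudly at ingest time rather than silently polluting the coded dataset.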