5 Comments
Andrew Smith:

What are the downsides to not understanding the kernel? I see this happening everywhere, not just in CS and related fields - like how nobody knows how to write cursive anymore, and pretty soon maybe nobody will remember how to make letters with a pencil. Before that, we sort of phased out memorizing long passages, etc.

With each of these, and over a long enough time frame, the trade-off was worth it; but in the shorter term, I reckon there were some very disruptive moments. I'm guessing the lack of folks understanding how these core languages actually work will lead to some surprises, but also that we'll get past this awkward, temporary phase where it hurts us more than it helps us.

Charlie Guo:

Andrew - thank you for asking this question! I started writing an answer, and before I knew it I had essentially written an essay. My initial thoughts are below, but I'll be refining them later into a full piece:

Historically, we've shifted from punch cards to assembly to C to Python, and certainly at each step there were people who felt we were losing something by working in these higher-level languages. So I ask myself a two-part question: is "vibe coding" materially different? And if so, does it matter?

I think the first question is easier to answer: yes, it is materially different (in my opinion). One of the things about moving from assembly to C to Python is that all of those systems were still *deterministic*, bugs notwithstanding. Sure, as you moved higher up the ladder of abstraction, you stopped worrying about certain things: memory management, as an example. But the abstractions still had *rules*. You still had to learn and reason about how scoping, or immutability, or loop invariants worked. And while a few of these things are language-specific, the bulk of them are foundational to being a software engineer - it's why senior developers are generally expected to be able to quickly become proficient in similar languages.

As LLMs stand today, I don't think that's true anymore. I don't believe you *need* to learn programming fundamentals to get a working application shipped; you just need to be able to precisely describe what you want the application to do. In some ways, that's amazing! You've now dramatically lowered the barrier for people to build their own software, and I fully expect to see a big rise in bespoke and/or throwaway software for individuals (as an extreme example: imagine being able to describe a mobile app, then your phone creating it for you on the fly).

BUT: as LLMs stand today, I don't think that complex, novel software can be built by vibe coding alone. Even as models get better at writing syntactically correct code, there are still many, many ways to introduce subtle logic errors into code that is perfectly correct from a syntax point of view - ever tried to build something that juggles timezones?
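
To make that concrete, here's a minimal Python sketch (the booking scenario and dates are invented purely for illustration) of code that is syntactically flawless but logically wrong:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library since Python 3.9

NYC = ZoneInfo("America/New_York")
UTC = ZoneInfo("UTC")

# A booking that should expire exactly 24 hours after it starts,
# created the day before the US spring-forward transition.
start = datetime(2025, 3, 8, 9, 0, tzinfo=NYC)
expiry = start + timedelta(days=1)  # reads as "24 hours later"

# Python performs this arithmetic on the wall clock, so within the
# zone everything looks fine...
print(expiry - start)  # 1 day, 0:00:00

# ...but the clocks jumped forward overnight, so only 23 real hours
# have elapsed. Converting both ends to UTC exposes the lost hour:
print(expiry.astimezone(UTC) - start.astimezone(UTC))  # 23:00:00
```

Every line parses and runs without complaint; the bug only surfaces as a customer quietly losing an hour. That's exactly the kind of error a model can generate with complete confidence, and which a vibe coder has no vocabulary to even describe.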

There's also the fact that human language is incredibly imprecise - I've argued in the past that the hard part of programming isn't hitting keys on a keyboard, but correctly translating business requirements into code. Couple that with the fact that LLMs add their own form of imprecision, and you can see how using LLMs to work with large codebases (as a novice engineer) becomes a nightmare. Because you don't have the language to precisely specify what you want changed, each message alters some property of the underlying system that you aren't fully able to grasp.

Which brings us to the second question: does it matter? Yes, for the moment (which you've already alluded to). There are a couple of ways to work around the problem I described above, both from the individual and the industrial level. If/when those things happen, then I'm guessing we'll be fine(ish).

At the individual level, we need to build better software tooling so that we're free to use LLMs without fear of breaking things. I'm not entirely sure what that tooling should look like, but I know direct manipulation and other age-old UX patterns are likely to play a role. Not being able to truly "see" - and, more importantly, validate - the changes to your codebase (beyond red/green diffs in a text editor) is a big bottleneck to achieving something that's truly "no code."

At the industrial level, if the way we build software changes entirely, then it's likely a moot point anyway. Today, most teenagers have no concept of a file system, let alone an operating system. They see software in terms of apps, photos, and settings. Maybe we're going to see programmers make a similar shift: do we, as a society, put up with buggier, shoddier software that's more custom and more broadly accessible? As a very rough analogy, perhaps old software is like a hand-made precision watch. If society is offered a cheap, plastic alternative that gets the job done for now but isn't built to last, will it take it? I don't know for sure, but I wouldn't bet against capitalist economies opting for cheaper goods, even when they're lower quality.

This was probably a lot more than you were bargaining for, but as it turns out I have a lot of opinions on this. Thanks again for asking!

Andrew Smith:

Great answer, and that's just why I asked the question! I'll get into the weeds with you on this when you publish whatever you end up with, though there's plenty of food for thought here. I keep coming back to a central tension: on the one hand, each of us has limited mental capacity, so it makes sense to offload tedious skills and leave room for others rather than waste that finite bandwidth/memory; but on the other hand, there is tremendous value in learning to do something tedious. I don't mean the end result, but the journey itself - that sort of journey teaches you that you can solve problems when they arise, and shows you one way to do it through a particular lens. That's a baby that might end up thrown out with the bathwater here.

Charlie Guo:

Ah yes. I think that idea applies to a lot of AI developments, not just the programming-related ones. I have a half-baked essay lying around titled "The effort is the point," which perhaps someday I'll end up publishing.

Andrew Smith:

Oh, and also - literally nobody on earth can make an iPhone from scratch, but we don't lament that much as a society, do we? I wonder if information layers end up being similar... but boy, is that disconcerting.
