Vivek Haldar

Innovator's dilemma in programming languages

(A follow-up to this post.)

What is the job that programming languages are hired to do?

Here is a partial list of criteria for picking a programming language:

  • Technical: the language is superior along some technical dimension. It might have great support for parallel message-passing applications, for example. Unfortunately, most programming language debates stay stuck here.
  • Economic: you run large applications on humongous numbers of machines, which cost a gargantuan amount of money. You would very much like not to waste any of that hardware, to utilize it as fully as possible, and to reduce the marginal expenditure as the size of your application, or the number of its users, grows.
  • Business: your code must react very quickly to changing markets and new customer requirements. Maybe the requirements are not clear, and the only way to learn is to release something, see how it does, and then adjust your product. Agility, and reducing the time it takes to implement a change, are of paramount importance; otherwise you will simply be squeezed out by more nimble players.

A large company with acres of data centers and a small startup renting VMs over the net will “hire” languages to perform very different jobs.

Let me propose a new dividing line for programming languages:

  • Sustaining languages: these are the ones commonly thought of as “large-scale industrial languages.” They stress efficient use of hardware resources (CPU, RAM) over personal productivity and succinctness. They are typically used for “stable” applications, where the market and requirements are well-known and slow-changing¹. They have been around for a long time, but a continuous stream of incremental improvements has sustained their position.
  • Disruptive languages: the new kids and upstarts. They stress agility, succinctness, and individual productivity and expressiveness, and are willing to use hardware inefficiently in exchange. They are usually looked down upon by the establishment (i.e., those who work with sustaining languages). They address a very different market: they are typically used at small scale, for products whose requirements are not yet known, and will only be discovered by releasing something, getting user feedback, and iterating.

Currently, C++ and Java are examples of the first, and Python and Ruby are examples of the second.

It is no surprise that C++ and Java came out of large companies, and Python and Ruby came out of individuals scratching an itch.

I call them disruptive languages because, like Christensen’s disruptive technologies, they address a market not catered to by the establishment and are considered to serve the “lower end” of the market.

But disruptive technologies are disruptive because they move “upmarket” and dislodge the incumbents. How might that happen with languages?

What is the tipping point at which a small company should consider re-engineering its software? In other words, when should it stop spending hardware to save people, and start spending people to save hardware? That starts to happen when hardware costs begin to approach people costs. For a small startup, that might be when the hardware bill crosses the annual salary of one developer (or a few). If you look at AWS prices, you’ll see that you can rent a lot of hardware for that much.
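To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The salary and per-instance price are illustrative assumptions I am plugging in, not quoted AWS figures:

```python
# Back-of-the-envelope break-even: how much hardware can you rent
# before it costs as much as one developer? Both figures below are
# illustrative assumptions, not actual AWS prices or real salaries.

developer_annual_cost = 150_000   # assumed fully-loaded cost of one developer, USD/year
instance_hourly_price = 0.10      # assumed price of one rented VM, USD/hour
hours_per_year = 24 * 365

instance_annual_cost = instance_hourly_price * hours_per_year
break_even_instances = developer_annual_cost / instance_annual_cost

print(f"One always-on instance: ${instance_annual_cost:,.0f}/year")
print(f"Break-even: ~{break_even_instances:.0f} instances per developer salary")
```

Under these made-up numbers, a startup can keep roughly 170 machines running around the clock before the hardware bill rivals one salary, which is the sense in which it is cheap to spend hardware to save people.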

But the cost of hardware has a clear trajectory over time – downwards. This means that the aforementioned tipping point will be pushed further out over time. That in turn means that the scale (size of data, number of users, or whatever metric is relevant) up to which you can economically use disruptive languages keeps increasing.
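The same sketch extends to the trend. Assuming, purely for illustration, that hardware prices halve every three years while developer costs stay flat, the scale at which disruptive languages remain economical doubles on the same schedule:

```python
# How the tipping point moves as hardware gets cheaper.
# Assumes hardware prices halve every 3 years and developer costs
# stay flat -- both are illustrative assumptions, not measured data.

halving_period_years = 3

for year in range(0, 13, 3):
    price_factor = 0.5 ** (year / halving_period_years)  # fraction of today's price
    scale_factor = 1 / price_factor                      # growth in break-even scale
    print(f"Year {year:2d}: hardware at {price_factor:.0%} of today's price, "
          f"economical scale x{scale_factor:.0f}")
```

The halving period is a made-up parameter; the point is only that any steady decline in hardware prices pushes the tipping point out geometrically.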


  1. There is a single-digit number of companies in the world that operate at a scale where even experimental products with wildly changing requirements need to use sustaining languages. That is a whole different story. ↩︎