When lambdas make it into C++, it’s fair to say that the functional-languages camp has “won”. A purely functional language may not have broken into the mainstream, but a large number of features that are functional in spirit are now part of almost every mainstream language. Hard-core believers will grumble about the lack of macros, or of other mechanisms that let you treat code as data to be manipulated, but we have still come a very long way.
This is a battle that has been fought since the 60s, and for all practical purposes that chapter is closed now.
So I wonder, what will programming languages look like in 2050? What are the big problems we should be tackling?
I will put on my prognostication (or make-a-wish) cap, and if that makes me look like a fool down the road, so be it.
Better Systems Programming: functional constructs made it into imperative languages, but lower-level systems programming is still stuck in the 70s. In good ol’ C. Languages like Go are trying to change that. But systems programming (by that I mean stuff like operating systems and virtual machines) is still done by a priesthood with an extremely high barrier to entry and experimentation. The only project I ever came across that tried to lower the barrier to experimenting with operating systems was OSKit, from the University of Utah. Imagine the flowers that would bloom if programmers could try out OS ideas with the same investment of effort as writing a Python script.
The current landscape has me pessimistic, for the same reasons Rob Pike laid out in Systems Software Research is Irrelevant: that to even get to the starting line you have to support a huge mass of legacy stuff for compatibility.
One project I like is Microsoft’s Singularity, an attempt at building an OS from the ground up in a safe language.
Treating large bodies of code as raw data for machine learning: Imagine a Clippy for programming. What if your IDE went: “Looks like you’re trying to write a for loop over this array, but you have an off-by-one error. Would you like me to fix it?” Or: “Looking at the signature of this method, here are the most likely methods you will need next to get to the type of the returned value.” That’s your editor understanding the semantics of your language and code base at a deep level.
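As a toy illustration of what one narrow check of that kind might look like today (this is a hypothetical sketch, not any real tool), here is a Python snippet that uses the standard ast module to flag a classic off-by-one: looping over range(len(xs) + 1) and then indexing into xs.

```python
import ast

# Hypothetical example source the "IDE" is inspecting.
SOURCE = """
for i in range(len(items) + 1):
    print(items[i])
"""

def find_off_by_one(source):
    """Return line numbers of `for ... in range(len(x) + 1)` headers."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        # Only look at: for <var> in range(<single argument>):
        if not (isinstance(node, ast.For)
                and isinstance(node.iter, ast.Call)
                and isinstance(node.iter.func, ast.Name)
                and node.iter.func.id == "range"
                and len(node.iter.args) == 1):
            continue
        arg = node.iter.args[0]
        # Match the pattern len(...) + 1, which overshoots the last index.
        if (isinstance(arg, ast.BinOp)
                and isinstance(arg.op, ast.Add)
                and isinstance(arg.left, ast.Call)
                and isinstance(arg.left.func, ast.Name)
                and arg.left.func.id == "len"
                and isinstance(arg.right, ast.Constant)
                and arg.right.value == 1):
            hits.append(node.lineno)
    return hits

print(find_off_by_one(SOURCE))  # → [2]
```

A real assistant would of course learn such patterns from data rather than have them hand-coded, which is exactly the point of the next idea.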
The fun starts when you combine that with the millions of lines of openly available code and bug reports, feed it to a giant machine learning system, and build up machine knowledge about bug patterns, patterns of good and bad code, idioms and best practices (“Looks like you are trying to filter values from this list. A better way to do that is…”).
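To make the filtering example concrete, here is the kind of rewrite such a system might hypothetically suggest (the "before" and "after" below are my own illustration, not a quote from any tool): a manual accumulate-in-a-loop pattern replaced by the idiomatic list comprehension.

```python
values = [3, -1, 4, -1, 5, -9]

# The pattern the system might recognize: filtering values from a list
# by appending matches one at a time.
positives = []
for v in values:
    if v > 0:
        positives.append(v)

# The idiomatic suggestion it might offer instead.
positives_idiomatic = [v for v in values if v > 0]

assert positives == positives_idiomatic
print(positives_idiomatic)  # → [3, 4, 5]
```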
Think of it as coding with an entire datacenter behind you to help.
Hardware will eat software from the bottom: this has been a long trend, and I don’t see it stopping. Common low-level software patterns gradually make it into the chip. Network and graphics cards have long been absorbing higher and higher level functionality into silicon. A particularly exciting recent example is Intel’s take on transactional memory in hardware, hardware lock elision.