The Death Of Dynamic Languages Has Been Greatly Exaggerated

Ruby, Python, and JavaScript getting down. That's Perl on the Clarinet, and Lisp is just fine.

A recent post making the rounds claims that dynamic languages are dying. The crux of the argument is that most new languages are statically typed. The people have spoken, and they're crying out for static typing.

This is a mischaracterization of what's happening. Dynamic languages are alive and well. JavaScript, Ruby, and Python remain popular. Not only that, but the longer-term trend suggests that the current infatuation with static typing is a cyclical fad. Dynamic languages will likely see a resurgence in the coming years, partly catalyzed by WebAssembly.

Impossibly Difficult

The original post was apparently inspired by the author's personal frustration with dynamic languages, rather than any objective criteria.

Ruby frustrated me at once. Working in Ruby is fine if you’re just adding a feature on top of the pile of features. All you have to do is add some unit tests, make sure the old ones pass, and run away. But anything else is impossibly difficult.

Impossibly difficult, eh? Someone should notify GitHub. Or the authors of Homebrew. Or maybe Amazon?


He goes on to complain about Clojure because sometimes you have to read the docs. I guess? He doesn't actually mention reading documentation, but it seems like it might help address this problem.

This uncertainty wrecks productivity. Every library function you call, unless you have it memorized, requires hunting down the source or test files for examples on how to use it. Every map you get back requires a println to know what it contains.

Also, DSLs

He goes on to blame Clojure for DSLs, for some reason:

But HTML is the perfect DSL for writing HTML—why replace it for another DSL with your own set of rules and restrictions, and lose the decades of tooling and know-how of every designer on the face of the planet?

I'm assuming, since the title of the article is The End of Dynamic Languages, that the author sees DSLs as some sort of flaw endemic to dynamic languages, not merely a failing of Clojure. This is nonsense.

Type Checking Catches Type Errors

Let's make things a bit easier for the plaintiff, since his thesis resonated with thousands of developers. He's struggling a bit to make his point, but we'll happily stipulate that static typing allows the compiler to detect a certain class of bugs known as type errors. If we pass an Integer to a function expecting a String, the compiler can tell us this before we actually run the code and find out the hard way.

That's a significant benefit, since the only other (reliable) way to discover this would be to write tests that exercise the code path that triggers the type error. And, even then, the bug may go undetected or manifest itself somewhere else entirely. We can do runtime type checking, so we can at least detect the error when it occurs, but that still requires a high degree of test coverage.
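The distinction is easy to see in a few lines of Python (the function here is invented for illustration). With a type annotation, a static checker such as mypy flags the bad call before the program ever runs; the dynamic runtime only fails when the call actually executes:

```python
def shout(message: str) -> str:
    """Return the message upper-cased with an exclamation point."""
    return message.upper() + "!"

# A static checker such as mypy reports, before the program runs:
#   error: Argument 1 to "shout" has incompatible type "int"; expected "str"
# Python itself only discovers the problem at runtime, and only if this
# code path is actually exercised:
try:
    shout(42)  # ints have no .upper() method
except AttributeError as exc:
    print("runtime type error:", exc)
```

Note that the failing call raises an AttributeError rather than something helpfully named TypeError, which is exactly why this class of bug is easier to diagnose when a checker catches it up front.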

No Silver Bullets

That said, type errors are only one kind of error. Null pointer exceptions, anyone? Out-of-memory exceptions? How about logic errors? Security-related errors, like weak encryption? Or intrinsically dynamic errors, like parsing errors? Not to mention that statically typed languages often have casting operators that are easily, and frequently, abused.
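To make that concrete, here is a sketch in Python (the function is hypothetical) of a logic error that sails straight past any type checker. The annotations are fully satisfied, yet the result is wrong, and only a test notices:

```python
def average(values: list[float]) -> float:
    """Intended to return the arithmetic mean, but divides by one too many."""
    return sum(values) / (len(values) + 1)  # logic bug: type-correct, still wrong

# Every type in this program checks out, so a static checker stays silent.
# A one-line test exposes the bug immediately:
result = average([2.0, 4.0])
print(result)  # expected 3.0, but the off-by-one yields 2.0
```

The point is not that type checking is useless, but that the annotated signature and the actual behavior can disagree arbitrarily, and only a test compares the two.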

The Little Green Man In My Head

Developer communities based upon dynamic languages tend to emphasize testing because they have to. But this also results in an increased awareness of the importance of testing, perhaps to a fault, and even to an erroneous belief that thorough tests can catch every error, which, as we've just stipulated, they cannot. However, the inverse may also be true: communities based upon static languages may underestimate the value of testing, relying too heavily on type checking to find errors.

I know that was the case for me. I cannot speak for people using static languages today, but when I switched from C++ and Java to Ruby and JavaScript, back in 2005 or so, I struggled with the emphasis on testing within the Ruby developer community. For years, the compiler had been helping me fine-tune my code. Relying on tests instead felt…dangerous.

You're Not Going Crazy

At the same time, writing code without worrying about the compiler nagging at me was liberating. I could explore ideas freely, without concerning myself with whether I was able to realize them in platonic form. I preferred exploring and then adding tests once a design gelled. The total investment of time was similar, but I was ultimately more productive using dynamic languages. The exploratory phase was crucial for allowing me to make mistakes.

In statically typed languages, type-checking requires that your interfaces be both complete and well-defined. In some domains, this is fine, since we already understand the domain well. But most of the time, and I suspect for most programmers, we are working either in unfamiliar domains or in familiar domains with unfamiliar variations. This is the nature of innovation. Developers who can specify complete and well-defined interfaces, under such circumstances, are rare. And I am not one of them. Being able to rapidly explore variations in a design is often more valuable in practice than type-checking.

Type Analysis > Type Checking

Even better, we can have our cake and eat it, too:

You might be wondering what kind of unholy sacrifices we’d have to make when designing a dynamic language that is amenable to type analysis. The answer, I believe, is that we wouldn’t have to sacrifice much.

Type analysis is possible in most dynamic languages. In combination with type annotation, developers could rapidly explore different design approaches, and then augment their code with type annotations as needed when they were ready to “harden” their code.
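Python's optional type hints are an existing approximation of this workflow, so a sketch there (with invented function names) may help. The same code can be written annotation-free while exploring and then hardened with annotations that an external checker such as mypy verifies, all without changing runtime behavior:

```python
# Exploration phase: no annotations, just try the idea out.
def parse_point(text):
    x, y = text.split(",")
    return float(x), float(y)

# Hardening phase: the same logic, augmented with annotations that a
# checker such as mypy can verify. Annotations cost nothing at runtime.
def parse_point_typed(text: str) -> tuple[float, float]:
    x, y = text.split(",")
    return float(x), float(y)

print(parse_point("1.5,2.5"))        # (1.5, 2.5)
print(parse_point_typed("1.5,2.5"))  # same result, now statically checkable
```

This is gradual typing in miniature: the dynamic exploration phase and the type-analyzed hardening phase coexist in one language.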

The popularity of Ruby and Python is arguably a response to the frustration with statically typed languages like C++ and Java. Ironically, I was defending static typing back in the mid-90s. Shut up. And the current fascination with typed languages is arguably just the pendulum swinging back in the other direction. Eventually, developers will tire of their tools nagging at them when they're just trying to understand the problem they're trying to solve. And then the pendulum will swing back towards dynamic languages.

Only this time, I think we will see the emergence of dynamic languages with type analysis and annotation capabilities. And the laboratory for these powerful new languages will come from perhaps an unexpected place…the browser.

A Bit Of A Stretch

WebAssembly is making it possible to use the browser as a compile target. Languages like CoffeeScript and TypeScript already do this by compiling into JavaScript, but WebAssembly takes the premise to its logical conclusion. Compiling into JavaScript means language designers are working with one hand tied behind their back, whereas WebAssembly is intended to grow into a platform that can support arbitrary languages.

[Some features are] a bit of a stretch for JavaScript, and having the ability down the road to let [WebAssembly] do the exotic things that C++ wants and not needing to figure out ways to put those less-safe facilities into JavaScript is a great relief.

The first round of languages that compile into WebAssembly will be static languages, like C++, because that's what WebAssembly will support. But the roadmap for WebAssembly includes the necessary features for languages like Ruby or Haskell and languages we have not yet dreamt of.

As Shakespeare's Hamlet explains:

There are more things in heaven and earth, Horatio,
Than are dreamt of in your philosophy.