
Did you ever wonder why there are such strange operators like !!, === and !== in JavaScript?

Developers coming from other languages, such as Java, often have a hard time accepting that they have to write a triple equals sign instead of a double one. Actually, the first language I learned was BASIC, followed by Pascal, so I still consider even the double == weird. But that's just one of the peculiarities you have to accept if you're learning a language in the C tradition.

But of course, all these things have a reason. Hitting SHIFT+0 three times, hundreds of times a day, clearly gets on my nerves. It feels like stuttering. Ironically, the operator had to be introduced because of an earlier attempt to save a few keystrokes.

First inflation

Let's start with the first inflation of equality. As I mentioned before, many languages like BASIC use the same syntax as school math to compare things. A single = does the trick. Doing the same in Java sometimes yields an error:

if (firstName = "Jane") { System.out.println("Hi, Jane!"); }

Thing is, in C-style languages a single = is always an assignment. So we didn't compare the first name to a certain value; we assigned it that value. (In Java, the snippet above doesn't even compile, because the result of the assignment is a String, not a boolean.) Things get nasty when the value is a boolean. The C tradition also means that the result of an assignment is the assigned value. So the expression following the if keyword is valid; it's just not what the programmer intended to do.
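The same trap exists in JavaScript. Here's a minimal sketch (the variable name is my own invention) showing how a mistyped comparison silently becomes an assignment:

```javascript
let loggedIn = false;

// Intended: compare loggedIn to true.
// Actual: assigns true to loggedIn. Since the assignment
// evaluates to the assigned value, the branch always runs.
if (loggedIn = true) {
  console.log("Welcome back!"); // always printed
}

console.log(loggedIn); // the variable has been overwritten
```

This is exactly why many linters warn about assignments inside conditions.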

Again, this trap was introduced to the language family with good intentions. It allows you to chain assignments like so:

int b, c, d, e;
int a = b = c = d = e = 0;

Second inflation

When I learned JavaScript, everybody used the double == notation. It took me quite some time to wrap my head around the new way. Nowadays, almost every JavaScript project has a build chain employing a linter that insists on the strict equality operator ===.

In the early days, JavaScript was meant to be a simple language supporting the UI experience of the user. So it made sense to consider every variable a user input.

That's why there's no clear distinction between strings and numerical values. From the user's perspective, everything is a string of keystrokes. The user can't distinguish between numerical input and character input. Programmers make the distinction by enclosing strings in quotes, but no programmer expects their users to enclose their input in quotes. So we programmers just have to guess which data type has been entered by the user.

And that's where JavaScript excels. You can compare numbers to strings. If they look equal, they are considered equal. It's a bit like duck typing. If it walks like a duck and it quacks like a duck, we call it a duck.
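A few examples show how the loose == operator coerces types before comparing:

```javascript
// With ==, JavaScript coerces the operands before comparing.
console.log(1 == "1");    // true:  the string "1" is coerced to the number 1
console.log(0 == "");     // true:  both operands coerce to 0
console.log(42 == "42 "); // true:  surrounding whitespace is ignored
console.log(1 == "one");  // false: "one" coerces to NaN
```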

That's also the reason why JavaScript deals with truth in such a relaxed way. Just imagine how many times front-end programmers have to check whether the user has entered a value in a form field. In Java, we're used to doing this:

if (firstName != null && !firstName.trim().equals("")) { // deal with the input }

JavaScript also introduced a third option: besides null and the empty string, the value can be undefined. So the if statement gets clumsy. That's why the language allows for a short-hand version:

if (firstName) { // deal with the input }
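The shorthand works because JavaScript treats a handful of values as falsy and everything else as truthy. A quick sketch:

```javascript
// All of these values are falsy, so the branch is skipped:
for (const value of ["", null, undefined, 0, NaN, false]) {
  if (value) {
    console.log("never reached");
  }
}

// Any non-empty string is truthy:
const firstName = "Jane";
if (firstName) {
  console.log("Hi, " + firstName + "!");
}
```

Note one difference from the Java version above: a whitespace-only string like " " is truthy, so the shorthand does not replicate the trim() check.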

Server-side programming and large applications

Unfortunately, the JavaScript programming language is very flexible. So the language designers came up with a long list of cases that should be considered equal. Every single one of them is logical. But, as so often, the whole is more than the sum of its parts. In our case, the list is hard to memorize. So programmers started to make mistakes because of unexpected type conversions. Just think of the famous equality tables, such as this one:

Source: https://dorey.github.io/JavaScript-Equality-Table/unified/, published under a Creative Commons Attribution Share Alike 4.0 license.

Strict equality to the rescue. The triple === compares the operands without any type coercion. That's a great relief for larger programs and server-side programming.
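Compare the earlier == examples with their strict counterparts. With ===, operands of different types are simply never equal:

```javascript
// === never coerces: different types mean not equal.
console.log(1 === "1");         // false: number vs. string
console.log(0 === "");          // false
console.log(1 === 1);           // true
console.log("Jane" === "Jane"); // true
```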

Don't follow the linter's advice blindly, though. Even in 2018, there are good reasons to use the double == operator. You just have to be aware that it's a major source of errors.

Bang, bang!

Let's return to the title of this article. The negation character ! is often called "bang". In JavaScript, it's a common best practice to use the double negation. For example, like so:

if (!!firstName) { // deal with the input }

The rationale behind this is that the "bang bang" operator always returns a boolean. The simple check if (firstName) works, too, because the string firstName is either truthy or falsy. The first negation coerces the value to a boolean and negates it; the second negation flips it back. The result carries the same truth value as the original expression, but this time it's neither merely truthy nor merely falsy: it's a real boolean, either true or false. In other words, you can use it in complex expressions without nasty surprises.
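A few examples make the coercion visible, including the type difference that typeof reveals:

```javascript
// !! coerces any value to a genuine boolean:
// the first ! coerces and negates, the second ! negates back.
console.log(!!"Jane");    // true
console.log(!!"");        // false
console.log(!!undefined); // false
console.log(!!0);         // false

// typeof shows the difference from the raw value:
console.log(typeof "Jane");   // "string"
console.log(typeof !!"Jane"); // "boolean"
```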

As a rule of thumb, it's a good idea to always use the "bang bang" operator unless you really know what you're doing. It's one of those tricks covering a lot of special cases.

Wrapping it up

Isn't it strange how sensible decisions like "let's make the UI developer's life simple!" turn out to be a major source of confusion a couple of years later? Luckily, JavaScript developers can usually trust their gut feeling. If things feel right, more often than not they are right. Unless you're trapped by one of the countless pitfalls of the language. So "bang bang" and the strict equality operator are your friends.
