ALGOL (1960) used semicolons as separators. C made them terminators.
Every language since has had to pick a side, and "optional" always turns out to be the hardest choice.
JavaScript's ASI is what happens when a language designed in 10 days has to live with a quick pragmatic decision for 30 years.
Unfortunately, those who attempt to design programming languages are too often ignorant of the history of programming languages. Unaware that earlier languages already implemented better solutions to whatever problems they encounter, they produce new languages that end up inferior in various respects to old ones, even when they also incorporate some innovative and desirable features.
The problem discussed in the parent article was already solved correctly by 1965, 60 years ago, in the language CPL, the ancestor of BCPL, which is the ancestor of C.
It is sad that decades later, most programming languages remain worse from this point of view.
In CPL, a command or definition is terminated by a semicolon or by an end of line unless the end of line occurs at a point in the program where the context indicates that a command or definition cannot terminate at that point.
This means that semicolons are needed only when you want to write multiple statements on a single line, and no special character is needed to mark a statement that continues across multiple lines.
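A sketch of how that rule reads in practice (hypothetical CPL-flavoured pseudocode, not verbatim CPL syntax):

```
x := 1; y := 2        // semicolon only to put two statements on one line
z := x +              // line ends mid-expression, where a statement
     y                // cannot terminate, so it continues unmarked
```

The second statement needs no continuation character: the trailing `+` is a point where the statement cannot end, so the newline does not terminate it.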
Before CPL, some languages terminated statements with semicolons, while others terminated them with newline characters. CPL was the first to allow either method of termination, and the first where a newline may or may not terminate a statement, depending on the context.
Thus CPL combined the advantage of ALGOL-like languages for writing statements on multiple lines with the advantage of FORTRAN-like languages for not requiring semicolons in most cases.
That does require designing the language grammar in a certain way: in the productions used by statements, no right-hand side may contain an optional element that can also start a statement (otherwise it is ambiguous whether to terminate the statement or continue onto the next line). Most languages violate that.
"where the context indicates that a command or definition cannot terminate at that point" is trivially unsuitable for C-like languages: `if`-`else` already violates it, since by that definition an `if` can terminate before the `else` (imagine splitting the `else` onto a new line).
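To see the violation concretely, here is ordinary JavaScript where the newline before `else` must not terminate the statement, even though the `if` could legally end there:

```javascript
// Valid JavaScript: the parser continues past the newline before
// `else`. Under a naive "a newline terminates wherever a statement
// *can* end" rule, the `if` would already be complete, and the
// dangling `else` would be a syntax error.
function describe(cond) {
  if (cond)
    return "yes"
  else
    return "no"
}

console.log(describe(true))   // "yes"
console.log(describe(false))  // "no"
```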
An actually workable definition would be "unless the end of line occurs at a point where the current expression cannot be terminated, or where the next tokens are not parseable as a separate statement", which is almost the definition of semicolon insertion in JavaScript...
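That JavaScript rule also fires where programmers don't expect it. The classic case is `return` followed by a newline:

```javascript
// ASI inserts a semicolon after `return`, because `return` alone is
// a complete statement and the next line parses as a separate one.
function broken() {
  return            // becomes `return;`
  { value: 1 }      // a dead block statement with label `value`
}

function fixed() {
  return {          // brace on the same line: the statement cannot
    value: 1        // terminate here, so no semicolon is inserted
  }
}

console.log(broken())        // undefined
console.log(fixed().value)   // 1
```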
But JavaScript has exactly that ambiguity problem: both braces (via object literals) and parentheses (via expressions-as-statements) can start a statement, and they are ambiguous with, e.g., blocks and function calls, respectively.
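The ambiguity shows up whenever a line starts with `(` or `[`: ASI does not fire, because the tokens still parse as a continuation of the previous expression.

```javascript
// A line starting with `[` continues the previous expression:
const s = "hello"
const n = s
["length"]          // parsed as s["length"], not a new statement
console.log(n)      // 5

// A line starting with `(` becomes a call of the previous expression:
const id = x => x
const v = id
(10)                // parsed as id(10), not a parenthesized statement
console.log(v)      // 10
```

This is why style guides for semicolon-free JavaScript tell you to prefix such lines with a defensive semicolon.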
Indeed, the author of the article even misses xBase and BASIC, two other famous languages without semicolons.
Honestly, I don't mind semicolons; in fact I even type them in Python. (Yes, Python has semicolons, they are just optional.) They make parsing a little simpler for both human and automated parsers, and they make statement boundaries explicit. There is no real downside: you just type them automatically when you end a statement, the way you hit Ctrl-S out of habit after you stop typing. I find the obsession with semicolon-free languages weird.