There is one very BIG thing that Cobol pioneered: the requirement that not only the programs, but also the data, must be portable across machines. At a time when machines used different character codes, let alone different numeric formats, Cobol was designed to vastly reduce (though it did not completely eliminate) portability woes.
We take this for granted now, but at the time it was revolutionary. In part we've gotten there by mandating things like Unicode and IEEE 754, and nowadays most of our languages also encourage portability. We think very little of moving an application from Windows on x86_64 to Linux on ARMv8 (apart from the GUI mess), but back when Cobol was being created, you normally threw your programs away (“reprogramming”) when you went to a new machine.
I haven't used Cobol in anger in 50 years (40 years since I even taught it), but for that emphasis on portability, I am very grateful.
Another fascinating aspect of COBOL is that it's the one programming language that actively rejected ALGOL influence. There are no functions or procedures, no concept of local variables. In general, no feature for abstraction at all, other than labelled statements. Block-structured control flow (conditionals and loops) was only added in the late 80s.
Contemporary COBOL (the most recent ISO standard is from 2023) has all those things and more.
Rather than rejecting such features, COBOL was just slower to adopt them, because of conservatism, inertia, and its use in legacy systems. But there are 20+ year-old COBOL compilers that support full OO (classes, methods, inheritance, etc.).
Now, instead, you throw everything away when moving to a new language ecosystem. I would love to see parts of languages become aligned the way CPUs did, so that some constructs become portable and compatible between languages.
Great point. Some newer languages do keep compatibility, though, with the Java (Scala, Groovy, Kotlin, Clojure) and .NET (C#, F#, Visual Basic, PowerShell) “platforms” being examples, but also with systems languages that have simple (no bindings required) ABI compatibility with C, like D, Zig and, I think, Nim.
The newest attempt seems to revolve around WASM, which should make interoperability across many languages possible if they finally get the Component Model ready.
Today, with several of those languages, we really don't care from a code point of view if the deployment target is Linux or Windows, we know it will work the same. That's an achievement.
And many of them can target WASM now too.
the other big cobol feature is high precision (i.e. many digest) fixed point arithmetic. not loosing pennies on large sums, and additionally with well defined arithmetics, portably so as you point out, is a killer feature in finance.
you need special custom numerical types to come even close in, say, java or C++ or any other language.
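For a concrete sense of the difference, here is a minimal Python sketch (the values are just made up for illustration), using only the standard decimal module: binary floating point loses pennies on sums that decimal arithmetic with explicit precision handles exactly, which is the behaviour COBOL-style fixed-point code relies on.

    from decimal import Decimal, getcontext

    # Binary floating point cannot represent 0.10 exactly, so sums drift:
    print(sum([0.10] * 3))             # 0.30000000000000004

    # Decimal arithmetic with explicit precision behaves the way
    # COBOL-style decimal/fixed-point code expects:
    getcontext().prec = 28             # digits of precision (adjustable)
    print(sum([Decimal("0.10")] * 3))  # 0.30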
>the other big cobol feature is high precision (i.e. many digest) fixed point arithmetic. not loosing pennies on large sums, and additionally with well defined arithmetics, portably so as you point out, is a killer feature in finance.
I guess you mean:
>digest -> digits
>loosing -> losing
Is that the same as BCD? Binary Coded Decimal. IIRC, Turbo Pascal had that as an option, or maybe I am thinking of something else, sorry, it's many years ago.
There are some regulations in bond pricing or international banking or stuff like that that require over 25 decimal places. IIRC, the best COBOL or whatever could do on the IBM 360s was 15 digits. The smaller, cheaper, and older 1401 business machines didn't have any limit. Of course, for nerdy financial applications, compound interest and discounting of future money would require exponentiation, which was damn-near tragic on all those old machines. So was trying to add or subtract two numbers that used the maximum number of digits but had different numbers of decimal places, or trying to multiply or divide numbers that each used the maximum number of decimal places with the decimal point in various positions. And it was suicide-adjacent to try to evaluate any expression that included multiple max-precision numbers in which both multiplication and division each happened at least twice.
> There are some regulations in bond pricing or international banking or stuff like that that require over 25 decimal places.
Sounds interesting. Is there anywhere you know I can read about it, or is there something specific I can search for? All results I'm getting are unrelated.
Binary Coded Decimal is something else.
1100 in “regular” binary is 12 in decimal.
0001 0010 in BCD is 12 in decimal.
i.e., BCD is an encoding.
High precision numbers are more akin to the decimal data type in SQL or maybe bignum in some popular languages. It is different from (say) float in that you are not losing information in the least significant digits.
You could represent high precision numbers in BCD or regular binary… or little endian binary… or trinary, I suppose.
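A tiny Python sketch of that encoding point (the to_bcd helper is made up for illustration): 12 in plain binary is 0b1100, while packed BCD stores each decimal digit in its own 4-bit nibble, giving 0001 0010.

    def to_bcd(n):
        """Pack a non-negative integer as BCD: one decimal digit per 4-bit nibble."""
        result, shift = 0, 0
        while True:
            result |= (n % 10) << shift   # put the low decimal digit in the next nibble
            n //= 10
            shift += 4
            if n == 0:
                return result

    print(bin(12))          # 0b1100   -- "regular" binary
    print(bin(to_bcd(12)))  # 0b10010  -- BCD nibbles 0001 0010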
Is Python indentation at some level traced back to Cobol?
I would guess not. Indentation in Python serves a very different purpose from the mandatory indentation found in early COBOL/Fortran.
I am not really an expert but here is my best shot at explaining it based on a 5 minute web search.
COBOL and Fortran were designed to run off punch cards; specific columns of the card were reserved for specific purposes, things like the sequence number, whether the line is a comment, and continuation lines.
https://en.wikipedia.org/wiki/COBOL#Code_format
https://web.ics.purdue.edu/~cs154/lectures/lecture024.htm
In Python the indentation is a sort of enforced code style guide (a bit like Go's refusal to compile with unused imports): by making the indentation you would normally write anyway part of the block syntax, all Python programs have to have high-quality indentation, whether the author wants to or not.
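A minimal illustration of that: in Python the indentation is the block syntax, so the body of a loop or conditional is simply whatever is indented under it.

    # No braces or END markers; the indented lines *are* the bodies.
    for n in range(3):
        if n % 2 == 0:
            print(n, "is even")
        else:
            print(n, "is odd")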
I don't know much about COBOL, but I did code quite a bit in Fortran. In Fortran, the first five columns were for an optional line number, so these would mostly be blank. The sixth column was a flag indicating that the line continued the one before; this allowed multiline statements, since by default a carriage return was a statement terminator. Like you said, all this came from punch cards.
Columns seven through 72 were for your code.
Interestingly, punch cards and early terminals in the 80-132 column range reached the limits of readable line length, and early programming languages were obviously pushed to the limits of human comprehension, making the shape of text in old and new programming languages strikingly consistent (e.g. up to 4 or 5 levels of 4-character indentation is normal).
I've heard it was from Haskell?
Okay, I'll bite. ML did not mostly die, it morphed into two main dialects, SML and OCaml. OCaml is still going strong, and it's debatable whether SML is mostly dead.
My main beef, however, is that the last sentence in the section seems to suggest that the birth of Haskell killed SML on the vine because suddenly everybody only wanted pure, lazy FP. That's just wrong. The reality is that these two branches of Functional Programming (strict/impure and lazy/pure) have continued to evolve together to the present day.
Yeah, I saw that and was tempted to say the same thing. Ocaml is alive and well, and SML still is in active use. Ocaml has a relatively broad application space, whereas SML is more or less limited to the world of theorem provers and related tools (e.g., PolyML is used to build things like HOL4, and CakeML is a pretty active project in the verified compilers space that targets SML and is built atop HOL4). SML is likely regarded as irrelevant to industrial programmers (especially those in the communities that frequent HN), but it's still alive. F# is still alive and kicking too, and that's more or less an Ocaml variant.
I'd argue Rust is modern ML in many ways, it just uses a c-like syntax. It's really the non-pure Haskell alternative.
Isn't F# ML-influenced?
F# used to be OCaml.NET AFAIK
Not quite. They explicitly drew a lot of inspiration from OCaml, but they never intended it to be an OCaml compiler for .NET.
F# was – from the start – a functional language designed specifically for the .NET Framework Common Language Runtime. Whenever OCaml and CLR diverged in how they did things, they went the CLR route.
(See e.g. https://entropicthoughts.com/dotnet-on-non-windows-platforms... for more information, or the Don Syme history of F#.)
I have a colleague who is very good with C# and F#. We use a lot of C# for work, and for him F# is just a hobby (he has a math background). Because Rust is basically an ML dressed up to look like a semicolon language, and I know I grokked Rust almost immediately with an academic background in SML/NJ and decades of experience writing C, my guess is that this colleague would pick up Rust very easily, but I can't confirm that.
Every Xmas when people are picking up new languages for Advent of Code (which, like me, he does most years) I wonder if he'll pick Rust and go "Oh, that's nice" - I was going to write a relatively contemporary spoiler here but instead let's say - it's like watching an Outer Wilds player staring at the starting night sky for their fifth or hundredth time, wondering if they're about to say "Oh! Why is that different each time?". No. Maybe next time?
So he didn't immediately take to Rust? What was his feedback? I like F# and have also wanted to dive into Rust.
He's never tried it.
Sorry it may not have been clear, I was comparing the experience of knowing he might love Rust (or not) but not knowing if he'll decide to learn it - against the experience of watching unspoiled people playing a discovery game such as Outer Wilds where you know what they don't know yet and so you're excited to watch them discover it. I dunno that's maybe not an experience most people have.
If you either enjoy learning new languages or have a purpose for which Rust might be applicable I encourage you to try it. As you're an F# user it won't be as revelatory as it is for someone with say only C as a background, but on the other hand if you have no bare metal experience it might also be a revelation how fast you can go without giving up many of the nice things you're used to.
If you're a polyglot you probably won't encounter much that's new to you because Rust never set out to be anything unprecedented, I've heard it described as an "industrialization of known best practice" and it's ten years since Rust 1.0 drew a line in the sand.
I've always wondered what it would be like to "live" in ALGOL 68 for a while, because of its ambition and idiosyncrasy. With a modern perspective, what would I be surprised to actually have, and what would I be surprised to miss?
(Apart from the wild terminology. File modes are called "moods", and coincidentally, ALGOL 68's three standard "channels" (i.e. files) are "stand in", "stand out", and "stand back" -- I kid you not, close enough to 'stdin', 'stdout'.)
Ah, APL. I remember a network synthesis EE course in the 70s where we had an assignment to implement matrix exponentiation. I had just finished a matrix algebra class and learned APL in the process, so my implementation was 2 lines of APL. Those were the days...
How is it to read and write? It always looked to me like something that made Perl look verbose and clear.
> How is it to read and write?
Fairly straight-forward once you've learnt the character set.
See here for details: https://aplwiki.com/wiki/Typing_glyphs
You should come back. APL has considerably modernized in the intervening 50 years, and with data parallelism gaining ever more market share, the timing is perhaps ripe...
>An accurate analysis of the fall of Pascal would be longer than the rest of this essay.
I put the blame solely on the management of Borland. They had the world-leading language, and went off chasing C++ and "Enterprise" instead of just riding the wave.
When Anders gave the world C#, I knew it was game over for Pascal, and also Windows native code. We'd all have to get used to waiting for compiles again.
Agreed. And it was partly how they ignored or snubbed two+ generations of rising developers, hobbyists, etc.
No kid or hobbyist or person just learning was spending $1400+ on a compiler. Especially as the number of open-source languages and tools was increasing rapidly by the day, and Community Editions of professional tools were being released.
Sure, they were going for the Enterprise market money, but people there buy based on what they're familiar with and on what they can easily hire lots of people to work with.
Last I looked, they do have a community edition of Delphi now, but that was slamming the barn door long after the horses had all run far away and the barn had mostly collapsed.
My recollection, contrary to TFA, is that development of PL/I or PL1 (both names were used) started in 1964 or a bit earlier with the intention of coming up with a replacement for Fortran. I think some referred to it as Fortran V or Fortran VI. IBM introduced PL1 pretty quickly after it rolled out the 360 mainframes, maybe because it might help sell more megabuck hardware, but there were two different compilers that compiled two different versions of the language (a fork that remained a wretched stumbling block for at least 20 or 25 years): one for the big mainframes (Model 50 or bigger), one for the economical mid-line mainframes, and none for the Model 20. In that period, Gerald Weinberg published some good stuff about PL1, including a strong warning against concurrent processing. There was also a PL1 on Multics that tried to be a little more suited to academia than the IBM versions. In the mid-1980s there was a PL1 subset that ran on PCs. It couldn't do anything that Turbo Pascal couldn't do, but it did it much slower.
The point of PL/I was to unify the userbases of both FORTRAN and COBOL, thus tending to "replace" both languages. There was some influence from PL/I to the CPL language, which in turn led to BCPL, B and C.
> Of the four mother languages, ALGOL is the most "dead"; Everybody still knows about LISP, COBOL still powers tons of legacy systems, and most scientific packages still have some FORTRAN.
I've heard of enough Cobol and Fortran jobs existing, and Lisp continues to exist in some form or other, but Algol really does seem dead. I remember someone telling me about an Algol codebase that was decommissioned in 2005 and that seemed like a very late death for an Algol codebase.
> but Algol really does seem dead
Unisys still actively maintains their MCP mainframe operating system, which is written in an Algol superset (ESPOL/NEWP), and comes with a maintained Algol compiler - https://public.support.unisys.com/aseries/docs/ClearPath-MCP... - and they continue to add new features to Algol (even if minor)
So, no, Algol isn’t dead. Getting close but still not quite there. There are better candidates for dead languages than Algol, e.g. HAL/S (programming language used for the Space Shuttle flight software)
There are some legacy programs in the DoD developed in a variant of Algol 58 called JOVIAL that are still supported and edited, despite a mandate, now over 40 years old, to update to something newer (Ada). It was also used in some other places, like UK Air Traffic Control, at least until 2016 (that was the last date I've seen for a replacement plan).
Algol is in the funny position of being both very dead, yet not dead at all given its legacy. I suspect that's sort of inevitable: if all of a language's spirit is cannibalised by newer languages, it will be hard to argue for using the old language and it dies.
(In contrast, Lisp retains some unique ideas that have not been adopted by other languages, so it survives by a slim margin.)
There are several well maintained Common Lisp compilers, some of which are paid for and can sustain whole businesses, Clojure is fairly big as niche languages go, and Guile Scheme is used across GNU projects. There’s also some usage of Racket and Scheme in education, and I believe Janet is having success in gaming. So I wouldn’t say Lisp survives by just a slim margin.
Correct. That was bad wording on my part. I meant it in the sense of "it survives at a scale proportional to the fraction of it that is unique".
It used to be that things like GC, REPL, flexible syntax, the cond form etc. made Lisp unique, but these days the list is down mainly to homoiconicity (and an amazing hacker culture).
the image-based REPL, including the interactive debugger, is still unique (matched maybe by Smalltalk), CLOS and the condition system too… pair them with stability, fast implementations, instantaneous compile-time errors and warnings a keystroke away, a Haskell-like type system on top of the language (Coalton), macros… and the whole is still pretty unique.
In the finance world COBOL is still very much not dead. It's not as dominant as it once was, but core mainframe code is very often still COBOL.
Replacing that is a very hard problem, as thousands and thousands of (abused and overloaded) integrations are layered in several strata around it, relying each night on exact behaviour in the core, warts and all.
As for Smalltalk, I know at least one company around here still running some legacy, but still maintained afaik, code on it ( https://mediagenix.tv ).
I used to work in a bank IT department 13 years ago, and all their systems still ran on Java 1.4.2 and COBOL procedures. At the time, IBM was basically forcing them to migrate to Java 1.6 by the end of the year because the new LTS version didn't support that Java version anymore.
"Significance: In terms of syntax and semantics we don’t see much of COBOL in modern computing."
Would I be wrong in saying that SQL has what feels to me like a very COBOL-y syntax? By which I mean: I know it is not directly related to COBOL, but someone definitely looked at COBOL's clunky attempt at natural language and said "that, I want that for my query language".
I agree completely. They were from the same era (COBOL is a few years older), and they do have that dweeby, earnest, natural language influence.
Algol-68 gets a bum rap here - it brought us 'struct', 'union' (and tagged unions), a universal type system, operator declarations, a standard library, and a bunch more - Wirth worked on the original design committee, and Pascal reads like someone implementing the easy parts of Algol-68.
The things it got wrong were mostly its rigorous mathematical definition (syntax and semantics) that was almost unreadable by humans ... and the use of two character sets (this was in the days of cards) rather than reserved words.
Structs were first introduced in an Algol 58 variant called JOVIAL, AFAIK.
When I saw the Smalltalk paragraph I could not help thinking Rust will be to Haskell what Java has been to Smalltalk.
Serious question: Is Ada dead? I actually had to google Ada, and then "Ada language" to find out. It's not dead, and it has a niche.
When I was in grad school in the late 70s, there was a major competition to design a DoD-mandated language, to be used in all DoD projects. Safety and efficiency were major concerns, and the sponsors wanted to avoid the proliferation of languages that existed at the time.
Four (I think) languages were defined by different teams, DoD evaluated them, and a winner was chosen. It was a big thing in the PL community for a while. And then it wasn't. My impression was that it lost to C. Ada provided much better safety (memory overruns were probably impossible or close to it). It would be interesting to read a history of why Ada never took off the way that C did.
I don't think the Ada story is particularly interesting:
(1) It was very expensive to licence at a time where C was virtually free.
(2) It was a complicated language at a time where C was (superficially) simple. This made it harder to port to other platforms, harder to learn initially, etc.
(3) All major operating systems for the PC and Mac happened to be written in C.
Ada had virtually nothing going for it except being an amazingly well-designed language. But excellence is not sufficient for adoption, as we have seen repeatedly throughout history.
Today? Virtually nothing stops you from using Ada. For lower level code, it's hands-down my favourite. Picking up Ada taught me a lot about programming, despite my experience with many other languages. There's something about its design that just clarifies concepts.
(3) wasn't relevant (or even really true) in 1977-1983 when Ada was being standardized.
MS-DOS was mostly x86 assembly, Classic MacOS was a mix of 68k assembly and Pascal, CP/M was written in PL/M, UCSD P-System was Pascal, and this leaves out all of the OS Options for the Apple II - none of which were written in C. I'm hard pressed to identify a PC OS from that time period that was written in C, other than something Unix derived (and even sometimes the unix derived things were not C, Domain/OS for example was in Pascal).
If we leave the PC space, it gets even less true - TOPS10/20 NotC, RSX-11 NotC, VMS also NotC - and I can keep going from there - the only OS from that period that I can point at that was written in C is UNIX.
I'd actually argue that C/C++ were not enshrined as the de facto systems programming languages until the early '90s - by which time Ada had lost for reasons (1) and (2) that you noted.
> All major operating systems for the PC and Mac happened to be written in C.
They were written in “not Ada”; the original OS for the Mac was written in assembly and Pascal.
> Virtually nothing stops you from using Ada
What would you recommend for getting started with it? Looks like there's GNAT and then also GNAT Pro and then the whole SPARK subset, which one would be best for learning and toying around?
GNAT. Upgrade to gprbuild when you start to find gnatmake limiting.
SPARK is best considered a separate language. It gives up some of the things that make Ada great in exchange for other guarantees that I'm sure are useful in extreme cases, but not for playing around.
No. Just google for NVIDIA and Adacore to see how Ada is quite alive in NVIDIA land. Ada is quite a nice language that more or less anticipated a lot of the current trends in languages that the safe languages like Rust and friends are following. Spark is quite a cool piece of work too. I think the perception of old-ness is the biggest obstacle for Ada.
Ada isn't dead and it's superior to Rust in many ways, but it is less trendy. adacore.com is the main compiler developer (they do GNAT). adahome.com is an older site with a lot of links.
How does Ada solve dynamic allocation?
Assuming "solve" is meant loosely: much like in C++, with RAII-style resource management. The Ada standard library has what are called _controlled types_ which come with three methods: Initialize, Adjust, and Finalize. The Finalize method, for example is automatically called when a value of a controlled type goes out of scope. It can do things like deallocate dynamically allocated memory.
That said, Ada also has features that make C-style dynamic allocation less common. Ada does not have pointers but access types, and these are scoped like anything else. That means references cannot leak, and it is safer to allocate things statically, or in memory pools.
It kind of doesn't at the moment. That's an area where Rust is ahead. They are working on a borrow-checker-like thing for Ada. But the archetypal Ada program allocates at initialization and not after that. That way it can't die from malloc failing, once it is past initialization.
How can COBOL be a "dead" or "mostly dead" language if it still handles over 70% of global business transactions (with ~800 billion lines of code and still growing)? See e.g. https://markets.businessinsider.com/news/stocks/cobol-market....
BASIC is the scripting language used by Microsoft Office. Saying that it powers millions of businesses is probably not an exaggeration.
Pascal, particularly the Delphi/Object Pascal flavor, is also still in widespread use today.
Also Smalltalk is still in wide use; ML is also used; there are even many PL/I applications in use today and IBM continues to give support.
I don't know, I heard somewhere that even the C language is in wide use, still ... ;)
No one's starting new projects in COBOL.
One of the most significant new COBOL projects in 2025 was the integration of a new COBOL front-end into the GNU Compiler Collection. There are indeed quite a few new projects being started in COBOL, though they primarily focus on modernization and integration with contemporary technologies rather than traditional greenfield development. Also, let's not forget that some cloud providers now offer "COBOL as a service" (see e.g. https://docs.aws.amazon.com/m2/latest/userguide/what-is-m2.h...).
By "new COBOL projects" I mean green-field development of entirely new projects written in that language - not the continued development of existing COBOL codebases, or development of tools which interact with COBOL code.
As an aside, the article you linked to is pretty obvious AI slop, even aside from the image ("blockchin infarsucture" and all). Some of the details, like claims that MIT is offering COBOL programming classes or that banks are using COBOL to automatically process blockchain loan agreements, appear to be entirely fabricated.
> There are indeed quite many new projects being started in COBOL
No.
You have to put this relative to projects started in other languages, at which point the number of new projects started in COBOL is even less than a rounding error; it probably wouldn't result in anything other than 0 as a float.
The claim was "No one's starting new projects in COBOL."
And everyone of good faith understood what the claim actually was.
And everyone with relevant fintech project experience knows that new projects on the existing core banking systems are started all the time and that COBOL continues to be a relevant language (whether we like it or not).
Maybe their definition uses recent popularity or how many new projects are started with it. Under that definition, I think it's pretty safe to call it "dead".
If you redefine language, anything is possible.
Yes. "Dead" normally means "to be devoid of life," but it's often extended to metaphorically cover things like computer languages.
edit: for ancient Greek to become a dead language, will we be required to burn all of the books that were written in it, or can we just settle for not writing any new ones?
For ancient Greek, all you need is no one speaking it (using it in real life).
Same with a programming language - if no one is writing code in it, it's dead.
Modula-3 should be on that list as well. Unfortunately pretty dead (compiler support is rather abysmal), though pretty influential. Wikipedia lists a couple of languages that it influenced; I think it should also include Go (though Go is allegedly influenced by Modula-2, according to its Wikipedia article).
What other languages have been influenced by Go?
I think they meant Go should be in the list of languages influenced by … but one language influenced by Go I believe is Odin.
The (literal) first and foremost ASCII descendant of APL was MATLAB.
I feel that the article should have made this a lot more clear - as so many people code along the APL -> Matlab / R (via S) -> NumPy family tree.
R/S is also heavily influenced by Lisp. Haven’t written it in 10 years, but AFAIR it even has proper macros where argument expressions are passed without evaluation.
Technically those aren’t macros; they’re what the Lisp world calls ‘fexprs’.
I was almost sure that Prolog would be on the list, but apparently not.
Because it's dead or because it's influential?
COBOL - “mostly dead” but still somehow the backbone of the global financial system
Wow, that was a trip down memory lane! I have used six of those languages: BASIC, APL, COBOL, Pascal, Algol-W (a derivative of Algol 60), PL/1. Mostly in school. My first dollars earned in software (brief consulting gig in grad school) had me debug a PL/1 program for a bank.
For some reason I remember an odd feature of PL/1: Areas and offsets. If I am remembering correctly, you could allocate structures in an area and reference them by offset within that area. That stuck in my mind for some reason, but I never found a reason to use it. It struck me as a neat way to persist pointer-based data structures. And I don't remember seeing the idea in other languages.
Maybe the reason it stayed with me is that I worked on Object Design's ObjectStore. We had a much more elegant and powerful way of persisting pointer-based structures, but an area/offset idea could have given users some of the capabilities we provided right in the language.
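Not PL/I, but a rough Python sketch of the area/offset idea (all the names here are made up): nodes live inside one byte "area" and refer to each other by offsets rather than by object references, so the whole area can be written out and reloaded without fixing up any pointers.

    import struct

    NODE = struct.Struct("<ii")   # (value, offset of next node); -1 marks end of list

    def alloc_node(area, value, next_offset=-1):
        """Append a node inside the area and return its offset."""
        offset = len(area)
        area.extend(NODE.pack(value, next_offset))
        return offset

    def walk(area, offset):
        """Follow offsets within the area, yielding each node's value."""
        while offset != -1:
            value, offset = NODE.unpack_from(area, offset)
            yield value

    area = bytearray()
    head = alloc_node(area, 3)
    head = alloc_node(area, 2, head)
    head = alloc_node(area, 1, head)
    print(list(walk(area, head)))   # [1, 2, 3]
    # bytes(area) can be persisted and reloaded later; the offsets stay valid.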
Area-based allocation seems to be quite popular recently in game development (under the name "arena allocator"). Saw some talks referencing the concept in the "Better Software" (gamedev) conference on the weekend [1]
Ada supports storage pools as well. In the right circumstances, it's a safer way to deal with dynamic allocation.
I believe it also starts to creep into things like C#.
>> An area is a region in which space for based variables can be allocated. Areas can be cleared of their allocations in a single operation, thus allowing for wholesale freeing. Moreover, areas can be moved from one place to another by means of assignment to area variables, or through input-output operations.

>> Based variables are useful in creating linked data structures, and also have applications in record input-output. A based variable does not have any storage of its own; instead, the declaration acts as a template and describes a generation of storage.

http://www.iron-spring.com/abrahams.pdf p. 19, 74
Seeing Smalltalk on these lists and not Self always seems... lacking. Besides its direct influence on Smalltalk, and its impact on JIT research, its prototype-based object system led to JavaScript's object model as well.
Self was influenced by Smalltalk, not the other way around. Smalltalk was developed in the 1970s. Self in the 1980s.
Thanks for the correction.
Pascal isn't dead. At least here in Brazil it's widespread. I've heard in Russia and China too.
Freepascal [1] is up and running, targeting a lot of platforms: x86-16, x86-32, AMD64, RISC-V (32/64), ARM, AArch64, AVR, JVM, Javascript...
... and operating systems: Windows, Linux, MacOS, iOS, web and others.
I don't know an easier way to build multiplatform desktop applications other than Freepascal/Lazarus. I mean, real desktop apps, not that Electron bullsh*.
I still sometimes use BASIC and Pascal (both in DOS), although they are not the programming languages I mostly use.
Interesting read, and would have been good to see the author’s definition of ‘mostly dead’. Some are still used widely in niche areas like COBOL for banking. If a language itself isn’t receiving any updates nor are new packages being developed by users, is it mostly dead?
In any case, the author claims that each of these languages is "dead". There is a "Cause of Death" section for each language, which doesn't allow for another conclusion. By listing languages like ALGOL, APL, CLU, or Simula, the author implies that by "dead" he means "no longer in practical use, or just an academic/historic curiosity". The article contradicts itself by listing languages like COBOL, BASIC, PL/I, Smalltalk, Pascal, or ML, for which there is still significant practical use, even with investment in new features and continuation of the language and its applications. The article actually disqualifies itself by listing COBOL or Pascal as "mostly dead", because there is still a large market and significant investment in these languages (companies such as Micro Focus and Embarcadero make good money from them). It is misleading and unscientific to equate “no longer mainstream” with “no longer in use.” This makes the article seem arbitrary, poorly researched, and the author not credible.
In the Smalltalk section, it says that Python isn't 'true' OO like Smalltalk... who considers this to be the case? In Python, everything (including functions and classes) is an object.
I think this comes from the fact that Alan Kay does not think it is OO. There is no legal definition, but Python does not have Smalltalk-like 'method_missing' or 'responds_to' methods. If you think OOP means messages and late binding, those features are important.
I think this refers to encapsulation in Python, 'private' methods aren't really private, any user can poke around in there.
Old justification: https://mail.python.org/pipermail/tutor/2003-October/025932....
>Nothing is really private in python. No class or class instance can keep you away from all what's inside (this makes introspection possible and powerful). Python trusts you. It says "hey, if you want to go poking around in dark places, I'm gonna trust that you've got a good reason and you're not making trouble."
>After all, we're all consenting adults here.
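To make that concrete, a short Python sketch (the class and attribute names are made up): a single leading underscore is pure convention, and even double-underscore "private" attributes are only name-mangled, so introspection can still reach everything.

    class Account:
        def __init__(self):
            self._hint = "single underscore: convention only"
            self.__secret = "double underscore: name-mangled, not hidden"

    acct = Account()
    print(acct._hint)              # accessible; the underscore just signals intent
    print(acct._Account__secret)   # the "private" attribute via its mangled name
    print(vars(acct))              # introspection sees everything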
Kinda surprised to not see Forth listed.
Forth was neat, but it was a bit of an evolutionary dead end. I'm not aware of any significant concepts from Forth which were adopted by other, later programming languages.
RPL (Reverse Polish Lisp, a high level language for HP calculators) possibly drew on it a bit, though the main antecedents are RPN and Lisp, and possibly Poplog (a Poplog guru was at HP at the time, but I don't know if he contributed).
PostScript
Or Lisp. Lisp is definitely not dead, but was definitely very influential.
The article does touch on that:
"COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP."
Imho Lisp is deader than COBOL. Especially now that we've learned you can do the really hard and interesting bits of AI with high-performance number crunching in C++ and CUDA.
I wrote Lisp this morning to make Emacs do a thing. In other venues, people use Lisp to script AutoCAD.
Lisp isn't as widely used as, say, Python, but it's still something a lot of people touch every single day.
And Clojure
Where does Perl fit in this scheme of dying languages? I see fewer and fewer new packages written in Perl, and lots of unmaintained packages on CPAN. It seems obvious that the language is dying a slow death.
The 5 -> 6 transition took out a lot of its momentum, and it was gradually eclipsed by two other dynamically typed languages which were less alarming to read: Python and Javascript.
Dang I wanted it to keep going
Previously: https://news.ycombinator.com/item?id=22690229
(There are a few other threads with a smaller number of comments.)
Seeing more than one of the languages i have used on this list makes me feel very old.
+1 for basic, first used in gradeschool.
+1 for pascal (highschool)
+1 for lisp (compsci 101)
One day Perl will be on this list
I'm not convinced. Which languages have Perl influenced, really? All the things that other languages seem to draw from Perl could just as well have come from e.g. awk, Smalltalk, and other languages that came before Perl.
Most of the uniquely Perly things (topicalisation, autovivification, regexes in the language syntax, context sensitivity) haven't been picked up by other languages.
The only really Perly thing that was picked up elsewhere is CPAN, but that's not part of Perl-the-programming-language but Perl-the-community.
(Oh I guess PHP picked up sigils from Perl but misunderstood them and that was the end of that.)
Perl's big influence is Perl-Compatible Regular Expressions, to the point many younger people just assume "regular expression" means "pcre".
In terms of direct language impact, Ruby was essentially designed to make migrating from Perl idioms easy, even if Ruby code rarely shows that these days.
Regexes in syntax / regex literals made it into JS and Ruby (at least).
> Which languages have Perl influenced, really?
Raku is a direct descendant.
I came here to write the same thing - I think awk would fit the list.
Awk is sold on pattern matching, and there are earlier technologies that do pattern-matching - ML, SNOBOL.
But awk's historic significance is something else: it was the embryonic scripting language. You could use it in an imperative manner, and in 1977 that showed a new path to interacting with a unix system. It allowed you to combine arithmetic, string manipulation, and limited forms of structured data processing in a single process without using a compiler.
Two language schools grew from imperative awk: (1) the scripting languages that expose convenient access to the filesystem and OS syscalls, like perl/pike/python/ruby; (2) the tool control languages like tcl/lua/io.
It may also have influenced shell programming. Note that awk was released before the Bourne shell.
I would like to agree – I'm always surprised when I realise how old awk is. It feels like an incredibly modern language. It's also obvious that it inspired the likes of dtrace and bpftrace.
That said, I don't know how many other languages explicitly have cited awk as an inspiration, which was the criterion for this list.
You think awk would fit the list but then go on to show how useful it was and still is today.
I often read answers to questions all over the internet where awk is part of the solution. Mainly serious programmers using BSD and Linux.
My comment did not talk about where awk is useful today.
Unix gurus will recommend awk as a pattern matching and substitution tool.
But my comment was about awk the vanguard imperative scripting language. I don't know of anyone who recommends use of awk's imperative style over python in 2025.
As an exercise, I tried writing a simple roguelike in awk in an imperative style. Within twenty minutes, it felt obvious where perl came from.
If we assume peak Perl was in the 00s, say 2005, an impressionable teenager of ~15 learning by then probably will keep using it for the rest of their life, even in absence of uptake by new people. Assuming a lifespan of 85, I estimate this day won't arrive before the 2070s.
I think peak Perl was before then, but that's about when Perl fell off the map and started getting replaced by Python, or by PHP for CGI since PHP had some syntactic overlap.
This is when I started professionally and we were asked to replace "slow, old Perl scripts". As a new entrant, I didn't ask many questions, but I also didn't see any of the replacements as improvements in any way. I think the number of devs left to take over messy Perl projects was shrinking.
As you might imagine, this job involved a lot of text processing. People still point to that as the arrow in Perl's quiver, but it seems especially quaint today since any language I'd reach for would blow it out of the water in terms of flexibility and ease of use.
That's how I remember the timeline.
But I thought maybe the end of the 00s was when RoR started showing up.
In the mid-2000s I think I was learning PHP and the LAMP stack. Perl was already kind of old.
I started using it in the mid-90s, and used it extensively at work as long as I could, but by 2012 I gave up the fight. I still break it out once in a great while for a quick text transformation, but it’s so rusty in my memory that I rely on an LLM to remind me of the syntax.
Classic HN. A comment meant in half jest is dissected technically and literally and down voted. LOL Anyways, I'm glad the Perl crowd is alive and well!
As will Python and many others.
Surprised F# wasn't on the list.
F# was like the test bed for many features that got moved to C# once proven.