> Well, there are quite a lot of rumors and stigma surrounding COBOL. This intrigued me to find out more about this language, which is best done with some sort of project, in my opinion. You heard right - I had no prior COBOL experience going into this.
I hope they write an article about any insights they gained. Like them, I've heard these rumors and stigma, and I'd be intrigued to learn what someone new to COBOL encountered while implementing this rather complex first project.
One of the rumoured stigmas is that the object-oriented flavour of COBOL goes by the unwieldy name of ADD ONE TO COBOL YIELDING COBOL.
At least it doesn't have the unrumoured stigma of older FORTRANs, which ignored whitespace, allowing:
DO 10 I=1.10
to silently compile an assignment: DO10I = 1.10
instead of signalling an error for the syntax of the loop the flight software programmer had intended: DO 10 I=1,10
> One of the rumoured stigmas is that the object-oriented flavour of COBOL goes by the unwieldy name of ADD ONE TO COBOL YIELDING COBOL.
Which is a joke. Rather than an extension, the COBOL standard itself has incorporated OO support since COBOL 2002. The COBOL standards committee began work on the object-oriented features in the early 1990s, and by the mid-1990s some vendors (Micro Focus, Fujitsu, IBM) were already shipping OO support based on drafts of the COBOL 2002 standard. Unfortunately, one problem with all the COBOL standards since COBOL 85 (2002, 2014 and 2023) is that no vendor ever fully implements them. In part that is due to lack of market demand, and in part it is because NIST stopped funding its freely available test suite after COBOL 85, which removed a lot of the pressure on vendors to conform to the standard.
No one seems to have written a Minecraft server in FORTRAN yet... but I think your comment just gave some people here ideas.
Instead of FORTRAN, someone should try writing a Minecraft server in something like ALGOL or FORTH.
Algol 68 actually isn't too bad of a language to work with, and there's a modern interpreter easily available. Unfortunately it lacks any support for reading and manipulating binary data, so I think a Minecraft server would be nearly impossible.
Or APL.
If you have to go to 1977 or prior to slag a language, there are tons of languages that will disappoint you.
PL/I
I would assess that C++ has already outpaced PL/I in complexity, and I do enjoy using C++.
You can get a C++ compiler which is (more or less) correct, I'm not sure that was ever quite true of PL/I.
It was for IBM and Unisys, I imagine.
And then there is the whole DoD security assessment of Multics versus UNIX, where PL/I did play a major role versus C, so the compiler did work correctly enough.
Just this week we're discussing a VC++ miscompilation on Reddit.
IBM are still building and maintaining their PL/I compiler for z/OS today, though it is only compliant with specs up to 1979; the '87 ISO standard is only partially adopted.
Yeah, I love these insights.
If you are interested, here are insights from making a COBOL to C# compiler: https://github.com/otterkit/otterkit-cobol/issues/40
I am now convinced that COBOL is just a high level assembler.
> I am now convinced that COBOL is just a high level assembler
In fairness, I think to some extent everything was just a high level assembler in its day, and then it never changed:)
This is Awesome.
For my high school graduation project, I wrote a full COBOL system to automate soccer betting odds. Long past its prime, but my school hadn’t quite caught up with the times.
It was hilariously out of place, but I loved every line of it. There’s something oddly satisfying about a language that whispers, “Remember punched cards?” as you type.
I'm sure this is some kind of fallacy, but I feel I quite often see ostensibly impressive small side projects like this written in simple plain languages like C (or here COBOL). Every similar, e.g., Rust project I see seems almost non-functional despite having 10x the SLOC.
My working theory is that simpler languages lend themselves to blueprinting ideas and getting something working even with an ugly messy codebase, whereas modern languages force you to write code that will last longer. Or maybe modern languages are just doing something wrong.
A Minecraft server isn't exactly a small side project. There are some that have been in the works for 3-5 years and are not yet complete, and some have very specific features (like https://github.com/MCHPR/MCHPRS which is meant for redstone showcases). This COBOL server doesn't yet implement lighting, and that's one of the hardest parts since mob generation also depends on it. It also doesn't fully implement some blocks. You need years to finish a Minecraft server, so getting something done fast isn't the best path along the way.
Rust has really struggled to break through in gamedev. Rust's core premise is trading off dev speed and flexibility for memory safety, but it turns out that dev speed and flexibility is far more important in gamedev than memory safety.
If you have a formally specified microkernel that's already blueprinted to within an inch of its life, Rust is probably a great choice for you. If, on the other hand, you need to rapidly throw slime at a wall to see what sticks and makes for fun gameplay, Rust is going to make that much more challenging than virtually any other language, and the benefits are far from obvious (your quick and dirty gameplay slice that took much longer to make is slightly more memory safe?).
I'm not the only person who's done gamedev in Rust and has since definitely turned away from the language for that use case, see e.g. "Leaving Rust gamedev after 3 years" [0], which remains one of the most widely discussed and liked Hacker News posts about Rust to date.
More broadly, it's obvious that Rust is far more hyped than Cobol is. That means there are many examples of valiant attempts at OSS or hobby projects in Rust by the devs most susceptible to hype (generally enthusiastic beginners). Conversely, writing a Minecraft server in Cobol requires slightly more whimsy and derring-do, which tends to correlate with greater experience.
> Rust's core premise is trading off dev speed and flexibility for memory safety
Rust's main competitor in gamedev is C++, which is not especially known for its "dev speed and flexibility". There are ways to do fast, iterative development in Rust, they just involve quite a bit of boilerplate (to mark all the places where you're giving up some amount of low-level performance in the name of flexibility). If anything, the main disadvantage of Rust is simply that its community, while highly committed to technical excellence, is nonetheless orders of magnitude smaller than the huge number of C++ developers.
I respectfully disagree, emphasis on respectfully. C++ absolutely does have Rust beat on dev speed and flexibility. I would much rather prototype a quick and dirty gameplay slice in C++ over Rust, and I think so would most people. That flexibility comes at a cost, of course. C++ is three languages in a trenchcoat, waiting for you to turn away so it can bash you over the head and mug you.
There are ways to do anything in Rust, but that doesn't change the fact that it's a language less suited for fast iteration than many others. It's a good tool, just not the best tool for this particular job, and there's no obvious reason to use it over those tools that are a better fit. For the vast majority of games, something like C# is going to make rapid prototyping far easier than either Rust or C++, with far less overhead (overhead = time = money, and gamedev is a pretty cutthroat business).
Rust's community is not particularly small, all things considered, and even adjusting for size, Rust really does under-perform in gamedev. Rust's definition of 'technical excellence' often revolves largely around memory safety, which is something I definitely want from my air traffic control systems, but which barely matters in gamedev at all. There are other things that constitute technical excellence in gamedev, and these tend to be difficult or undesirable in idiomatic Rust (often precisely because they prioritise other goals over memory safety). Rust is a fine language, I'm fairly fond of it, but it's just not a good fit for this use case. And indeed, we see the consequences of that bear out in practice.
(I'd also encourage you to read the comment thread I link above - lots of experienced people agree on this one.)
> Rust's community is not particularly small, all things considered
It absolutely is. Your linked thread has people describing the 3D- or 2D-rendering stacks available in Rust as half-baked in some way or other, with other folks saying that it's hard or infeasible to get Rust code on consoles where the manufacturer forces you to use their dev SDK. This is what a smaller community looks like in practical terms.
Being productive with C++ tools and dependencies has a much steeper learning curve than Rust as a language, and it has you spending way more effort on a constant basis.
I can be productive in C#/F#, Rust, Go or TypeScript in a fraction of the time it takes to accomplish the same in C++ the moment you go beyond something trivial. It's not the fault of the language per se but of everything around it.
It's one of the reasons some people are so upset about C++ and Rust, in my opinion. Working with C and C++ involves a huge amount of pain and hassle that is absolutely not required to accomplish whatever task you are dealing with.
I don't think it's a fallacy.
As someone who has 2 WIP games in Rust, I noticed that getting a very minimal gameplay prototype was pretty easy (given a sane engine choice)* but then adding features made the code balloon in size. Switching from singleplayer to multiplayer was a mess. I also fell for some fads and had to waste time removing them.
And Rust really, really doesn't like heavily interconnected graphs of game objects where a game event can lead to updating multiple other types. There is friction in everything. One choice was to just live with it and write slightly more code. The other was trying to find some systematic solution that would take more code upfront and save me time later.
I chose the second option and went through some experiments, but it feels like the break-even point is way further out than makes sense for a small one-person project.
*Even within one language, there can be order of magnitude differences. Rust has 2 usable 3D engines, one is well known, has dozens of contributors and gets donations that can supplant a Bay area salary. The other is written by mostly one guy who alternates between living from savings and working full time. The first engine heavily focuses on advertising and has been promising various features for years with little to show for it. The other is ahead in both number of features and implementation quality.
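To make the friction with "heavily interconnected graphs of game objects" concrete, here is a minimal sketch with made-up types (Door, Switch and World are hypothetical, not from my games or any engine): because one entity holding a mutable reference into another is exactly what the borrow checker pushes back on, entities refer to each other by index into shared storage, and every cross-type update is routed through it.

// Hypothetical illustration only: index-based links instead of references.
struct Door {
    open: bool,
}
struct Switch {
    pressed: bool,
    linked_door: usize, // index into World::doors, not a &mut Door
}
struct World {
    doors: Vec<Door>,
    switches: Vec<Switch>,
}
impl World {
    // One game event updates entities of multiple types.
    fn press_switch(&mut self, id: usize) {
        self.switches[id].pressed = true;
        let door_id = self.switches[id].linked_door;
        self.doors[door_id].open = true;
    }
}
fn main() {
    let mut world = World {
        doors: vec![Door { open: false }],
        switches: vec![Switch { pressed: false, linked_door: 0 }],
    };
    world.press_switch(0);
    println!("door open: {}", world.doors[0].open); // door open: true
}

It works, but every relationship becomes an index plus a lookup, which is roughly the extra code and indirection being described, and part of why ECS-style designs are so common in Rust gamedev.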
I think it's attitude. Some people code for the fun of it, some people have a clear goal in mind and focus on achieving it, some code for money, some code for public recognition, etc. I don't think an ugly messy codebase is necessary but there's a productive middle ground. People who focus on showmanship instead are more likely to chase fads and fancy architectures.
I know one of those two is Bevy, what is the other one?
Yes, the other is Fyrox. I don't think there are other viable 3D engines in Rust but at this point the hype surrounding Bevy is so large people wouldn't notice (and I stopped following the scene for the most part).
There are also bindings to Godot which is probably the way to go if you actually wanna get shit done. I talked to their authors and they avoided promoting it on Reddit for a while because the hype was bordering on harassment.
Likely Fyrox: https://github.com/FyroxEngine/Fyrox
Oh god, look at the author's contribution graph from last year. He is a living meme.
He's doing it because he likes learning and coding. And he knows what he's doing. Rust has huge issues attracting experienced gamedevs (unlike any other area) for some reason, and mrDIMAS is one of the few exceptions; he originally came from Larian Studios, which made Baldur's Gate 3.
He used to have a Patreon with a modest goal of $1000 per month and was making decent progress towards it, but then the war started. Once it was clear he was not getting any of the money any time soon, he had to switch to other platforms, but by then the opportunity was gone.
He worked for a while at another big game studio to save up and then went back to Fyrox.
I think it's just the difference between real hackers™ who often have a fancy for simpler languages and can write the whole universe in anything you give them, versus code monkeys chasing the latest hype cycle. (For reference, I put myself into the second category.)
The author is clearly talented and experienced building server-side systems. So while the language itself is new to the author, the design and architecture would be largely the same if this project were written in Ruby, C, Python, or Javascript.
Had they built something outside of their comfort zone, the results would probably be a lot more messy.
People who get things done usually don't care about code quality. I found myself in a place where I just never finished anything anymore because I was trying to write something built to last. Over the years I've found a good balance: iterating on garbage will eventually turn it into something good, and I've been doing that ever since.
My previous employer certainly got things done. They got so many things done, that I found it impossible to add features to their codebase.
I'd say it's a skill issue, but that's only because I've worked on decompiled applications and made it work. I've worked with assembly and made it work. I've hooked Google Chrome with PDB resolving and made it work.
It took me 10 years to get to this point, it is not a reasonable expectation to have of others.
Is good software a Dorodango? A pile of mud that has been polished enough to be pretty.
Funny, but no. Software never becomes a smooth sphere; instead you keep adding more chunks that are barely holding on, until it somewhat resembles a sphere... eventually. WireGuard was great from the start.
I've sometimes thought perhaps coding would be much simpler if instead of creating a new programming language, we instead create a fantasy workstation akin to PICO-8/Picotron geared towards building business applications.
On the user interface side, you only have to code to the fantasy workstation's fixed UI and it can be smart enough to automatically be responsive for different screen sizes. Since it's geared towards business apps that are primarily forms, it doesn't have to care about all the edge cases of the web's presentation layer.
Concepts like durable message queues (like Temporal.io) could be first-class citizens, so instead of the distributed mess of lambdas, queues, step functions, etc., you just have basic code that can be web scale.
I haven't given it too much thought. It just seemed like something interesting to explore.
There have been many attempts to do something like this. I'm arguably working in one of them now, a so-called low-code environment. Much of it amounts to storing functions in a database instead of in a file tree, and losing access to modern conveniences like a language server and revision control. The latter is quite a bit like working in the PICO-8 editor, really.
I feel like this is a case of the no silver bullet principle. Like, going from a legacy/pre-modern/old language to modern, dynamic to static typed, etc. isn't going to magically make your projects succeed.
Maybe these failed projects in modern languages are the graveyards of those who believed that silver bullets exist.
Or maybe Rust projects are filled with people who first and foremost want to learn Rust, and the actual completion of the project is secondary.
It’s worth noting that a lot of rust frameworks have code generators and the like that emit a lot of lines quickly.
I’m working on a web app with Axum, Diesel (type-safety on my Postgres stuff), and Leptos (in “islands” mode, allows me to write my server side HTML using JSX-like syntax within Rust, and client side WASM). It’s actually incredible how much it can accomplish with a relatively simple file tree.
I originally started with Loco, which is rails-like. It spits out something like 80 files every time you make a scaffold. It felt exactly like what you describe — too much code to do something that should be simple.
I think what you have identified is a truism.
Simple languages present a very lean cognitive load.
They also do not offer a bazillion choices, and that is perhaps their greatest appeal! When the language is simple, a programmer can apply more brain power to actualize the idea.
You can't agonize on the best way to shave a yak if there's only one way to do it.
Bada boom! Correctamundo!
Same as dynamically typed vs statically typed languages.
I think the difference is that rust is hyped - so all you need to do to get attention is to start on it, since "rust" is kind of a selling point all by itself.
Nobody cares about c or cobol, so you have to do something useful to get attention.
In the end I suspect it is just survivorship bias.
https://raw.githubusercontent.com/meyfa/CobolCraft/main/src/...
It's actually not too difficult to understand for anyone with a background in procedural languages, and reminds me somewhat of some game servers written in VB that I saw ~2 decades ago.
I stopped using COBOL in 1978 and NEVER admitted to even knowing the language forever after. I'm making the sign of the cross and heading for a strong cup of coffee in the hope that I will never see this code :-)
(impressive that you did this)
Roast me but the code is very readable. Compare with some modern languages where you have to stare at it for minutes to understand what's going on.
I once worked at a FAANG company where I had access to some of the leading C++ experts in the world (some were part of the international language committee). I emailed an internal C++ list asking whether a certain line of code would create a memory leak. It was a pure use of STL templates and casting. The experts could NOT agree whether I was doing it correctly. Some expected a leak, others didn't.
This true little story says everything about C++. JavaScript is full of these things too.
That's every language. Experts will argue and pontificate about the nuances of each branching instruction.
But they should all agree on whether a certain line is correct (i.e., whether it causes a memory leak).
That was always its strength. I started programming in 1976, trained in COBOL and ICL PLAN, used punched cards, and MOP terminals once we got out of training. 100% of our programs were batch programs. There was a huge bias towards readability, so that any one of us could read the source code and understand it. That readability was offset somewhat by the necessity to read and understand the core dumps produced when a program failed. At best you would be able to trace a failure to a specific line of code. Thus the habit of dry running programs was hard-wired into you. When I left the government institution to move into commercial programming, it was still COBOL and batch programs until the early 80s. I spent 3 years on overnight support and that was when COBOL proved its worth: you could pick up any previously unseen listing and the core dump and usually fix it pretty quickly, with the caveat that it was always a tactical fix.
That’s why I like Ada (and VHDL). Somewhat verbose, perhaps, but much more readable than more “modern” languages.
VHDL has a special place in my heart, coming from the FPGA world. Oddly enough, I never thought of Ada and VHDL as similar, mhhmmm.
As a teenager I swore by Turbo Pascal's begin/end syntax, and C++ at the time, with its overloads and macro system, was just noise to me.
Many years have passed since but C++ with its overloads and macro system is still largely noise to me.
Overloading can easily be abused, but the very complicated expressions that appear in programs for scientific/technical computing are immensely more readable with operator overloading, as in C++, than with named functions, as in languages that forbid operator overloading, e.g. Java.
In scientific/technical computing you have frequently, even in the same expression, dozens of different kinds of additions, multiplications, divisions, etc., between scalars, vectors, matrices, tensors, complex numbers, various kinds of physical quantities, and so on. Without operator overloading the expressions can become huge, extending on a great number of program lines and they can be very difficult to understand.
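To illustrate the readability difference, here is a minimal sketch of the kind of expression meant (written in Rust, which also supports operator overloading; the toy Vec3 type is made up for this example, not taken from any real library):

use std::ops::{Add, Mul};

// A toy 3-component vector, for illustration only.
#[derive(Clone, Copy)]
struct Vec3 { x: f64, y: f64, z: f64 }

impl Add for Vec3 {
    type Output = Vec3;
    fn add(self, o: Vec3) -> Vec3 {
        Vec3 { x: self.x + o.x, y: self.y + o.y, z: self.z + o.z }
    }
}

// scalar * vector
impl Mul<Vec3> for f64 {
    type Output = Vec3;
    fn mul(self, v: Vec3) -> Vec3 {
        Vec3 { x: self * v.x, y: self * v.y, z: self * v.z }
    }
}

fn main() {
    let pos = Vec3 { x: 0.0, y: 0.0, z: 0.0 };
    let vel = Vec3 { x: 1.0, y: 2.0, z: 3.0 };
    let acc = Vec3 { x: 0.0, y: -9.8, z: 0.0 };
    let dt = 0.05;

    // With overloaded operators the update reads like the textbook formula:
    let next = pos + dt * vel + 0.5 * dt * dt * acc;

    // Without overloading, the same expression becomes nested named calls,
    // roughly: add(pos, add(scale(dt, vel), scale(0.5 * dt * dt, acc)))
    println!("next.y = {}", next.y);
}

Scale that up to matrices, tensors and complex numbers mixed in a single expression, and the named-function form quickly becomes very hard to read.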
Also, Pascal's method of using a single kind of statement brackets, i.e. "begin" and "end" (renamed in C as "{" and "}"), which has been inherited from Algol 60, is wrong for program readability.
The right method has been introduced by Algol 68 and it was inherited by languages like Ada. In such languages there are several kinds of statement brackets, like in the UNIX shell, where "if" and "fi" are brackets for conditional statements, "do" and "done" are brackets for loops and "case" and "esac" are brackets for selection statements. This is much more readable, especially when there is a big loop that can not be seen in a single page of code, so you see only its end, which is also the end of many other kinds of program structures, like nested loops, conditional statements or selection statements.
To get the same readability in languages like Pascal and C, you can add comments to the final braces or "end" of the program structures, but this is much more cumbersome than in a language with several kinds of statement brackets, where the bracket pairs will normally be added automatically by your text editor.
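A tiny made-up example of that workaround in brace-style syntax (hypothetical nesting, nothing from the project):

fn main() {
    // Annotating the closing braces by hand stands in for the distinct
    // closers (od, esac, fi) that Algol-68-style languages provide.
    for chunk in 0..4 {
        for block in 0..16 {
            if block % 8 == 0 {
                println!("chunk {}, block {}", chunk, block);
            } // end if (block at section boundary)
        } // end for block
    } // end for chunk
} // end main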
It's precisely when "you have frequently, even in the same expression, dozens of different kinds of additions, multiplications, divisions, etc., between scalars, vectors, matrices, tensors, complex numbers, various kinds of physical quantities, and so on" that operator overloading should be driven by custom syntax macros. (E.g. in a Rust-like language you might define math_expr!(...), float_expr!(...), matrix_expr!(...) etc. syntax macros, each with its own special semantics for operator symbols.) That way a program can directly express the variety of overloading that's relevant in any given context, as opposed to relying on fragile hacks like special __add__ and __mul__ "traits" that are dispatched in a type-dependent way.
And, if you think the wrapper $type_expr! is too verbose, just use [blackboard bold M](...) or something.
Agda is my favorite example of a language that _judiciously_ uses Unicode symbols to express "concepts similar to something but is actually substantially different so let's not get them confused"... as opposed to other theorem proving languages of its time [1]
[1] https://people.inf.elte.hu/divip/AgdaTutorial/Functions.Equa...
> ”do” and “done”
“do” and “od” IIRC
#define begin {
#define end }
Are you Stephen Bourne?
COBOL was designed to be written and read by non-programmers. That was the theory, at least.
And let’s not forget, so was FORTRAN. (Programmers coded in Assembler back then.)
Fortran (FORmula TRANslation) was intended for a very peculiar subset of 'non-programmers': mathematicians, engineers and the like. It ends up being a pretty good impedance match for people who are used to dealing with terse forms of expression already.
And COBOL (COmmon Business-Oriented Language) was intended for the subset of non-programmers dealing with things like money and inventory.
MOVE FUNCTION MIN(BLOCK-ENTRY-MINIMUM-STATE-ID(LK-BLOCK), STATE-ID) TO BLOCK-ENTRY-MINIMUM-STATE-ID(LK-BLOCK)
I learned a bit of COBOL during high school... in a small town in Pakistan.
It was okay -- did a project that simulated some financial statements. I recall it being quirky but, to most humans (i.e. non-programmers), any programming language would be quite quirky so I don't understand the stigma around it.
I also learned C around the same time and that one stuck. :-)
COBOL actually looks like a very cool language! Your code looks really well organised.
Can't tell if it supports redstone (which I would consider basic functionality in a Minecraft server) - isn't called out in the README and the code only makes mention of redstone torches (which also function as a light source).
I think the README tacitly says "redstone is not supported" since those blocks have multiple states:
Note that blocks with multiple states, orientations, or interactive blocks require large amounts of specialized code to make them behave properly, which is way beyond the scope of this project. Some are supported, however:
torches (all variants)
slabs (all variants)
stairs (non-connecting)
rotated pillars, such as logs or basalt
buttons (non-interactive)
doors (including interaction)
trapdoors (including interaction)
beds
In particular, it seems buttons don't emit redstone signals.
My kids are just getting into Minecraft on the Switch and I am a bit confused by the options for hosting a server. I'd love to hop on and play with them on a self-hosted server. What's the current best way to do that, or is that level of cross-play between PC and Switch on a self-hosted solution even possible?
Edit: I don’t currently play, which is why I’m not familiar with what’s out there
There are actually two completely separate games (Java Edition and Bedrock) that can't cross-play with each other. You can buy the Bedrock edition of Minecraft for Windows, which should be able to cross-play with the Switch.
Java Edition is the original game and it's fairly easy to either host your own server (The dedicated server is just a .jar you run) or pay for a server ($10-40/mo) using a game server host.
Unsure about Bedrock; there are some instructions here (https://www.reddit.com/r/Minecraft/wiki/bds/#wiki_bedrock_de...).
There are extensions/plug-ins to the standard Java server allowing Bedrock players and Xbox accounts to join:
(Plus Floodgate so they don't need a Java account)
I just set up a new family Minecraft server last week and have successfully hosted Java and Bedrock players simultaneously (one coming in from a Nintendo Switch, another from their phone, and several from Java clients).
There are also plug-ins allowing older (or newer) client versions to connect to your server as well (ViaVersion, ViaBackwards).
Oh that’s awesome! I’ll have to try that.
FWIW with the Java version you can easily create a LAN Server in game, by joining a world and then opening it to LAN.
https://help.minecraft.net/hc/en-us/articles/4410317081741-H...
EDIT: I just realized this works on Bedrock too, but don't know how the support on the Switch is
https://help.minecraft.net/hc/en-us/articles/4410316619533-H...
Bedrock has self-hosted servers available too: https://www.minecraft.net/en-us/download/server/bedrock
They're pre-compiled Windows and Linux binaries, but if I'm remembering correctly, I think they're statically linked, and I had no trouble running the Ubuntu one on my Fedora system.
I used to keep a Bedrock server running on my desktop, and my son and I could pop in from a phone, tablet, or laptop whenever we wanted to.
Cross-play between the PC "Java" edition and the console "Bedrock" edition is not really possible, so the best option is to play it on Bedrock on both platforms. On Windows you can download the Bedrock edition and play it there, it seems technically possible but a little complicated to play Bedrock edition on Mac / Linux too. (Some Google searching suggests you'll need to run it in a VM on those OSs)
Easiest way to get a server is to pay Minecraft (Microsoft) to host it for you via realms: https://www.minecraft.net/en-us/realms
But self hosting is supported too, there are official server binaries you can download from Minecraft here: https://www.minecraft.net/en-us/download/server/bedrock
There are actually cross-play solutions such as https://github.com/GeyserMC/Geyser. There are some limitations, but not too many - basically it disables the features that only exist in one version.
Do you always speak so confidently?
I think options are limited from the Switch. You can't connect to arbitrary servers from the console version - just some curated public servers or by paying for Realms. You can hack around this with a DNS server that redirects the curated servers, but it starts to get sketchy.
You also need the paid switch online for any sort of network play.
Check out the open-source Minetest too, which is a similar but separate game from Minecraft. You can self-host the server.
or, as the kids like to call it, the "We Have Minecraft At Home" version.
Tell them they can just call it "Luanti" now ;-) [1]
It's also becoming more and more "not just Minecraft". For instance this entry from the latest GameJam, which is a nice little shooter [2]
[2] https://content.luanti.org/packages/Sumianvoice/extra_ordina...
I got a Java server running on my Ubuntu machine in the basement, but the child prefers to play on Xbox.
Rather than spend my time fucking around with setting up a server, I just gave Microsoft $3 a month to use Realms.
I've got enough chores.
You have to play the Java version on pc to connect to your own server software.
That is incorrect. There are several bedrock compatible servers available. See the official one here: https://www.minecraft.net/en-us/download/server/bedrock
They should have called it COBOLstone
I'm always hearing that Cobol programmers get high salaries, because they are so rare.
Did this project generate any flood of offers for work?
As others have pointed out, probably more the intricacies of the existing business logic, plus the understanding of the mainframe systems that it usually runs on. Otherwise you could just build a cross-compiler and call it a day.
>As others have pointed out, probably more the intricacies of the existing business logic, plus the understanding of the mainframe systems that it usually runs on.
Honestly just knowing COBOL is how you get your foot into a well paying job where you learn that stuff from the old timers.
It's not Cobol programmers that are rare, it's a shortage of people that can understand the business logic. Cobol is often used in very complicated business operations.
Not because it’s necessarily more suited for those business operations, but the complexity and risk of refactoring means it remains in COBOL
Those who do are valued not for their programming skills but their knowledge of the (very complex, largely undocumented) system and its business environment.
I love that there are unit tests.
Actually the writer of this GitHub repo is wrong; COBOL has been able to manipulate bits and bytes at the lowest level since forever, and is very efficient at it.
Nevertheless, congrats to this achievement.
What an idea. Absolutely unbelievable - I've written an API framework in COBOL, but this is something else. Very impressive bit of work.
I upvoted this before I even looked at the link.
Why does this look like an elaborate PL/SQL?
Because both COBOL and SQL originate with the same design school of "try to make syntax look more like natural English" that was in vogue at the time, with the idea that it'd let non-programmers write code.
40 years from now, C-suites all over the world will be saying "COBOL Minecraft servers are dead."
Lol, the big question is whether it would work on z/OS as well.
Would probably take a few hours to get Hercules and some old z/OS and check, but it's doable.
Or ask my colleague who is a z/OS sysprog to run it on our own ;)
Or just run on zlinux
It can run on zLinux, but that's not as interesting as z/OS.
Docker isn't a platform portable experience fwiw
Your Linux-based Docker containers can work on all major platforms; how is that not a portable experience?
There are more OSs than Windows and Linux :)
What other portability platform supports all of them? Only the browser if anything, so we build everything in Node and Electron?
What platform are you thinking you'd want to use this on that Docker doesn't support?
FreeBSD and Solaris mostly for me. There are many other OSs though than Linux and Windows.
FreeBSD should work with podman, which isn't literally docker but is probably drop in for this use and can use the Linux compatibility layer.
A quick search turned up https://www.dbi-services.com/blog/freebsd-basics-8-running-l... which seems to confirm it.
I frequently am bothered that I can't run docker on mainline Android.
(Without rooting, qemu, or other malarkey)