> while that shown in blue is the stapled notarisation ticket (optional)
This is correct, but practically speaking non-notarized apps are terrible enough for users that this isn't really optional: you're going to pay your $99/yr Apple tax.
(This only applies to distributed software; if you are only building and running apps for your own personal use, it's not bad, because macOS lets you do that without the scary warnings.)
For users who aren't aware of notarization, your app looks straight-up broken. See the screenshots on Apple's support site: https://support.apple.com/en-us/102445
For users who are aware, you used to be able to right-click and "Open" apps; nowadays you need to go all the way into System Settings to allow them: https://developer.apple.com/news/?id=saqachfa
I'm generally a fan of what Apple does for security, but I think notarization specifically for apps outside the App Store has been a net negative for all parties involved. I'd love to hear a refutation of that, because I've tried to find concrete evidence that notarization has helped prevent real issues and haven't been able to yet.
I thought the macOS notarization process was annoying until we started shipping Windows releases.
It’s basically pay to play to get in the good graces of Windows Defender.
I think all-in it was over $1k upfront to get the various certs. The cert company has to do a pretty invasive verification process for both you and your company.
Then — you are required to use a hardware token to sign the releases. This effectively means we have one team member who can publish a release currently.
The cert company can lock your key as well for arbitrary reasons which prevents you from being able to make a release! Scary if the release you’re putting out is a security patch.
I’ll take the macOS ecosystem any day of the week.
The situation on Windows got remarkably better and cheaper recently-ish with the addition of Azure code signing. Instead of hundreds or thousands for a cert it’s $10/month, if you meet the requirements (I think the business must have existed for some number of years first, and some other things).
If you go this route I highly recommend this article, because navigating through Azure to actually set it up is like getting through a maze. https://melatonin.dev/blog/code-signing-on-windows-with-azur...
That's not easier and cheaper than before. That's how it's always been, only now you can buy the cert through Azure.
For an individual the Apple code signing process is a lot easier and more accessible since I couldn't buy a code signing certificate for Windows without being registered as a business.
> That's how it's always been only now you can buy the cert through Azure.
Where can you get an EV cert for $120/year? Last time I checked, all the places were more expensive and then you also had to deal with a hardware token.
Lest we talk past each other: it's true that it used to be sufficient to buy a non-EV cert for around the same money, where it didn't require a hardware token, and that was good enough... but they changed the rules in 2023.
Thanks for the link; I see it's only available to basically the US, Canada, and the EU, though.
As you said, you need to have a proper legal entity for about 2 years before this becomes an option.
My low-stakes conspiracy theory is that MS is deliberately making this process awful to encourage submission of apps to the Microsoft Store since you only have to pay a one-time $100 fee there for code-signing. The downside is of course that you can only distribute via the MS store.
> it’s $10/month
So $120 a year, but no, it's only Apple with a "tax".
Millions of Windows power users are accustomed to bypassing SmartScreen.
A macOS app distributed without a trusted signature will reach a far smaller audience, even of the proportionately smaller macOS user base, and that's largely due to deliberate design decisions by Apple in recent releases.
The EV cert system is truly terrible on Windows. Worst of all, getting an EV cert isn’t even enough to remove the scary warnings popping up for users! For that you still need to convince windows defender that you’re not a bad actor by getting installs on a large number of devices, which of course is a chicken-and-egg problem for software with a small number of users.
At least paying your dues to Apple guarantees a smooth user experience.
No, this information is wrong (unless it’s changed in the last 7 years). EV code signing certs are instantly trusted by Windows Defender.
Source: We tried a non-EV code signing certificate for our product used by only dozens of users at the time, never stopped showing scary warnings. When we got an EV, no more issues.
In case it makes a difference, we use DigiCert.
Not true for us. We EV cert sign (the more expensive one), and my CEO (the only one left who uses Windows) had this very problem. Apparently the first time a newly signed binary is run, it can take up to 15 minutes for Defender to allow it. The first time I saw this, it was really annoying and confusing.
Interesting.
I regularly download our signed installer often within a minute of it being made available, never noticed a delay.
Maybe it’s only the very first time Windows Defender sees a particular org on a cert.
I renewed our cert literally on Friday, tested by making a new build of our installer and could instantly install it fine.
Are you sure there was no third-party security software beyond the Windows default on your boss's machine?
They did change it, I think after some debacle with Nvidia pushing an update. They seem to want devs to submit their files via their portal now to get rid of the screen: https://www.microsoft.com/en-us/wdsi/filesubmission
Wow. I haven't written software for Windows in over a decade. I always thought Apple was alone in its invasive treatment of developers on their platform. Windows used to be "just post the exe on your web site, and you're good to go." I guess Microsoft has finally managed to aggressively insert themselves into the distribution process there, too. Sad to see.
> Windows used to be "just post the exe on your web site, and you're good to go."
That's also one of the main reasons why Windows was such a malware-ridden hellspace. Microsoft went the Apple route to security and it worked out.
At least Microsoft doesn't require you to dismiss the popup, open System Settings, click the "Open Anyway" button, and enter a password to run an unsigned executable. Clicking "More info -> Run anyway" still exists on the SmartScreen popup, even if they've hidden it well.
Despite Microsoft's best attempts, macOS still beats Windows when it comes to terribleness for running an executable.
I just wish these companies could solve the malware problem in a way that doesn't always involve inserting themselves as gatekeepers over what the user runs or doesn't run on the user's computer. I don't want any kind of ongoing relationship with my OS vendor once I buy their product, let alone have them decide for me what I can and cannot run.
I get that if you're distributing software to the wider public, you have to make sure these scary alerts don't pop up regardless of platform. But as a savvy user, I think the situation is still better on Windows. As far as I've seen there's still always a (small) link in these popups (I think it's SmartScreen?) to run anyway - no need to dig into settings before even trying to run it.
I solved it by putting a "How to install.rtf" file alongside the program.
Another alternative would be to bundle this app: https://github.com/alienator88/Sentinel
It lets you easily unlock an app by drag-and-drop.
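Under the hood, tools like this mostly just clear the quarantine extended attribute that Gatekeeper keys off of; anyone comfortable with Terminal can do the same by hand. A minimal sketch, with an illustrative path:

    # Show the quarantine flag Gatekeeper checks (path is a placeholder)
    xattr -p com.apple.quarantine /Applications/MyApp.app

    # Remove it recursively so the app launches without the "unidentified developer" block
    xattr -dr com.apple.quarantine /Applications/MyApp.app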
What is the subset of users who are going to investigate and read an rtf file but don’t know how to approve an application via system settings (or google to do so)?
I would say quite a lot of users, because even the previous simple method of right-clicking wasn't that well known, even among power users. Lots of them just selected "allow applications from anyone" in the settings (most likely just temporarily).
In one application I also offered a web app as an alternative, in case they weren't comfortable with any of the options.
Also it's presented in a .dmg file where you have two icons, the app and the "How to install". I would say that's quite inviting for investigation :)
I have been trying to get people to realize that this is the same or worse for like a year now.
It’s unfortunate it’s come to this but Apple is hardly the worst of the two now.
You certainly don't need a hardware token; you can store it in any FIPS 140 Level 2+ store. This includes stuff like Azure Key Vault and AWS KMS.
Azure Trusted Signing is 100% the best choice, but if for whatever reason you cannot use it, you can still use your own cloud store and hook in the signing tools. I wrote an article on using AWS KMS earlier this year: https://moonbase.sh/articles/signing-windows-binaries-using-...
TLDR: Doing this yourself requires a ~$400-500/year EV cert and minuscule cloud costs.
Can confirm this, we use Azure KeyVault and are able to have Azure Pipelines use it to sign our release builds.
We’re (for the moment) a South African entity, so can’t use Azure Trusted Signing, but DigiCert has no issue with us using Azure KeyVault for our EV code signing certificate.
I had ours renewed just this week as it happens. It cost something like USD 840 before tax; we don't have a choice, though, and in the grand scheme of things it's not a huge expense for a company.
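If anyone wants a concrete starting point, one common way to wire this up in a pipeline (not necessarily exactly our setup) is the open-source AzureSignTool, which asks Key Vault to sign so the key never leaves the HSM. A rough sketch; all identifiers are placeholders and the flag names are from memory, so check the project's README:

    # Sign an installer with an EV cert held in Azure Key Vault
    azuresigntool sign \
      --azure-key-vault-url "https://example-vault.vault.azure.net" \
      --azure-key-vault-client-id "$AZURE_CLIENT_ID" \
      --azure-key-vault-client-secret "$AZURE_CLIENT_SECRET" \
      --azure-key-vault-tenant-id "$AZURE_TENANT_ID" \
      --azure-key-vault-certificate "ev-codesign-cert" \
      --timestamp-rfc3161 "http://timestamp.digicert.com" \
      --file-digest sha256 \
      MyInstaller.exe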
That's right, there's a similar comparison between the iOS App Store and Android Play Store. Although the annual $99 fee is indeed expensive, the Play Store requires every app to find 12 users for 14 days of internal testing before submission for review, which is utterly incomprehensible, not to mention the constant warnings about inactive accounts potentially being disabled.
In my case, as a developer of a programming language that can compile to all supported platforms from any platform, signing (and notarization) is simply incompatible with the process.
Not only is such signing all about control (the Epic case is a great example of misuse and a reminder that anyone can be blocked by Apple) it is also anti-competitive to other programming languages.
I treat each platform as open only when it allows running unsigned binaries in a reasonable way (or self-signed, though that already has some baggage of needing to maintain the key). When it doesn't I simply don't support such platform.
Some closed platforms (iOS and Android[1]) can be still supported pretty well using PWAs because the apps are fullscreen and self-contained unlike the desktop.
[1] depending on if Google will provide a reasonable way to run self-signed apps, but the trust that it will remain open in the future is already severely damaged
The signing is definitely about control, as is all things with Apple, but there are security benefits. It's a pretty standard flow for dev tools to ad-hoc (self) sign binaries on macOS (either shelling out to codesign, or using a cross-platform tool like https://github.com/indygreg/apple-platform-rs). Nix handles that for me, for example.
It makes it easy for tools like Santa or Little Snitch to identify binaries, and gives the kernel/userspace a common language to chat process identity. You can configure similar for Linux: https://www.redhat.com/en/blog/how-use-linux-kernels-integri...
But Apple's system is centralized. It would be nice if you could add your own root keys! They stay pretty close to standard X.509.
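For reference, ad-hoc signing is just codesign with "-" as the identity; a minimal sketch (the binary path is a placeholder):

    # Ad-hoc sign a binary; "-" means no Developer ID, just a locally generated seal
    codesign --force --sign - ./mytool

    # Inspect the result; an ad-hoc signature shows no authority or team identifier
    codesign --display --verbose=2 ./mytool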
I’m only aware of two times that Apple has revoked certificates for apps distributed outside of the App Store. One was for Facebook’s Research App. The other was for Google’s Screenwise Meter. Both apps were basically spyware for young teens.
In each case, Apple revoked the enterprise certificate for the company, which caused a lot of internal fallout beyond just the offending app, because internal tools were distributed the same way.
Something may have changed, though, because I see Screenwise Meter listed on the App Store for iOS.
https://www.wired.com/story/facebook-research-app-root-certi...
https://www.eff.org/deeplinks/2019/02/google-screenwise-unwi...
The article is about macOS apps, but you're talking about iOS apps.
Apple revokes macOS Developer ID code signing certificates all the time, mostly for malware, but occasionally for goodware, e.g., Charlie Monroe and HP printer drivers.
Also, infamously, Apple revoked the macOS Developer ID cert of Epic Games, as punishment for their iOS App Store dispute.
Maybe half of the 3rd party apps I have on my applications folder right now are not notarized. It’s really not that big of a deal.
It’s a friction point for potential customers, so we do it with our Electron-based app.
The USD 99 annual fee is almost inconsequential, the painful part was getting a DUNS number (we’re a South African entity) and then getting it to work in a completely automated manner on our build server.
Fortunately, once set up it’s been almost no work since.
It is a big deal. You can no longer just right click apps to run them, you have to take a trip to a subpanel of system settings, after clicking though two different dialogs that are designed to scare you into thinking something is wrong (one mentions malware by name).
For normal users this might as well be impossible.
Remember, your average user needs a shortcut to /Applications inside the .dmg image otherwise they won’t know where to drag the app to to install it.
Only the stapled ticket is optional, not notarization itself. If you notarize but don't staple the ticket, users may need an internet connection for Gatekeeper to check the notarization status.
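For example, stapling is a separate step after notarization succeeds; roughly, with placeholder profile and bundle names:

    # Submit a zipped app for notarization and wait for the verdict
    xcrun notarytool submit MyApp.zip --keychain-profile "notary-profile" --wait

    # Attach (staple) the ticket so Gatekeeper can verify it offline
    xcrun stapler staple MyApp.app
    xcrun stapler validate MyApp.app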
Apple’s Mac security team in general kind of sucks at their job. They are ineffectual at stopping real issues and make the flow for most users more annoying for little benefit.
The problem is not that it’s $99/year. The problem is that it requires strong ID, and if you are doing it as a company (ie if you don’t want Apple to publicize your ID name to everyone who uses your app) then you have to go through an invasive company verification process that you can fail for opaque reasons unrelated to fraud or anything bad.
The system sucks. I’d love to be able to sign my legitimate apps with my legitimate company, but I don’t wish to put the name on my passport onto the screens of millions of people, and my company (around and operating for 20-ish years now) doesn’t pass the Apple verification for some reason.
I also can’t use auto-enroll (DEP) MDM for this reason.
I think the lack of any human to talk to is the worst part of modern tech. Especially for business, where your income may depend on it. It's beyond cruel to prevent people from operating with no explanation of why and no way to find out how to fix it.
At least you can use your ID. If you want to get a code signing certificate for Windows, at least in Switzerland, all the CAs I tried required me to be incorporated. I'm not sure how it is now, but at least a few years ago I couldn't get a code signing certificate as an individual.
Well, what can I say except that the 80s, with their little independent app vendors shipping floppy disks in little baggies, are long behind us. Computers are now commonplace enough, with all the attendant dangers, that platform vendors are demanding a bit of accountability if you want to ship for their platforms, and unfortunately accountability means money and paperwork. The platform vendors are well within their rights to do so. They have a right to protect their reputations, and when malicious or buggy software appears on their platform, their reputation suffers. Half or more of the blue screens on Windows in the late 90s and early 2000s for instance, were due to buggy third-party drivers, yet Microsoft caught the blame for Windows crashing. It took a new driver model, standards on how drivers are expected to behave, and signed drivers to bring this under control.
The future is signed code with deep identity verification for every instruction that runs on a consumer device, from boot loader through to application code. Maybe web site JavaScript will be granted an exception (if it isn't JIT-compiled). This will be a good thing for most consumers. Until Nintendo cleaned out all the garbage and implemented strict controls on who may publish what on their console, the North American video game market was a ruin. The rest of computing is likely to follow suit, for similar reasons.
Congratulations on writing the most servile corporate apologia I've seen all week. This is a masterpiece of Stockholm syndrome.
"Accountability means money and paperwork." Beautiful. Just beautiful. You know what else means money and paperwork? A protection racket. "Nice app you got there, shame if something happened to it before it reached customers. That'll be 30% please." But sure, let's call extortion "accountability" because Tim Apple said so.
Your driver signing example is chef's kiss levels of missing the point. Microsoft said "hey, sign your drivers so we know they're not malware"; they didn't say "only drivers we approve can run, and also we get a cut." You're comparing a bouncer checking IDs to a mafia don enforcing territory. These are not the same thing.
And oh my god, the Nintendo argument. You're seriously holding up Nintendo's lockout chip as consumer protection? The same lockout chip they used to squeeze third-party developers, control game production, and maintain an iron grip on pricing? "Until Nintendo cleaned out the garbage" yeah, they cleaned it out alright, straight into their own pockets. The video game crash was caused by publishers like Atari flooding the market with garbage like E.T., not by independent developers needing more "accountability."
"The future is signed code with deep identity verification for every instruction." Holy hell. You're not describing a security feature, you're describing a prison. You're literally fantasising about a world where every line of code needs corporate permission to execute. That's techno feudalism with RGB lighting.
This isn't about protecting anyone from bugs. It's about trillion-dollar companies convincing people like you that you need their permission to use the computer you bought. And somehow, SOMEHOW, you've decided this is good actually, and the 1980s with its freedom and innovation was the problem.
The fact that you think general-purpose computing is a "danger" that needs to be locked down says everything about how effectively these corporations have trained you to beg for your own chains.
> notarization has been a net negative for all parties involved
Notarization made it significantly harder to cross-compile apps for macOS from linux, which means people have to buy a lot of macOS hardware to run in CI instead of just using their existing linux CI to build mac binaries.
You also need to pay $99/year to notarize.
As such, I believe it's resulted in profit for Apple, so at least one of the parties involved has had some benefit from this setup.
Frankly I think Apple should keep going, developer licenses should cost $99 + 15% of your app's profit each year, and notarization should be a pro feature that requires a macbook pro or a mac pro to unlock.
I have quite a few gripes about the app bundle structure from developing https://github.com/PeaceFounder/AppBundler.jl. The requirement (recommendation) to distribute shared libraries within the Frameworks folder, where each directory follows a strict structure, looks nice, but it's a hassle to bundle the application that way. I am now using a Libraries folder to bypass this requirement, which only surfaces during code signing.
My biggest issue, though, is Apple code signing. It’s already enough that a signature is attached to every binary, which seems wasteful. Why would anyone consider it better than keeping hashes of each file in one place and attaching the signature to them? Then there are entitlements, which are attached to the launcher binary when signed. Why couldn’t these just be stored in `Info.plist` or a separate file, instead of requiring this process?
And then there is notarisation, where at any point in the future, you might discover that your application bundle no longer passes, as requirements have become more stringent.
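For reference, you can inspect what a signature actually covers: non-executable resources are hashed into a single CodeResources file, while each Mach-O binary carries an embedded signature. Paths below assume a standard bundle layout:

    # Show the embedded signature of the main executable
    codesign --display --verbose=4 MyApp.app

    # Hashes of non-executable resources live in one plist inside the bundle
    plutil -p MyApp.app/Contents/_CodeSignature/CodeResources | head -n 40

    # Verify the whole bundle, including nested code
    codesign --verify --deep --strict --verbose=2 MyApp.app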
I joined the Apple Developer program to be able to sign/notarize my Tauri app, and I've been failing to get the app notarized for three weeks. I don't own a Mac, so I'm using GitHub Actions. Apparently notarization taking ages is very common the first time(s) an app is submitted. I've spent nearly $100 on GitHub and my app still isn't notarized.
I contacted support and they don't want to help because I'm not using a Mac and using a third party framework (Tauri), even though it's just using xcrun, Apple's tool...
Also I've been unable to even use the notarization API to retrieve the submission logs and Apple didn't help for that either so far (they just disregarded my ticket).
I feel powerless and abused. This is the worst DX/CX I've had in years.
As a side note, authenticating against the notarization API is a nightmare. You get a PKCS#8 key that you have to use to create and sign a JWT, and you're basically on your own... I had to build a little Node program just to craft the JWT...
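For anyone else stuck here: on a macOS runner, xcrun notarytool takes the App Store Connect .p8 key directly and handles the JWT and log retrieval for you. Roughly (the key ID, issuer UUID, and submission ID below are placeholders):

    # Submit using the App Store Connect API key; no hand-rolled JWT needed
    xcrun notarytool submit app.zip \
      --key AuthKey_ABC123.p8 --key-id ABC123 \
      --issuer 00000000-0000-0000-0000-000000000000 --wait

    # Fetch the detailed log for a given submission UUID
    xcrun notarytool log <submission-id> \
      --key AuthKey_ABC123.p8 --key-id ABC123 \
      --issuer 00000000-0000-0000-0000-000000000000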
Dude, go buy a used Mac mini for $150, sign your stuff with it, and move on.
We can talk all day about how this _shouldn’t_ be necessary, but you are tilting at windmills trying to get Apple signing to work without Apple hardware. You’ve definitely spent more of your time trying to make this work than if you’d just buy a cheap Mac mini.
That first os screenshot made my heart sink; a reminder of how far we've fallen.
How I wish our operating systems still looked like this. Utilitarian, useful. No rounded corners and bubbly icons, reducing the useful space more and more each year.
The incredible quality of Mac hardware is the only thing keeping me from jumping to a thinkpad / omarchy setup.
Rounded corners are a utilitarian feature. Human vision is based on edge detection and corners unnaturally activate it more than necessary. It's basically like being continually poked in the eye.
The link tells the story how Bill Atkinson sped up drawing primitives on early Apple devices.
It does not support the claim that corners are in any way special for human vision. I’m very skeptical on that. AFAIK motion is most easily perceptible.
Ah well, the evidence for my claim is that I just told you. This particular claim is not a Steve Jobs story, but he would agree I think.
I did tell a true and previously unreported Steve Jobs story on reddit the other day and was voted to -10 and someone told me I was off my meds. In conclusion, Steve Jobs is a land of contrasts.
> AFAIK motion is most easily perceptible.
That's how it works for predators, but you can see things that are still if you're focusing on them. It's important to see corners in real life because they actually can poke you. Like a paper cut.
> It does not support the claim that corners are in any way special for human vision
Your binocular field of vision is a round rect: https://openbooks.lib.msu.edu/neuroscience/chapter/vision-ce...
I'm no fan of modern macOS, but I don't think that screenshot is great. There's too many lines everywhere and too little color, making it unclear where to focus your eye.
(What I am a fan of is Leopard-era Aqua, which is reasonably information dense but uses depth and color to help focus your attention.)
Count the pixels! Percentagewise, the window decorations and menu bar in that screenshot take up a lot more space than the modern equivalent does at 5K. If you're comparing it to the current 14" MacBook Pro, it's closer, but the MacBook Pro still wins, and continues to hold its own even if the classic Mac is driving a 1280x1024 display. And this even though the MacBook Pro is the space-constrained pocket version! (Also note: you haven't even investigated the scaled display options yet.)
Disclaimer: I have a desktop Mac, and I'm assuming the pixel counts are the same for the laptops.
(The window corners weren't always round, but there was a bit of rounding to the screen corners there from day 1: https://infinitemac.org/ - this really struck me when I first saw it, coming from the Atari ST.)
I've been on a 13" MBA for years as my daily, and I'm convinced that the usable screen real estate in macOS 26 is a significant downgrade. The window bezels feel like they're 1cm thick, as if they're meant to be finger-accurate rather than mouse-accurate. Probably telegraphing what's to come
Oops, very good point - I should have specified I am still using macOS 15.
It's a good point, I wouldn't have thought about it in terms of percentages, but I'd like to actually make use of the extra screen real-estate.
Computers are not just for business and corporate use. Believe it or not, some people use their computers for fun stuff.
I do feel like we’ve gone too far from utilitarian though. I’d like to see more practical UI design.
They should be designed as tools, first and foremost, imo. Not toys, not content consumption terminals. There are dedicated devices for those purposes
Most of the information on the screen is useless 101101 filler; there's nothing utilitarian about it.
> When running a command tool in macOS, its Mach-O executable is launched by launchd, whose purpose is to run code.
That’s not launchd’s main purpose, and it's also not the path command-line tools go through. They're forked or spawned from your shell, as on any other UNIX system.
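You can check this yourself: a CLI tool's parent process is the shell that spawned it, not launchd (PID 1). A quick illustration; output will vary:

    # Start a background job from the shell, then look at its parent PID
    sleep 60 &
    ps -o pid,ppid,command -p $!   # PPID is the shell's PID, not launchd's PID 1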
As a side note, the NeXTSTEP bundle system was the inspiration for Java's JAR files.
jars are just zip files renamed
Inspired by how NeXTSTEP bundles work.
People keep missing that many of Java's ideas came from the OpenSTEP collaboration, before Java came to be.
https://cs.gmu.edu/~sean/stuff/java-objc.html
https://en.wikipedia.org/wiki/Distributed_Objects_Everywhere
I guess there’s a reason that Cocoa was called Cocoa… it’s also a hot beverage like Java, just sweeter ;)
also unlike java, cocoa doesn't cause jitters
JAR has additional structure to it, though it's mostly optional stuff, like metadata and code signing:
https://docs.oracle.com/en/java/javase/21/docs/specs/jar/jar...
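Easy to see with any zip tool; for a signed jar, the META-INF entries carry the manifest and signature files. A sketch with illustrative file names:

    # A jar is a zip archive; list its contents directly
    unzip -l myapp.jar
    #   META-INF/MANIFEST.MF        the jar manifest (metadata)
    #   META-INF/MYKEY.SF           signature file (only if the jar is signed)
    #   META-INF/MYKEY.RSA          signature block (only if the jar is signed)
    #   com/example/Main.class      the actual class files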
Don't forget to delete META-INF!
Wait until you learn what an iOS app's .ipa file is.
What’s that saying, there are 3 kinds of files?
zips, text files, and binary files
A bit of a sharpshoot here, but Power Mac applications for classic Mac OS, including fat binaries, put the PPC executable code in the data fork. This was also true for CFM-68K binaries. Only old school 68K code went in CODE resources.
The old classic Mac OS was a lot of fun; I remember using ResEdit on some apps at school.
Seeing bureaucracy explode like with the last diagram is rarely a good sign. As if I needed even more reasons to not "upgrade" to macOS 26.
Oh no my app bundle has more folders in it what else will macOS Tahoe do to my computer
Since those app bundle folders are merely the symptoms, this question isn't that unreasonable.
Nope, it is. This is a shallow question with no understanding behind why the additional folders were added, which was to bring a lot of functionality that used to litter your entire computer into a place where it is tied to the app that installed it. So the complaint is like looking at a trash can in the park and going “ew this garbage is ugly the park is going downhill” when the reality before that can existed was people threw the litter all over the park.
While this is the "standard" macOS App structure, it is not the only one that works.
IIRC, you can put stuff in arbitrary subfolders as long as you configure the RPATHs correctly. This works and passes notarization. I came across libname.dylib in the nonstandard location AppName.app/Contents/Libraries (not to be confused with /Library or the recommended Frameworks location). However, there are basically no benefits compared to using the recommended directory structure, and none of the 100+ macOS apps installed on my system have a Libraries directory.
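If you do go the nonstandard route, the moving pieces are the @rpath reference to the dylib, the executable's LC_RPATH entries, and re-signing afterwards. A rough sketch; bundle and identity names are placeholders:

    # Point the executable at the nonstandard directory inside the bundle
    install_name_tool -add_rpath "@executable_path/../Libraries" MyApp.app/Contents/MacOS/MyApp

    # Confirm the dependency is referenced via @rpath
    otool -L MyApp.app/Contents/MacOS/MyApp    # should list @rpath/libname.dylib

    # Editing a binary invalidates its signature, so re-sign before notarizing
    codesign --force --options runtime --sign "Developer ID Application: Example (TEAMID)" \
      MyApp.app/Contents/Libraries/libname.dylib MyApp.app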
AFAIK (and not technically relevant here), iOS is very strict about this when submitting to the App Store, and they're not at all clear about it either; I had some very confusing and frustrating errors with self-built frameworks containing dynamic libraries. You also seem to be forbidden from using .dylib and must use the .framework format.
It’s picked up on submission automatically and not at review, but is a completely undocumented requirement.
No wonder everyone is building web apps. Operating systems are doing their best to make themselves obsolete.
As someone coding since 1986, with experience in native and Web development, and despite earning most of my money in Web, I assert it is laziness.
There is no Web without operating systems, or do you imagine browsers run on pixie dust?
I’ll take this standardized directory format over the typical docker web app where there is no standardization and files can be strewn anywhere the developer wants. You `docker exec` into it and can’t find anything.
I think you meant everyone builds web apps because they want to target all platforms / hardware, they don't care about performance (CPU usage, memory usage, etc.), and they are easier to "deploy" in many respects.
Pros and cons to each. Not everything needs to be a native app. Some things SHOULD be native apps… I'm looking at you, Slack and friends.
I don't follow - can you elaborate?
If you have to static-link all your libraries to ship a Mac-native app, you're already doing most of the work of shipping a cross-platform web runtime like Electron.
Therefore it's not super surprising that successful products like Discord/Slack/Spotify gave up on a good native experience decades ago.
Why do you believe you need to static link to ship a Mac native app?
There’s no such requirement. Tons of Mac apps bundle dylibs within them.
The article clearly states that Apple provides standardized locations for apps to store their dynamically linked libraries.
One of the weirdest and most off-putting parts of macOS for me was that dyld isn't aware of that standardized location. A lot of curious oversights the more you pick at it.
What do you mean? I can just tell the linker to link against something in the shared cache and it finds it. It’s been as simple as `-framework <FrameworkName>`
I’ve never had to do extra work to find a system vendored dylib in my decades of supporting cross platform apps.
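For example (framework name arbitrary; no path is needed because the system resolves it from the shared cache):

    # Link against a system framework without specifying where it lives
    clang main.m -framework AppKit -o mytool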
They probably mean that they don’t like the way the “install name” (as it’s referred to) of a shared library is embedded in the library and then copied to whatever links the library, and is then used to find the library at runtime. I suspect they’d prefer to build the shared library with an arbitrary install name and then just have it found automatically by being in the Frameworks or Libraries directory.
Most platforms don’t have a concept of “install name” distinct from the library name; the value was originally the full path to the deployment location of the library, which was revised to support meta-paths (like `@rpath/LibraryName`) in Mac OS X 10.4 and 10.5 and is what the runtime dynamic loader (dyld) uses to match a library at load time. So an application’s executable can have a set of self-relative run path search paths, which is how it can load libraries from its Frameworks and Libraries directories.
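Concretely, the install name is baked in when the dylib is built, and the consumer records a run path to resolve it at load time. A minimal sketch, with arbitrary names and the flags passed through to ld:

    # Build the library with an @rpath-relative install name
    clang -dynamiclib foo.c -o libfoo.dylib -Wl,-install_name,@rpath/libfoo.dylib

    # Link the app and record where @rpath should be resolved at load time
    clang main.c -o MyApp -L. -lfoo -Wl,-rpath,@executable_path/../Frameworks

    # Inspect the results
    otool -D libfoo.dylib                    # prints the library's install name
    otool -l MyApp | grep -A2 LC_RPATH       # shows the recorded run path(s)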
Ah fair enough. But generally an rpath is pretty good to go out of the box.
The primary binary encodes run paths relative to the executable path, and any dylib loaded from it should (by default) be able to resolve relative to that.
I mean I had to manually patch the rpaths of macOS binaries distributed as an app bundle, because dyld didn't have the relative location of the shared libs in the app bundle in its search path. Not a huge deal, since patching rpaths was also part of the other Unix pipelines; the reasoning was just different. On other platforms we patched rpaths because we were distributing dependencies in the base directory of the application, which isn't the standard way to do things. On the Mac, it was because the dynamic linker wasn't aware of the app bundle structure for some reason, which is a weird disconnect between an OS standard and a basic system component.
Statically linking dependencies is a trivial change in a build script. Why are you acting like it's some esoteric forbidden art?