“If only they didn't put the power button on the bottom.”
While I think Apple was off their rocker on this particular decision, I do respect the org structure that allows this type of decision to occur. Believe me, there are companies where a dozen or more people would weigh in and prevent an unpopular choice. Consensus sometimes hinders a desired result (whether that result is good or bad).
It's a way of signaling how the product should be used. Plug it in, hit the power button, put it down, and never turn it off again. For many users that's probably the only time they will ever interact with that button (or want to).
I actually think it's a really good choice and shows Apple really understands design. And with the relatively low power consumption it makes sense. It's not like it's drawing a ton of power on idle.
I have a Mac Mini and can't remember the last time I had to manually press the button. IIRC it even reboots on its own after a power outage.
I think I've only shut it down once for an extended vacation (just to make sure appliances weren't on while I was gone) and when I switched apartments. Otherwise I could check and post my uptime from the command line.
It's a launch M1 mini, so I'd wager fewer presses of the power button than I have fingers on one hand.
> IIRC it even reboots on its own after a power outage.
While macOS presents a "Start up automatically after a power failure" option, and it works, the reality is…complicated.
Intel Macs require a model-specific hardware register to be set after each reboot:
https://web.archive.org/web/20230218203824/http://www.macfre...
The analogous setting on Apple Silicon Macs appears to be nonvolatile:
https://github.com/AsahiLinux/linux/blob/de1c5a8be0ee99602e4...
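(For anyone wanting to check what their own machine is currently set to, the same knob is also exposed through pmset. A minimal sketch in Python, assuming macOS's built-in pmset tool; the exact key name ("autorestart") and its availability can vary by model, so treat this as illustrative rather than authoritative:)

    import subprocess

    # Read the active power settings via macOS's built-in pmset and look for
    # the "autorestart" key (1 = start up again after a power failure).
    out = subprocess.run(["pmset", "-g"], capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        if "autorestart" in line:
            print(line.strip())
    # Changing it generally needs root: sudo pmset -a autorestart 1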
Apple Silicon devices turn on automatically from I/O input, even after being shut down, so the power button is only useful for forcing a shutdown when the machine is unresponsive, or for some sort of boot key combo to enter a recovery mode.
And if you use Bluetooth I/O (non-Apple). I do on my Mac Mini M2, and yet I've pressed that button maybe 3 times in the year I've been using it as my main machine, since I never power it off.
I'd never thought about it before reading this comment but I now realise I don't even know where the power button on my Mac Studio is. I used it once when I first set it up and haven't touched it since.
> I actually think it's a really good choice and shows Apple really understands design. And with the relatively low power consumption it makes sense. It's not like it's drawing a ton of power on idle
I use a Mac Mini (older model) in my music studio. It shares a surge protector with approx. $12k worth of audio gear (some of it nearly impossible to replace). I have all the gear + the surge protector switched off anytime I'm not using it. Which is most of the time.
While the weight and form factor would make powering the M4 Mini on a little more than a nuisance, I have a hard time lumping this into one of Apple's great design features.
M1 and newer Mac Minis automatically power on when plugged in/given power. If you're using an external power switch then that basically becomes the power button.
I'd still like the button to have been on the side or something over looks but it does seem like a pretty reasonable choice overall.
This is a setting in the control center; I'm not sure what the default is, though. You can definitely make it auto-boot when an external power switch is used, via that setting.
Even if it is rarely used, there is no benefit to making it hard to access. There is no harm in having an easy-to-access button that is rarely used.
I guess someone thinks the aesthetics are worth it, but even if the power button did notably harm aesthetics (which I doubt), I would take functionality over aesthetics any day.
If there were two models with different power button placements which one do you think people would buy?
Aesthetics can be very important because attractive things work better:
https://www.researchgate.net/publication/202165712_Emotion_D...
Apple could have found a way to put the button somewhere else and make it nearly invisible, but that's expensive and the Mac mini is clearly designed with cost in mind.
If you want cheap and functional, you're in luck because that's pretty much all anybody makes.
Apple makes it difficult to access because they want to make sure you don't use it often, as they believe the experience of waking up the computer from sleep is better than starting it up.
It's a conscious decision based not on design, but on UX, as with the Magic Mouse USB port.
You can’t possibly bring up Magic Mouse as an example of good UX.
Apple doesn't want you to use it while charging. They succeeded.
Them succeeding at stopping users from doing a normal computing task does not make it good UX.
When a company’s products go from “helping user do what he wants” to “coercing the user to do what company wants” then the company has lost its way.
Zero power draw is still less than a little power draw. A couple million of these babies running on idle is a considerable amount of power. Please, turn off devices when you're not using them.
Any modern computer system uses a lot of power for a few minutes after bootup. If you use the machine a few times per day you're wasting energy (and your own time) by turning it off instead of using sleep mode.
This is actually how I've used my M1 Macbook Pro since I got it. I never fully turn it off. It's either sleeping while plugged into my Thunderbolt 3 dock, or it's sleeping on my dining room table on battery power. The efficiency is so good it never dies even if I don't use it for a day.
My work machine is an M3 Macbook Pro. I put it to sleep on a Friday, and after a three-day weekend, it's still ready to go on Tuesday with 95% of battery left.
What's irritating is that a lot of Intel laptops used to be able to get pretty close to this, back when they supported legacy sleep states. I have yet to own a newer Intel laptop that can sleep for more than 24 hours without almost completely draining the battery.
> It's not like it's drawing a ton of power on idle
The power supply connected to the mains for sure does that.
I think it is really bad design. Perhaps it is necessary because of space constraints, and in that case understandable. But that is entirely different from good design, and I can't really buy the "use case" explanation.
Many people leave their devices on their desk, and Apple has always had problems with just letting devices turn off completely; there are regularly issues with it. And they do drain power on idle, which is a frequent complaint.
Yes, we are indeed that insane; we use a lot of Apple devices for business in some departments. MDM for phones and iPads is great for baseline administration, but the devices are eccentric, to say the least.
Exactly; it's more of a reset button. You should not need it a lot.
On a related note, the original Macintosh shipped with a physically inaccessible reset button, and the manual cautioned against installing the (bundled) switch that enabled access because "using it the wrong way could cause you to lose information":
https://archive.org/details/1984macintoshmanual/page/131/mod...
> It's not like it's drawing a ton of power on idle
Probably even drawing less than a "normal" PC PSU would just burn to heat in losses, lol. 3 watts of total idle power consumption, that's nuts how low it is...
Your average PC PSU hits up to 95% efficiency, so even at maximum efficiency at full load it would burn like 30 watts.
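Spelling out that arithmetic (purely illustrative; the 600 W full-load figure is my assumption, not from the comment above):

    P_{loss} = P_{out}\left(\tfrac{1}{\eta} - 1\right) = 600\,\mathrm{W} \times \left(\tfrac{1}{0.95} - 1\right) \approx 31.6\,\mathrm{W}

i.e. roughly ten times the mini's entire idle draw would be lost in the PSU alone at full load.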
The quoted efficiency on most PSUs is measured at around half load (more or less). Also, the total system draw figure does not include the power supply - it has its own losses, especially on low-end units, though efficiency is still likely in the 80s (percent).
After buying one, I actually like it. I know exactly where it is, and can reach for it by feel more easily; I could never tell you whether the power button was on the left or right side of the old Mini/Studio without checking each time.
It's also larger, more satisfyingly tactile/clicky, and concave compared to the old button (which was rounded into the outside curve and not particularly satisfying to press). I think the old one being so small and indistinct-feeling, and also so close to the cables, meant you would never try to reach for it blindly. You do have to lift the new one up a bit, but the device is so light you can do that with the same finger you're using to push the button (of course you need another finger to push the top of the mini _down_).
I think neither old nor new button were really meant to be used more than occasionally, since you typically wake your Mac from the keyboard, and both designs reflect that. I do sympathize that the new version could be less flexible in different mounting positions though.
(that said, I'd bet Jobs/Ive Apple would never have shipped this, unless the height underneath was exactly perfect for even the larger fingers to fit)
Jobs and Ive had their head scratchers. Like the Magic Mouse with the bottom charging port, or the Cube with all its cables coming in from the bottom.
The GP's same argument also applies to the Magic Mouse, as it happens:
> It's a way of signaling how the product should be used.
In the Magic Mouse's case, it came out just on the cusp of wireless mice becoming "a thing." Most people, if they were allowed, would have just left the mouse tethered to a computer by its charging cable at all times, since that's what they were used to. But Apple thought you'd be happier once you stopped doing that. So someone (Ive?) decided to make it so that you couldn't charge the Magic Mouse and use it at the same time. This did two things:
1. it forced people to try using the Magic Mouse without any cable connected, so that they would notice the added freedom a wireless mouse affords. It was a "push out of the nest."
2. it made charging annoying and flow-breaking enough that people would put it off as long as possible — which would make people realize that the Magic Mouse's battery lasted for weeks on a charge, and so you really never would need to interrupt your flow to charge; you'd just maybe leave it plugged in to charge when you leave work on a Friday night (and even then, only when it occurs to you), and that'd be it.
---
One could argue that the truly strange thing is that Apple has never changed this design, 15 years and one revision later. That's an entire human generation! Presumably people these days know that peripherals can be wireless and have long battery life.
But consider: Apple's flagship mousing peripheral — the one shown next to the Magic Keyboard in all product marketing photos — is the Magic Trackpad, not the Magic Mouse. The Magic Trackpad is the first-class option for multitouch interaction with macOS; some more-recent multitouch gestures don't even work on the Magic Mouse. (The Magic Mouse never got "3D touch", for one thing.) In other words, the Magic Mouse is basically a forgotten also-ran at this point — something just there on the wall in the Apple Store for those few people who can't stand the idea of using a desktop computer through a giant trackpad.
Which leads to an interesting question: what is the user-profile for the person who buys (or is bought) a Magic Mouse in 2024?
Well, probably one major user-profile is "your grandpa, a retiree from a publishing company, who's been using the same computer he brought home from work 20 years ago, until it broke last week — that computer being a Power Mac G5 with a Mighty Mouse; and who has never had a laptop, and so never learned to use a trackpad."
And if the Magic Mouse user is your grandpa... then said user probably does still need the cord-cutting lesson that the Magic Mouse "teaches"!
> it made charging annoying and flow-breaking enough that people would put it off as long as possible — which would make people realize that the Magic Mouse's battery lasted for weeks on a charge
At a certain point this just reads like Apple apologia. They made a mouse you can’t use while it’s charging as a means to advertise how long the battery lasts? What?
But it’s not an apology, it’s the right design decision. The battery charges to a usable amount extremely quickly, and if you could plug it in all the time most would, which defeats the point.
> if you could plug it in all the time most would, which defeats the point
The point of a mouse is to be a usable mouse. If folks care enough for it to be wireless then they can use it that way, but if they don't what's actually wrong with using it plugged in? Screams iPhone 4 era "holding it the wrong way". Baffles me why you'd want to provide fewer options for your customer to charge their wireless mouse in order to make them do it the "right way".
If you want to always drive a car with the parking brake on you can — it's your car — but if a driving instructor sees you doing it, they'll give you a demerit. Because you're massively hobbling the car vs. its design space.
> in order to make them do it the "right way".
To be clear, Apple likely didn't want to force people to always use the mouse that way; what they were likely aiming for was a "silent tutorial" — like the Super Mario Bros 1-1 "goombas hurt you, while mushrooms are something you want" thing.
It's just that, in a hardware product, there's no good way to force someone to do something a certain way the first time (in order to teach them), without forcing them to always do it that way.
I’m sorry but this is an absurd comparison. Driving a car with the handbrake on has an adverse effect on the primary purpose of the car. Using a wireless mouse with the wire attached still leaves you with an entirely functional mouse. Using it wirelessly is a preference. It is absurd to defend Apple forcing people to use it without a wire because it will “enforce design purpose”. If they need to do so then it’s the wrong purpose.
Why should Apple care if I want to leave the device plugged in all the time? How does this choice remotely affect them?
Same with this power button: why should Apple care whether or not I power off the device when I’m done using it and turn it back on in the morning? This all just seems like pointless behavior control.
> Why should Apple care if I want to leave the device plugged in all the time? How does this choice remotely affect them?
Because the wireless-ness of the mouse (while also being a macOS-compatible multi-touch surface) was the selling point / feature / Unique Selling Proposition of this mouse vs. other mice (and vs. the previous Apple Mighty Mouse.)
I don't know if you've ever had the opportunity to see many "normal" people's home-office desks, but I have — I worked as a call-out computer repair tech as a teen. And it taught me something: a lot of people have a really small or cluttered "mousing area" — often arranged in such a way that, for a wired mouse, the mouse's wire gets in the way of the mousing surface.
Picture, for example, an old 18"-deep sewing desk up against a wall, on which the user has placed their laptop [effectively permanently, as its battery is long dry]; with a bunch of other things like tiny little speakers and an inkjet printer competing for space on that tiny desk, such that there is only a 8"x8" square of free space to the right of a laptop. The user's mouse is then plugged into a USB-A port of the laptop that's also on the right [mouse cable is too short to plug it in on the left!], with the port being at about the center of the laptop's side. This mouse cable now "wants" to lay directly into the center of that clear 8"x8" square of space; and even if you bend it harshly, there's at least two inches of USB-A plug + cable strain-relief that will still be poking you in the hand.
(Why do they use a mouse at all, if they have a laptop, which presumably has a trackpad? Because trackpads on laptops — especially smaller/older/cheaper ones — can be ridiculously awful [tiny, laggy, insensitive, jumpy, etc], such that this cramped mousing experience is still better than the alternative.)
In such setups, "erasing" the mouse's tether to the computer is not just for aesthetics; it's a genuine ergonomic improvement that makes it "feel" better to use the computer.
And that means that any average cramped-desk person who buys one of these new-fangled wireless mice (or a computer that comes with one) — and actually does use it un-tethered — is going to become not only an advocate for wireless mice, but also likely an advocate of whatever brand of the mouse/computer was, due to the novelty-capture halo effect. (I.e. the "if you only date awful people, you'll become obsessive about the first romantic partner to be decent to you" effect. Decency [or wirelessness] isn't unique; but if you only know it from one place...)
That viral halo-effect-induced word-of-mouth brand-advocacy created by being at the vanguard of the Bluetooth wireless peripheral transition, is the potential upside that Apple saw when creating the Magic Mouse.
And it wouldn't be one they could capture, if they allowed sheer incuriosity to lead that average cramped-desk user to never even try the mouse without the charging cable attached (or, worse yet, if the Macs that shipped with Magic Mice were set up by people who didn't even know the mouse was supposed to be wireless — thinking instead that the mouse was just a wired mouse with a "modular" cable!)
---
Now, admittedly, Apple had many other ways they could have achieved the same goals.
For example, they could have just detected that you're using a Magic Mouse with one of their computers for the first time, and forced you through a little software tutorial that gets you to unplug it — and use it unplugged — for a bit.
I'm guessing they didn't go with that solution for several reasons:
• it goes against the marketing of Macs as being "ready to use for productivity out-of-the-box". Forcing you through a hand-holding tutorial isn't very "ready." (And mark my word, if there was a skip button, even the people most in need of that tutorial — especially those people — would skip it. People don't read manuals on frickin' home CPAP machines, and then die; you think they're reading that?)
• Apple loves thinking of themselves as a design company first and foremost. (Apple products are all stamped "designed in California" — that's what Apple does there, they design things.) And if you know anything about "design" as an academic discipline, you know it's all about figuring out how to shape products or information in ways that cause people to subconsciously/intuitively make certain choices. The core of Information Design is visual hierarchy — "organizing and formatting text to ensure someone glancing at a poster gets the most critical information before glancing away." The core of Industrial Design is the concept of affordances — "putting push-plates on the push side of a door and pull-bars on the pull side." Apple doesn't want to stop you and tell you how to use their stuff; Apple thinks they are clever enough to design their products such that they afford being used in exactly the intended way. And when the product's design "fights back" from having a positive affordance to idiomatic usage... they just design more forcibly, actively de-affordancing non-idiomatic usage.
• A tutorial that pops up on Macs doesn't help someone who wandered into an Apple Store; bought a Magic Mouse (a perfect "this store is too expensive for me, but I want to buy something" purchase in an Apple Store ca. 2009); went home, and promptly plugged it into... their Windows PC. Yes, people really do sometimes buy Mac peripherals and expect them to upgrade their Windows-using experience, not realizing that Windows doesn't have the particular set of multitouch gestures mentioned on the back of the box (especially not back in 2009.) The "hardware tutorial", meanwhile, is platform-neutral.
Thanks for the really really long reply... You've exhaustively gone over the selling points for a wireless mouse and why people would want and buy one. I don't think any of it is in question. It's great to have a mouse that can work without a cord connected. I bought a wireless mouse (not Apple's) because I agree with you about the selling points. What I don't get is why not also allow it to be used plugged in, if the user wants to, assuming it costs about the same to put the charging port on the front, and can be done without compromising the industrial design? Why deliberately make it useless while plugged in?
If you’re using it for gaming, it would be preferable to leave it plugged in to avoid the danger of the battery running empty.
Why not let the users use it as they see fit?
Please explain how it's in my best interest that I must use my peripherals wirelessly. The only wireless mouse I have ever owned is in my work bag, so I have one wherever I go, it's not for regular use and I have zero problems with mouse cables, for the actual 30th year this December.
Because it’s the design of the product? Every product is designed with a specific usage in mind. This is designed to be wireless, hence all of the ways in which it enforces and enables that. The battery lasts a very long time, so even in your work bag it should be fine (although are you then plugging into many different computers to associate it?)
If you want a corded mouse (and it sounds like that’s a better fit), there are plenty of options on the market.
The port on the bottom is really the least offensive element of the design. I know people find it fun to clown on, but if any of them had ever used one for 5 minutes they would realize it's a terrible mouse for a bunch of other more important reasons (weight, feet quality, tracking accuracy, polling rate etc.).
I use it at work exclusively. I love it due to the gesture controls and build quality.
The worst part about the Magic Mouse is just how small it is. It's uncomfortably small. The Magic Trackpad, however, is great.
Yeah, I hate that people go for the easy fodder, which barely affects real-world use, and ignore the multiple actual issues with it that would make it quite poor even if the charging port were fixed.
This is a really interesting view, and I have to admit this actually makes sense. Wireless mice definitely are nicer to use, and you can usually make them charge fast enough that a five minute charge while you take a short break is enough to get you through the day to a proper charge.
I must admit, in light of that logic I can totally buy placing the charge port like that solely to force users to use the mouse correctly.
I was thinking their version might be making the Apple logo itself the [overly] touch-sensitive power button, like the Cube.
I mind this design decision a hell of a lot less than the baffling deliberate decision to map EVERY KEYBOARD BUTTON to be equivalent to the power button (which the damn thing already has) on their laptop line-up.
I love it when my macbook is turned off and I accidentally nudge a single letter on the keyboard and it powers back on - not to mention when you're trying to clean it with a microfiber cloth.
For better or worse, I have a habit of clicking the touchpad or a few keys after I shut down my laptop. Just to make sure it's shut down properly. Back in Windows days with HDDs and hibernate, laptops sometimes took minutes to shut down completely, and I don't like closing the lid before shut down is complete.
Now, I end up restarting with that mere act, and have to long-press to shut down again because the shut down option won't show up on login screen.
> shut down option won't show up on login screen.
It does. At least I can see it on my personal MBA and work MBP, in the top-right corner.
Just lock the screen and you’re good to go for keyboard cleaning.
With a locked screen, key presses go to the password field. I have twice caused my user account to become disabled due to too many password attempts while cleaning my keyboard.
A pro tip from a Mac sysadmin who gets to clean a lot of filthy laptops: https://folivora.ai/keyboardcleantool
What's the equivalent of "locking the screen" for cleaning an Apple TV's paired remote?
Tell the Apple TV to restart and you have 15 seconds to clean the remote.
If you fully shut down a mac laptop, you have to press and hold the power button to turn it back on. Not sure what you’re talking about here and probably why you’re getting downvoted.
I don't know about every macbook, but I just tried this twice on my 2019 macbook pro and pressing any key on the keyboard (or at least the 2 keys I tried, "f" and "8") will power it on when it is powered off (yes, fully shut down, not asleep). Based on some quick googling, this still appears to be the case for M-series macbooks.
Since the introduction of M1 series, there is no permanent shutdown. Even after full shutdown, any keypress on keyboard will power up the system.
My M1 Macbook Pro turns on (after being fully shut down) by keypress.
That’s a pretty wild definition of “fully shut down” that manufacturers (not just Apple) are pushing. When my device is shut down, I expect it to be fully de-energized and drawing zero current. How can a keyboard action re-apply power if the button itself is not completing the power circuit?
This is one of the reasons I’ve started putting all of my devices on power strips with physical switches that de-energize the AC mains. You can’t even trust devices to power off when they say they are off.
The number of devices in your home that draw current when they are “off” is too damn high.
As a counter point, that same org structure resulted in the removal of the 3.5mm audio port. Ugh.
All Mac devices still come with a headphone jack - and they are even good for higher impedance headphones (I use a 32ohms DT770 on my Macbook/MacMini).
For mobile devices, removing the headphone jack was not well received, and it annoyed me too when it happened. Last year I made the switch to AirPods Pro, and I think I was the last person on earth to switch to BT headphones - never looking back. It's so much better not to have a cable to untangle.
I value flexibility in a product more than just about anything else. I will quite often choose the product with more features and use cases, even if it means paying a little extra money, just to have the _option_ to use a particular feature, even if I'm quite sure I won't use it on a daily basis.
Probably to gear up towards their bluetooth airpod series, which doesn’t need that port.
The wired ones are decent yet cheap, but if they had not removed the 3.5mm jack, then most people - except teens and hip adults - would opt for the wired ones.
My M1 Max Pro again has the 3.5mm port, and I have bought a pair of wired ones and gifted my airpods to my teen nephew.
and improving water resistance.
I have a lot of devices where I frequently accidentally hit the power / sleep button.
Keep it on the bottom where it's hard to hit accidentally.
You don’t press it very often and this makes it harder to press accidentally (eg putting stuff on top of the computer or a curious cat). I very rarely use the power button on a computer but maybe we behave differently.
That's what I was going to say. Do people still use these? Given the low power draw and the general stability (I often have 150-300 days of uptime on my MacBook M1), why not just put it to sleep and wake it up with the keyboard/mouse? I can't even remember the last time I actually rebooted my desktop; maybe last year, and I'm not even sure.
I'm the first to shit on Apple, but this sounds like a complete non-issue.
Just flip it upside down
Or just use it in Australia. Or Antarctica. Etc.
The Mac mini M4 performance is around 4-5x in DaVinci Resolve for me - compared to my HP laptop (i5-1135G7).
Rendering HDR video was around 12fps there on the i5 - the same project in the Mac mini gets 60fps.
The M4 10 core GPU seems on par or better with a mobile RTX3060(65W) for video tests (NR / Deflicker) so I'm also impressed about the M4's efficiency. A lot of power per Watt.
It's becoming a dedicated video rendering machine for me where all the SMB auto mounting issues with macOS seem solvable. Pretty happy so far with the base model price even in the EU. The power button placement is an annoyance for me, though.
When do you turn it off? I have a Mac M1 Studio and I just let it sleep. If things get weird I reboot. I think I recall using the power button about a year ago after returning from vacation after I had shut it down.
Right now I mount up to 7 HDDs to the Mac via SMB, have some Streamdeck / Pedal and the necessary external SSDs for fast storage connected. I will see if the SMB mounts come back OK after sleep (my laptop acts as server) but the Streamdeck and HDDs wake up randomly so overall it's easier to switch everything on and off depending on usage.
Stop complaining about the mini; you should complain at Streamdeck instead.
Like seriously, WTF are people turning it off for? It's 3 watts at idle, lol; most power supplies have that much phantom drain.
Everyone keeps citing idle, which is when the device is on and active but not particularly doing anything.
The standby power draw is 1W or less. I've used Mac Minis for years -- just replaced my M1 with an M4, though the M1 left me wanting for nothing -- and the number of times I've interacted with the power button is so negligible I imagine I've gone over a year without touching it. When I haven't touched it in a while it goes to standby, waking instantly when I engage it again.
Not everyone lives the same way. I am seriously considering a Mac Mini as my next upgrade yet I live in a RV and move frequently. Are there ways that I can keep the Mac mini powered while traveling.. sure, but why would/should I?
Are you not turning off entire circuits to reduce power draw when mobile? I’m actually thinking about one of these for my truck camper and its power draw seems fine, but the stumbling point for me is the additional power draw from the monitor it would require. I think I’m leaning toward an M4 MBP with nano textured screen for maximum power efficiency and ability to work outside when it’s nice, though I have not yet put much effort into researching efficient monitors
My EU mind is blown by these claims. Let’s take the lowest(1W) at sleep mode. With a thousand mac minis at sleep mode, that is already 1kW! In my country, a single person household’s yearly electricity package comes at 1400kW(+100 depending on provider) per year.
Note: intentionally keeping it simple, please don’t nitpick.
No household uses 1400kW, and kW/year doesn't make sense. Do you mean 1400kWh/year? That seems pretty low (NZ is 7000kWh/year), but if so, you're comparing power to energy, which doesn't mean much. 1W 24/7 < 9kWh/year, which is pretty small.
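Spelling out that arithmetic:

    1\,\mathrm{W} \times 24\,\mathrm{h/day} \times 365\,\mathrm{days} = 8.76\,\mathrm{kWh/yr}

which is about 0.13% of the 7000 kWh/yr household figure above.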
Personal guess from a fellow European citizen: I think they meant to say 1700 kWh/year. According to most German power utilities, the average 2-person household consumes about 2400 kWh/year.
It’s not clear what your point is because you’ve made a strong argument for it being negligible.
i5-1135G7 is from about 4 years ago. If you look at the latest offerings from Intel/AMD the gap should be fairly small.
> The power button placement is an annoyance for me, though.
Keep the thing upside down.
Not joking.
Feel like that risks overheating...
Not really. Unlike previous mac mini models, the grille on the underside is both the air intake and exhaust. If anything it ought to be better upside-down, since convection now helps the heated air rise out.
Edit: any downvoters care to elaborate?
It would make it fill up with dust, even while idle. Also, any fluid spill would be much more likely to cause damage.
The normal orientation is fine for most people who want things to be as simple as possible (i.e., most Mac users). There is very little reason to ever turn it off. If you still do that frequently for some reason, just leave it in a location where the button is still easy to access.
The case is also going to radiate heat, and turning it upside down will make that less efficient. The base won't radiate heat in the same way, due to being plastic and due to its already being used to pass air through.
The case isn't thermally connected to the SoC in any direct way, so while I'm sure it does radiate some heat, I think it's pretty negligible. The PSU sits at the "top" (in regular orientation), and I don't think it runs much of a risk of overheating.
> Keep the thing upside down.
Or maybe on its side? :)
Someone has already released a design you can 3d print that mounts the Mini on its side and makes it resemble the cheese grater Mac Pro.
> 3D-printed Mac Mini enclosure makes the tiny PC look like the world's cutest Mac Pro
https://www.tomshardware.com/desktops/3d-printed-mac-mini-en...
Yeah it seems like this would also increase airflow. Is there any possible issue with just having it upside down all the time?
> Is there any possible issue with just having it upside down all the time?
You might scratch the top, maybe? Nothing that a small piece of cloth wouldn't prevent, anyway.
Isn't the case the heat sink? It might dissipate heat worse when lying upside down, as the table would get hot.
Have you tried 8K video? How does each machine handle that?
I just arranged a selection of 4K H.264/H.265 clips in a 2x2 grid on a 8K timeline in DaVinci Resolve.
Playback works well - up to 60fps. However, export to H.265 creates a lot of Swap. Rendering went with 15-18fps. All videos on a SMB network drive but the GPU was the bottleneck for rendering.
Swap was even around 24GB with 5 videos which I tested first. Using 4x4K it went 9 GB before stabilizing at around 2GB. No effects or grading whatsoever - plain 4K60 SDR videos.
One single SDR 4K clip renders to 8K at 25fps. Using Superscale 2x makes that 0.5-1fps.
For 8K rendering you may be better off with 32GB RAM minimum or trying the M4 Pro model maybe with 24GB. For 4K/6K editing the base 16/256 M4 Mac mini seems sufficient when all video storage will be on external drives or network.
Edit: added single 4k->8K rendering performance
> The Mac mini M4 performance is around 4-5x in DaVinci Resolve for me - compared to my HP laptop (i5-1135G7).
You could pick a variety of non-Apple CPUs that easily deliver 4-5x the performance of an 11th-gen i5. Maybe don't be disingenuous; compare the M4 to a more recent CPU like the i5-14600K, which is also 4x the performance. I'm not comparing on power efficiency, since that was not mentioned at all as part of your comment.
Is it 4x the performance?
Passmark shows 38,951 / 4,282 versus 24,724 / 4,555:
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i5-14600...
https://www.cpubenchmark.net/cpu.php?cpu=Apple+M4+10+Core&id...
So i5-14600K is 1.57x on multi-core, slightly worse on single-core. $235 for the CPU versus $599 for a whole system. Could maybe match the total price, but Intel won't be able to come anywhere close on the power efficiency.
The 120GB/s memory bandwidth of the unified memory helps especially with video, I guess. The M4 CPU isn't really stressed most of the time; it only maxes out on multicam and HLG conversions.
I once patched my old Dell T1650 BIOS for ReBAR support, yet the iGPU of the i5-1135G7 had similar GPU performance for video as the Intel Arc A380 in that desktop PC; the old PCIe 3 speed limited its performance. I've heard others report a smoother playback experience with Apple Silicon compared to even an RTX 4090.
I get some delays when fast-scrubbing through a 9-angle multicam 720p timeline with just 360p proxies. Still, I'm impressed compared to what I was used to. Video editors may be surprised by the performance for the price.
How did you patch the Dell T1650 BIOS for ReBAR support?
There is a Github project [1] which has detailed instructions. The ancient i5-3570 only allowed 2GB of ReBAR, BTW. GPU-Z says ReBAR / Above 4G decoding is activated and working; the Intel Arc driver does not see it but seems to use it. Some part of the BIOS had to be manually fixed, AFAIR.
The PC was given to me for free and the CPU cost €11, yet overall I wouldn't recommend the process just for the result. There's only a little benefit, if any, though it was fun. On that occasion I also added an NVMe driver, which works well, as demonstrated for the similar Dell OptiPlex [2][3].
[1] https://github.com/xcuri0/rebaruefi
[2] https://www.tachytelic.net/2021/12/dell-optiplex-7010-pcie-n...
[3] https://github.com/jrdoughty/Dell-7010-rebar-guide
Edit: some wording
4x vs. the old i5, not the M4. They are trying to say that comparing to a CPU released four years ago is pointless because the newer CPU is obviously much better.
It's not disingenuous to do a real-world comparison to a system you already own when stating the specs. It's actually much more useful to hear these real world anecdotes than to look at geekbench numbers.
Thank you!
I expected a fast M4 package but still was mind blown to see the video editing performance. After all these video renders run for many hours.
My 2-year-old i5 laptop - even with 64GB RAM and 2x2TB SSD upgrades - was around the same price as the base M4 Mac mini and uses similar power. The PC surely is way more versatile with these specs and expandability.
Staying mostly in X86-land due to affordable RAM & storage, nothing I currently have comes close to the M4 performance per Watt - and now even performance per $/€ - in my video-editing use case.
It's comparing apples to oranges. If you want to compare computers, compare a macbook with an old i5 to your laptop with an old i5. Comparing an M4 to an old i5 is just silly. Of course it's going to be faster.
It's a comparison of two CPU/iGPU combos I have on my desk with similar power draw. Those iGPUs are most power efficient for video editing as I like QuickSync from Intel.
The i5-1135G7 (17W TDP) has 2 Media Engines which I use for proxy generation in parallel for example and pretty versatile so I use it daily (64GB RAM..).
Still, I think it's a notable achievement to get 4x performance with the M4 for video at similar wattage of the i5. I don't have an M4 MacBook but I guess the M4 would perform similar to the one in the Mac mini.
M4 Mac Mini with 16GB RAM is doing a "good enough" job of editing 6k raw footage in Premiere for my team. I'm surprised to say I'm content with the 16GB of ram so far.
Edit: This is in contrast to my M1 Macbook Air with 16GB of ram which would stutter a lot during color grading. So definitely feeling the improvement.
MacBook Airs thermal throttle; that's why the Air with 16GB is an issue and the mini isn't. No fans = throttling at heavy loads. It's not a RAM issue.
I bought the first MacBook Air M1 with 8GB because it was the only option available in my area. Initially, I had doubts, especially after using notebooks with more than 16GB of RAM in previous years. But I was genuinely surprised by how well the M1 performed. My takeaway is that there’s a lot of room for similar improvements in Linux!
I have an m1 as well.
And while I'm broadly satisfied with its performance, I do think that the SSD is probably carrying some of that load. And for a machine that often gets used far longer than a PC, I can't see that being great for longevity.
> And while I'm broadly satisfied with its performance, I do think that the SSD is probably carrying some of that load. And for a machine that often gets used far longer than a PC, I can't see that being great for longevity.
This isn't the early 2010s anymore - SSDs last "long enough" for most people, to the point they are no more consumable than your motherboard or your RAM. (I've actually experienced more RAM failures than SSD failures, but that's an individual opinion here.)
And for the downvoters - do you remember the last time you handed in your Steam Deck, Nintendo Switch, iPhone, or even laptop specifically for a random SSD failure, unrelated to water damage or another external cause? Me neither.
people really don't grasp that the slowest SSDs are still 3-5x faster than the fastest HDDs (including SAS drives. Yes, the dualport kind).
And, Looking at the anandtech review of a vertex 3 way back in 2011...
I'm still very happy with my 8GB Air M1 as well. It's incredible how well it still works for a 4 year old entry level laptop. I see all these new M's come out, and I'm sure they're fantastic, but I'm not at all tempted to upgrade.
Yeah, I don’t know why 8gb base models get so much hate online. 8gb is 64 billion bits of memory. If you’re writing everyday software and you need more memory than that, you’re almost certainly doing something wrong.
Seriously?
What if you want to have a few browser tabs and a spreadsheet open? Or containers?
My M1 routinely sits at around 22GB of RAM used.
How much RAM should a few browser tabs and a spreadsheet use? Spreadsheets and webpages were both invented at a time when computers had orders of magnitude less ram than they do today. And yet, Excel and Netscape navigator still worked fine. It seems to me that bigger computers have caused chrome to use more memory.
If 16gb is considered to be a "bare minimum" for RAM, well, how much ram will all those programs use next year? Or in 10 years?
That doesn't help you right now, but 22gb is ridiculous for a few browser tabs and a spreadsheet.
It’s not just for tabs and spreadsheets, I also have an ide, containers, etc.
I do think the memory footprint of many applications has gotten out of hand, but I am more than willing to spend the extra money not to have to think about it.
> If 16gb is considered to be a "bare minimum" for RAM, well, how much ram will all those programs use next year? Or in 10 years?
16gb is the figure for the next 10 years. If you see yourself being content with 8gb of memory shared between your CPU and GPU in 2030, you must have a uniquely passive use-case.
I remember when people said 4gb doesn't need to be the minimum for all Macbooks. Eventually MacOS started consuming 4gb of memory with nothing open. Give Apple a few years to be insecure about the whole AI thing and they'll prove to you why they bumped the minimum spec. Trust me.
I also use an 8GB M1. It has firefox with many tabs & windows open in OSX and also a Linux VM in UTM which is running VSCode, vite, and another firefox with lots of tabs. It's performing well! (although swap is currently at 2.3GB, and there's a further 3.5GB of compressed data in RAM)
Using Chrome I'd guess?
I wish the same could be said of the Studio Display, which is quite power hungry. If the Mac is running then the display is using minimum 10 Watts of continuous power usage at all times, fan running, with the screen off.
I guess it takes 10 Watts to maintain the Thunderbolt controller, USB hub, A13 processor, and run the fan.
Power usage does drop to <1 Watt when the Mac is actually sleeping, unless anything is plugged into the USB hub. Even an empty iPhone cable will cause the display to draw 5 Watts. It's disappointing.
Agreed. I’m very impressed with it as a general productivity monitor, but it’s a power hungry monitor. Kind of like an SUV lol.
The built in mics and speakers are fantastic, 5k is great, the webcam is meh.
Also interesting, the M4 Mini has the flash storage on a replaceable module, instead of being soldered to the motherboard, although the NVMe controller is still integrated into the SoC.
iFixIt and others have already posted videos showing that the flash storage is now upgradable.
> M4 Mac mini Teardown - UPGRADABLE SSD, Powerful, and TINY
"Upgradeable" is too big of a word here, specially considering that they're using different form-factors even between the models released on the same year (e.g. pro vs non-pro) ; and also different from models released on the previous year (e.g. studio). This almost certainly means that next year's model will also use a different interface, so you won't be able to upgrade your storage at all.
You might be able to. You just need to make sure you get a compatible module somehow.
I wonder if 3rd parties will start selling them. If the memory controller is in the cpu, there’s no reason for the little board housing the ssd to have any proprietary chips…
If you can get a 2TB chip for ~50% of what it retails at Apple the M4 mini would be the absolutely 100% totally best computer ever made.
At the moment it's only the best computer ever made, but too expensive if you want any sort of storage.
Depending on your level of price sensitivity, you can always use a Thunderbolt SSD, an external RAID array of SSDs, or just get the 10 Gigabit Ethernet upgrade and hook into local NAS.
The M4 Pro Minis support higher capacity modules, so it's not too shocking that they are not identical.
We've already seen videos from the usual suspects showing that people who are sufficiently skilled with a soldering iron can replace the flash chips in the modules with higher capacity chips, in addition to replacing the whole module.
> We've already seen videos from the usual suspects showing that people who are sufficiently skilled with a soldering iron can replace the flash chips
Sure, but I don't think soldering skills is exactly what I'd have in mind when I think "upgradeable".
As noted above, you can simply replace that module with a higher capacity module with just a screwdriver, as iFixIt did.
However, there is a real opportunity for those who do have soldering skills to make a quick buck here.
You could pretty easily buy the cheap base model M4 and resell it as a custom upgrade build, as long as you were clear that the SSD was no longer stock.
The problem is getting the module...
It's not an easy solder job and they are picky about what NANDs they work with and how they're configured. It's better than soldered to the board but not by much.
It comes free with the base model.
Given the prices Apple charges for the high-end upgrades, it's worth your time to buy the base model, do the upgrade, and sell the upgraded unit.
For instance, here's a Mac Studio being upgraded to 8 TB.
Sorry, but again this is an abuse of the word upgradeable.
You could do the same with many other laptops and even some phones out there - buy from a 3rd party who has resoldered the corresponding parts. The replaceable modules brought you nothing.
Even the economic motive you mention is actually just because Apple overcharges for storage, and has nothing to do with the replaceable modules. As long as there is no cheap 3rd party source of these, a replacement module ecosystem makes no economic sense (someone will always have to bring in his device for resoldering, or you lose the price of one good working base model).
Someone actually doing the upgrade: https://youtu.be/cJPXLE9uPr8?si=IyTLwG9SC4r4dJXP
The controller being decoupled is an extremely interesting idea! Makes a lot of sense. I wonder if it includes a (nontrivial) cache.
Apple bought a company that designed enterprise SSD controllers over a decade ago.
> Anobit appears to be applying a lot of signal processing techniques in addition to ECC to address the issue of NAND reliability and data retention. In its patents there are mentions of periodically refreshing cells whose voltages may have drifted, exploiting some of the behaviors of adjacent cells and generally trying to deal with the things that happen to NAND once it's been worn considerably.
https://www.anandtech.com/show/5258/apple-acquires-anobit-br...
Seems likely it's a cacheless (well, host memory buffer) design like the ones Apple uses in their other designs.
I think the reason to make it replaceable/removable is to reduce e-waste at EOL. Lots of companies have policies requiring data storage in decommissioned computers to be physically destroyed, so making it replaceable allows the machines to be repurposed afterwards.
"Upgradable" if you are willing to desolder and replace the BGA chips.
You can't just swap the module out.
The iFixIt video above literally shows them swapping the module out.
The storage is not soldered to the M4 mini SOC.
M4 Macs only accept SSDs of the exact same make and model as the one installed at the factory.
So to expand storage in an M4 Mac, you indeed need to desolder the flash chips and solder in new ones.
The video at the base of this thread has iFixIt take the 500GB SSD from one Mac Mini and swap it with the 250GB SSD from another, and both recognized and worked with the replacement.
It is a swappable part. Which means much more attainable servicing for flash failure or exhaustion, and possibly even upgrading storage in the future.
What stops someone from selling replacement cards?
Not wanting extensive interactions with Apple's legal team. Charging $400 for $10 of storage means they have a lot of money to harass you with very well-paid lawyers, even if you are in the right.
There have been companies selling hardware upgrades for Macs that weren't designed to be upgraded for many decades now.
No, it is soldered to the storage module. You have to desolder the flash chips from that module and replace them. You can't just order a bigger storage module from Apple (or anyone else) and plug it in.
I don’t think anything is stopping you buying a second hand / 3rd party module online. It just needs to be physically compatible with your particular generation of hardware.
I don't think this is true. If you watch the videos, dosdude1 specifically says he had to order blank NANDs for this process. Then you DFU restore the system from another mac. I have no proof, but I assume part of this DFU restore process is the new NAND chips being hardware paired in some way.
Again I have no proof, but there must be reasons he claims they have to be blank NANDs
> there must be reasons he claims they have to be blank NANDs
If you are building new SSDs, it makes sense to be certain that used NAND chips aren't slipping into your supply chain.
They do have a limited number of write cycles that they are good for, after all.
Watch the video again. He emphatically says they have to be blank or it won't work
Well, someone has to build the 3rd party module
Apple's ARM chips have been incredibly efficient for a while now.
My M3 Air draws around 3W on average, and that's with a 14-inch screen running at around 40% brightness. Impressive stuff. Passive cooling too!
M3 airs come in 14” or did you mean m3 pro?
To best of my knowledge, airs come in 13” and 15”.
I can’t help but wish that Apple would provide the handful of features needed to make a Mac mini into a competent home server.
Maybe they already have, depending on what you need. Settings >> General >> Sharing provides lots of options. "Remote Login" is SSH and SFTP, and last time I used it, "File Sharing" was SMB. "Screen Sharing" and "Remote Management" seem useful, too. I assume that "Media Sharing" is supposed to allow iTunes on your network to see media files, although I've never used it and the information on the dialog is limited.
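(For what it's worth, some of those toggles are also scriptable. A minimal sketch for "Remote Login" (SSH/SFTP), assuming macOS's built-in systemsetup tool, which typically wants sudo; treat it as illustrative rather than the official way to do this:)

    import subprocess

    # Query whether "Remote Login" (SSH/SFTP) is enabled -- the same toggle as
    # Settings >> General >> Sharing >> Remote Login. systemsetup usually needs root.
    def remote_login_enabled() -> bool:
        out = subprocess.run(
            ["systemsetup", "-getremotelogin"],
            capture_output=True, text=True, check=True,
        ).stdout
        return "On" in out

    def enable_remote_login() -> None:
        # Equivalent to ticking the "Remote Login" checkbox in the Sharing pane.
        subprocess.run(["sudo", "systemsetup", "-setremotelogin", "on"], check=True)

    if __name__ == "__main__":
        print("Remote Login (SSH) enabled:", remote_login_enabled())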
Can you run it headless? Like, if I have it power on after a reboot, is it possible to log in remotely?
Yes, but getting it to work requires that you both:
(a) disable FileVault, and (b) enable automatic login
One option is to automatically log in to an account which has very little access, and have everything sensitive on an encrypted disk/partition, and to use a separate keychain for any credentials you want to protect.
This and more suggestions here from a company using Macs as build servers: https://forums.developer.apple.com/forums/thread/737381?answ...
I don't like the idea of enabling automatic login on any machine, so I keep FileVault on and just accept that any rebooted Macs will need physical access on restart.
If it's possible somehow to get screen-sharing access (or even SSH) without automatic login after a reboot, I'm sure lots of users would love to know how.
My old mini ran just fine only accessing it over remote desktop. I assume the new ones will act the same.
Yes it supports remote admin via terminal or virtual desktop. I currently use mine like this for transcoding while freeing up my macbook.
With ssh, most certainly.
Also, if you enable desktop sharing via VNC, it will also work at the GUI login screen.
You can enable SSH to do this
i’ve found it pretty easy to run my “homelab” with docker compose. Traefik binds to port 80 and 443, and all my apps are accessible behind the proxy.
Docker desktop can be configured to start on login. For keeping the mac awake “forever”, i’d suggest the Amphetamine app.
I also appreciate that you can easily use the macOS screen sharing app to login and manage the mac from a laptop.
That certainly works, but Docker will use a Linux VM, right?
yes, unfortunately. Doesn’t really matter for the web apps i’m hosting, but I could see it being an issue for certain apps/workloads!
I suppose you could try running Asahi Linux in the future? I think it only works on M1 and M2 Macs at the moment, but don't quote me on that.
Orbstack might improve energy consumption for your setup, they are applying some clever hacks.
Yes unfortunately there’s no Linux that runs on the metal on an M4 yet.
Yes, docker desktop uses a Linux VM to run the containers.
What specific feature gaps would you like to see them address?
Linux support. MacOS is a desktop-first, GUI-based operating system. Linux, on the other hand, is a server-first, CLI/terminal-based operating system. Everything server-related is designed to run on Linux first and foremost, and may or may not incidentally also run on MacOS.
If they work on a BSD they should work okay on macOS. (Not because macOS is exactly like FreeBSD, just that it means the project has been tested cross-platform.)
run it in a VM.
macOS is explicitly designed to not be a server, and the consumer hardware it runs on is also designed that way. Apple even discontinued the Server tools that you could buy on the App Store that used to be called Mac OS X Server.
If you want to run Linux server apps, you should run Linux. Because Apple hardware and macOS isn't giving you any advantages over a generic piece of hardware running a Linux distribution. The hardware costs more and is less upgradable than off-the-shelf hardware.
Servers should not run desktop environments because they are a waste of resources and widen the attack surface due to having more components installed and running.
And even if you want a desktop environment for your Linux server, Linux most certainly has a wide selection of mature stable desktop environments.
If you need to do development work or just achieve the goal of running Linux applications on a Mac, that can be easily done via virtual machines, containers, etc.
I'd like to replace my NAS using a mini - but Apple segment the market on disk.
A "dumb" NAS 2.5" SSD drive array plugged into one via ~~firewire~~, and then out to the network via the Mac Mini would work.
edit: thunderbolt!
FireWire?
Once I have some more disposable income I plan to buy a Thunderbolt RAID array and a mini. FireWire hasn’t been on Macs for at least a decade.
Apple’s internal storage pricing is absurd but you wouldn’t plan to use a NUC or a Raspberry Pi SOC’s onboard storage for a NAS anyways.
hah! meant thunderbolt :)
> Thunderbolt RAID array
this is interesting:
Official Linux support would help, is anyone running MacOS on a server?
This isn't the market for MacMinis though. Why are people on this forum so bad at understanding market segmentation? Apple made an incredible desktop machine that happens to work pretty damn well as a server if you poke around.
This machine is for people at home editing video. It's great in the field for production, where it goes from pelican case to hotel desk to folding table to pelican case to cargo hold to storage.
Have you ever thought that maybe people understand "market segmentation", but at the same time, they'd like to know how broad a range of computing options one would have on these general purpose computers, with price tags in the many-hundreds to thousands range?
Sure, but to complain that a Mac - which, come on, has been a known quantity for 20 years at this point - doesn't run Linux is just looking to complain. If you want more options there are endless x86 choices, and if you want ARM then demand better from other manufacturers as well. Apple showed it's possible; why doesn't Dell come out with something comparable? I'm not a fanboy, I run systems of all stripes, but Macs aren't designed to be servers (even though they operate perfectly well as one) and people need to stop complaining that they aren't.
Oh, and there's Asahi, which does run on Macs (not the M4 yet, but it'll come).
In the past, Apple sold at least four generations of the Mac mini that included models literally branded as server models. Continued interest in using more recent models as servers is quite reasonable.
If running native ports of server software isn't your cup of tea, you can run Linux containers on macos.
In the full GUI MacOS install? And the Linux container (I’m assuming you mean container like docker or podman?) would run in a Linux VM?
I run full multiple Ubuntu desktop VMs on Parallels on a M1 MacBook Air. You can use Docker for server installs, sure, but QEMU also works great on Macs and with Rosetta you can even get pretty damn close to native x86 execution speeds.
They run through virtualization, which is clunky to interface with across boundaries and introduces overhead. I also don't think it has any hardware acceleration for things that would benefit from using the GPU.
I have a Mac mini backed by an Areca 24-bay Thunderbolt disk-array in the rack in the garage. Works like a dream.
Sorry not knowledgeable about this but do you use this as NAS? What software do you use?
MacOS has built-in file sharing via SMB. It also has built-in VNC for graphically administering the server, built-in ssh/sftp, built-in rsync for backup, etc. etc.
Basically I just use the OS.
I see, thank you. I dream of setting up something like iCloud, but with open source software and hosted at home :) Not sure if there is anything like that out there.
nextcloud.com comes to mind.
Is that something that can be setup on Mac OS? Or do I need to install Linux on Mac Mini?
I’ve never tried turning a Mac into a home server. What features do you need that it’s missing?
Depends what 'home server' means.
MacOS would need syncookies to be a viable tcp server on public IPs, IMHO, but MacOS pulled FreeBSD's TCP stack a couple months before syncookies were added, and they never rebased or otherwise added syncookies later.
I haven't looked into if they pulled any scalability updates over the years, but I kind of assume they haven't, and the stack would have a lot of lock contention if you had more than say 10,000 tcp sockets.
Given that, if I were Apple compatible, I might run a mini as a LAN server, but my home servers provide services for the LAN as well as some public services (of limited value and usefulness, but still public).
The network stack is very different now - BSD doesn't run on cell phones after all. But no syncookies, no.
But IMO the real advantage of ARMv8 for a server is that it has better security.
I don't really think ARMv8 has anything useful to provide here?
It's all in there, it's just optional.
Plus it doesn't have variable length instructions.
Perhaps they would be good execution nodes if not good endpoints.
Is this something that you can fix by putting the server behind Cloudflare? I assume most "home server" users would do that (or a similar service provided by Apple if they go down that route).
Well, Cloudflare is kind of spendy if you want them to proxy non-http traffic.
If you put a proxy in front, and you're careful to only allow inbound connections from the proxy, you should be ok though.
Funny, I've been using Mac Minis as servers for more than a decade.
What would those be?
Such as?
what else do you need?
What I look for is: 128 GB RAM minimum, a decent number of PCIe lanes because I want two fast NVMe drives, an HBA card (though I guess this could be external), two network ports minimum, ZFS, a sane terminal, native support for containers and VMs, native support for UPS interfacing, and native support for backup of containers and VMs. And lastly, a community of other users doing the same.
Dual power supplies is a nice to have.
So you are sad that the Mac mini isn't a bog standard HPE/Dell/etc Server?
Boy will you be pissed when you find out about laptops, phones, game consoles and basically every other compute form factor.
Might want to look into rackable Mac Pros though!
Kinda unrelated, but with all this amazing efficiency, I wish Apple would re-introduce the one feature that made me truly love my old MacBook.
On my ~2010 MacBook Pro they had a series of small green lights on the chassis that acted as a battery indicator. When the laptop went to sleep, it would take on a slow, breathing-like animation. It was beautifully done. I was sad when it was removed.
Please bring this back.
The M4 has 6 efficiency cores and 4 performance cores. That is 2 efficiency cores more than the previous generations, with the same number of p-cores, hence a higher e-to-p core ratio, which can explain a large part of the increase in power efficiency. Not to say that otherwise there is nothing remarkable here, of course there is, but if the author found a 30% increase in efficiency compared to the M2 while they claim they expected 4-10% after 2 generations of chips, it could be because of that.
The M4 Pro has 4 e-cores and 8-10 p-cores, hence I would not expect a similar increase.
IIRC as well, the M3 chips had an odd efficient to performance core ratio that’s not present in the M4, which is slightly interesting
That would explain improving performance, but I don't actually understand how that would improve efficiency. Particularly at the high end where they quote 6.74 Gflops/W .
If a higher proportion of the total performance is coming from efficiency cores, it is reasonable to expect overall efficiency to be improved.
Exactly this ^
In particular, under some idealized, unrealistic assumptions to simplify things, denote by n the number of e-cores, E the efficiency and W_e the power consumption of each e-core, and by m the number of p-cores, P the efficiency and W_p the power consumption of each p-core, with E > P. Then we can express the overall efficiency EFF of the CPU as
EFF = (n*E*W_e + m*P*W_p) / (n*W_e + m*W_p)
    = (n*E*W_e - n*P*W_e + n*P*W_e + m*P*W_p) / (n*W_e + m*W_p)
    = (E-P)*n*W_e / (n*W_e + m*W_p) + P
    = (E-P) / (1 + (m/n)*(W_p/W_e)) + P
which shows that the energy efficiency of the CPU increases monotonically with the ratio of e-cores to p-cores, n/m.
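For a rough feel of what that formula predicts, here is a tiny numeric sketch. Only the core counts (4 e-cores on the M2, 6 on the M4, 4 p-cores on both) come from this thread; the per-core efficiency and power numbers are invented purely for illustration.

```python
def eff(n, m, E=10.0, P=6.0, W_e=1.0, W_p=4.0):
    """Overall perf-per-watt of a CPU with n e-cores and m p-cores.

    E, P     -- perf-per-watt of one e-core / p-core (made-up values)
    W_e, W_p -- power draw of one e-core / p-core (made-up values)
    """
    return (n * E * W_e + m * P * W_p) / (n * W_e + m * W_p)

# Core counts from the thread; every other number above is hypothetical.
print(f"M2-like, 4E+4P: {eff(4, 4):.2f} perf/W")  # ~6.80
print(f"M4-like, 6E+4P: {eff(6, 4):.2f} perf/W")  # ~7.09, higher e:p ratio wins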
It is amazing to see the drop in idle power consumption over the years for the Mac mini: https://support.apple.com/en-us/103253
From 32 W for the PowerPC models down to 5 W for the current Apple Silicon ones.
The Studio being the GPU-centric model, I can't wait for all these M4 Mini performance improvements to make it to the Studio line.
Are there any benchmarks for these chips doing regular 'data-sciency' CPU grunt work? Dataframe wrangling, inverting matrices, doing large matrix factorisations, fitting decision trees, etc.?
I'm very keen on one of these, but I simply have no idea how good they are at my day to day tasks in R or Python.
It depends. If you're using Python with numpy>=2.0.0 (and macOS>=14) then you should benefit greatly from Apple's Accelerate implementation of BLAS/LAPACK routines which are behind most linear algebra operations. I'm not aware of any serious public benchmarks, though.
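If you want to check whether your own numpy build is the Accelerate-backed one (and get a very crude matmul number), something like the following should do; this is just standard numpy introspection, not an M4-specific benchmark.

```python
import time
import numpy as np

# Prints the BLAS/LAPACK the installed numpy was built against; on macOS 14+
# with numpy >= 2.0 wheels this should mention Accelerate.
np.show_config()

# Extremely crude single-matmul timing, not a serious benchmark.
n = 4096
a = np.random.rand(n, n)
b = np.random.rand(n, n)
t0 = time.perf_counter()
_ = a @ b
print(f"{n}x{n} float64 matmul took {time.perf_counter() - t0:.2f} s")
```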
That sounds pretty promising.
That's great - I wonder if you could get one working with Kamal [1] and Cloudflare Tunnel [2] to run public web apps from a home computer?
Yes, just using docker containers and cloudflare tunnel I am using mine as a server for my self hosted apps.
Don't forget Tailscale serve and funnel.
I use CF Tunnel to serve progscrape.com. It's easily the best way to serve from home infrastructure, IMO.
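For anyone wondering what the minimum setup looks like, here is a throwaway sketch: a stdlib-only Python app standing in for a self-hosted service, plus (in the comment) the Cloudflare quick-tunnel invocation that I believe exposes it publicly; check Cloudflare's current docs before relying on that exact flag.

```python
# Minimal stand-in for a self-hosted app. With it running, something like
#   cloudflared tunnel --url http://localhost:8000
# (Cloudflare's "quick tunnel") should print a public *.trycloudflare.com URL,
# with no ports opened on the home router. Named tunnels are the sturdier option.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"served from a Mac mini at home\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), Hello).serve_forever()
```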
While it may not be the literal fastest CPU ever, it still seems very, very fast, and the efficiency is pretty compelling. I'm not sure how much of those efficiency gains are a product of the design constraints that Apple is not beholden to (external memory, x86 backwards compatibility, other aspects of the AMD64 architecture, etc.), the slightly better process nodes, or superior design. I'm honestly dying to know, but I guess we won't find out, and as far as the products go, it doesn't really matter that much. The end result is a pretty good deal.
As a mainly non-Apple user I see the following caveats for my own uses:
- I'd love to see better Linux support. (As far as I know, Asahi Linux only covers the M1 and M2 lines, and as amazing of a project as it is, last I looked, it's neither upstreamed nor exactly what one might consider first class. Maybe it's getting there now, though...)
- I'm worried about the SSD situation still. It seems like it hasn't amounted to much (yet), but some use cases might be more impacted than others, and once the SSD does finally fail, the machine's dead. This is not how things work in most PCs, even mini PCs, and it's a bit of a hard pill to swallow.
- The pricing is great at the baseline, but it gets progressively worse as you go up. The Apple M4 Pro Mac Mini has a baseline price of $1,399.00, which I think is pretty decent for a high-end computer with 24 GiB of RAM. But, it maxes out at 64 GiB of RAM, which is less than half of what I have in my current main machine, and believe me, I use it. That 64 GiB of RAM upgrade costs $600. For comparison, the most expensive 64 GiB DDR5 RAM kit on PCPartPicker is $328.99. Don't get me wrong either, I understand that Apple's unified RAM is part of the secret sauce of how these things are as efficient and small as they are, but at least for my main computer I really don't need things to be this compact, so it's another tradeoff that's really hard to swallow.
But on the other hand, for people happy to use macOS as their primary operating system, the M4 line of Macs really does look like the best computer Apple has ever produced. (For me, it is rare that I feel compelled to even consider an Apple computer; the last time was with the original M1 Mac Mini, which I did buy, although after some experimentation I mainly just use it for testing things on macOS rather than as a daily driver machine.) There really aren't many caveats, especially since the base memory configurations this time around are actually reasonable.
I suspect these things could be great on homelab racks if the longevity issues don't wind up being a huge problem.
> - I'd love to see better Linux support. (As far as I know, Asahi Linux only covers the M1 and M2 lines, and as amazing of a project as it is, last I looked, it's neither upstreamed nor exactly what one might consider first class. Maybe it's getting there now, though...)
As I understand it M3 is not supported because there's no M3 Mini to run the continuous integration
There is now an M4 Mini, so there might be a chance Asahi Linux will eventually support M4.
https://social.treehouse.systems/@marcan/112277289414246878
Edit: Turns out the lack of Mini isn't really a huge issue and it's more just that there were significant changes between M2 and M3.
https://www.reddit.com/r/AsahiLinux/comments/1g07jui/mac_min...
The SSD in the new small Mac Mini is replaceable, though it is proprietary (not standard NVMe), and the M4 base version and the M4 Pro version use physically different drive sizes and shapes that are not interchangeable.
Plus it’s not like it’s designed to be user accessible. Yeah, you can get to it. But it’s not easy.
NAND card is replaceable in a proprietary socket. So you don’t have to worry. It probably is not upgradable but you can definitely replace it.
What's a CPU that is faster in some given 1T workload? I am fairly sure this thing is the fastest out there.
I don't think there's enough high quality benchmark information to really make a statement like that, but most importantly, I care about both single-core and multi-thread performance. I don't really have any workloads that only use one thread.
Comparing the M4 with PC CPUs will be hard. Typically when comparing two PC CPUs, to make the comparison more realistic, you'd set some reasonable similar constraints, like using the same memory kits and so on. However, even without considering overclocking, the actual performance of a given CPU can vary massively depending on the thermals, power delivery, memory and so forth. (It can vary by over 50%. I didn't check but you should be able to see this on benchmark charts that allow user submission.)
(However, for what it's worth, I always do at least a bit of mild overclocking personally. Nothing extreme, but what does fit within the power and thermal budget is basically just free performance at the cost of some efficiency, a trade-off I'm happy to make for my main desktop machine.)
Nah that's all pointless trivia. It is dark inside the box. Nobody gives a rip whether the mini is faster because it's got better ram or if it's faster because it's got better arithmetic logic. So you do not have to control things like memory because you don't have a choice anyway.
You don't really seem to understand the point of benchmarks. You're trying to compare the performance between two devices to quantify which one is better at some specific task in some scenario. The tricky part here isn't that people care whether the CPU is better or not, the problem is that on the PC side you can fix the variables between CPUs so that you can just look at the value of individual CPUs, but you can't do that when comparing across PCs and Mac devices. So what do you pick to compare with? There is no correct answer, but there are some answers that are more sensible than others. e.g. you probably don't want to jump massively into another price class.
If money is no object and you just want ridiculous multicore performance it's going to be pretty hard to beat EPYC. Yes, the single core performance is going to be worse; it won't probably be the best even among PC parts, but many use cases gladly take that tradeoff.
> If only they didn't put the power button on the bottom.
I can't tell if anyone is being serious about the "Powergate" issue. The thing is 5" wide and weighs 1.5 lbs, it's not exactly a burden to lift it a little. And there are highly practical workarounds: https://www.reddit.com/r/macmini/comments/1gncek7/nailed_the...
It’s not a burden.
I consider it a typical Tim Cook decision, in that the man leads the company that made one of the fastest CPUs in the world and got it to draw about as much power as a Raspberry Pi. Absolutely crazy feats of engineering, design, manufacturing… and -
There is that ONE detail that would've made it perfect, but it's botched!
I don’t mind it too much, since it’s still 99% close to perfect.
But, but…
> I consider it a typical Tim Cook decision
Tim Cook cares about money and efficiency of building and moving product. That’s it. I highly doubt there’s been any important design detail about any product that he made himself.
Hah, Tim Cook decision pretty much sums it up; it's the kind of thing that wouldn't have lasted 5 seconds when placed in front of Jobs (although there is a strong chance Jobs would have demanded his own nonsensical addition/subtraction to the design).
Jobs would have removed the power button entirely.
And then when there's a fault requiring a hard reset to fix you have to insert a bent paperclip into a tiny unlabeled hole on the bottom, or spell out a message in morse code by unplugging and re-plugging the power cord with some special timing. (This is not sarcasm)
Jobs would've thrown it out the window, and verbally abused the intern who brought it in, because it has ports in the front.
He would've kept the power button on the bottom, though.
Ports on the front are 100% the right decision though.
That's the thing. You're right, he would've, but he also stopped a lot of good decisions from happening because they just weren't to his taste.
So like, power button on the bottom isn't the end of the world.
Was Jobs in charge when they decided to place the power connector on the bottom of the "Magic Mouse"? But it's fine because it can fit in a manila envelope.
Jobs would have kept the button on the bottom, as it's not the proper way to use a computer.
Instead, he would have put motion/light sensors on the screen, so it would automatically wake up when you sit down in front of it. Macs don't shut down; they just go to sleep and wake up when you need them.
Yeah, he likely would have said no ports, or let's have only one port, or he would have demanded that the Mac mini have the dimensions of some multiple of pi…
Nobody’s perfect.
It could be worse, they could’ve located it in the center of the bottom.
Do people really use the physical button that often? 99% of the time I just let it go to sleep.
>makes it draw as much as a Raspberry Pi
That's a funny comparison. They don't have power buttons at all. Without mounting, you need to pick it up to be able to remove the power supply.
And the power supply I bought with my Pi 4, at the Cambridge store, doesn't even work.
Shame they got rid of the ability to power the computer on and off from the keyboard. I know it's been that way for some time, and I'm sure there's a good reason for it (maybe it doesn't work well over BT, or simply too few generic keyboards offered a power button).
It does work like that in practice though. There is absolutely no reason to fully shut down instead of using standby.
1. Given the millions of things that are done right, it only takes one of them for HN to lose its mind; the power button happened to be it this time, and Cook didn't decide that.
2. How often exactly do people have to turn a Mac that consumes less than a Pi off and on for them to constantly be reaching for that power button?
3. Standby, hibernate exist.
It's not like Tim Cook personally decided to put the button there, but saying over many years he's aligned the company to be one that would leave the button there rather than bite the cost of putting it somewhere more ergonomic is something I can buy into. Seems like a way to improve margins generation over generation, which is the kind of thing he's obsessed with.
This is also the same Apple that made the G4 Cube: that felt like this in reverse, with Jobs driving them to make a capacitive touch button because of an obsession with a seamless surface.
The comment in the article is in the context of rack mounting them which is a common thing to do with Mac minis. Having it on the bottom makes it hard to press as you can’t lift them up when they’re secured in a mount.
> Having it on the bottom makes it hard to press as you can’t lift them up when they’re secured in a mount
A hard reboot is the only situation where you should be using the physical power button on a modern Mac. If you're installing Macs in a rack, presumably you can sudo shutdown -r now.
The button on the bottom is trying to tell you that the system is built to be well behaved on stand by.
Exactly this.
I am working on a solution to make it easier to hit the button from the front of a rack shelf, but the fact I have to mess with 3D printing just to hit a power button is silly.
Older Macs also had the power button on the back, which was also annoying, but at least a Mac that's secured to a shelf could have its power button pressed pretty easily.
The Mac mini _requires_ a mechanism to press up from the bottom in any permanent-ish install.
Genuine question: why not just use a managed pdu and be done with it? No need to even get up and go to the DC/Rack.
Serious question: why not mount it upside down?
I mean they aren't designed for rack mounting? It's a consumer product, likely <0.1% of units produced will end up in a rack.
I would have thought that them being slightly higher than 1U would have precluded people from rack mounting them "flat" in the first place. It seems like it would be more efficient to rack mount them standing on their sides, and then the air gap between them would be enough to reach the power button easily.
If the power button is the main gripe with this model of Mac Mini, then its doing pretty well.
Because the comment is very specifically talking about rack-mount installations. Granted, no matter where you put the power switch, it's going to be difficult to reach if you install 21 of them on a single shelf.
>Apple Says There’s a Simple Reason for the Mac Mini’s Odd Power Button Location
https://gizmodo.com/apple-mac-minis-odd-power-button-locatio...
> Apple VPs Greg Jozwiak and John Ternus explained in an interview to a Chinese content creator on Billibilli (spotted and machine-translated by ITHome) that the main reason the power button is on the bottom of the 2024 Mac Mini is because of the computer’s size. Since it was nearly half the size of the previous generation, the underside was “kind of the optimal stop” for a power button. They also say most users “never use the power button” on a Mac, anyway.
> Apple isn’t wrong here. The Mac mini measures 5 x 5 x 2 inches, compared to 7.75 x 7.75 x 1.4 inches from the last generation; it takes up much less space on your desk, which is great. The trade-off is that you run out of space for some important things, like a power button.
That explanation makes no sense. There are many mini PCs of the same size that have their power button in an accessible location.
The excuse that most users never use the power button is the "you're holding it wrong" of 2024. Stop telling me how to use your devices, Apple.
The explanation mentioned on several forums, that it's a cost-cutting measure to avoid extruding yet another hole in the aluminum case or routing the power cable, makes no sense either. This is a state-of-the-art machine, yet they're cutting costs on such trivialities? Give me a break.
This is unequivocally poor design. Yet Apple will never publicly admit that, and will gaslight everyone to think it's actually good, as they usually do.
They've managed to get people to accept things they'd never accept in the Intel or Android ecosystems, like no SD card, no memory expansion, no dual SIM, etc. That gives them the confidence.
I guess once the system shuts down you can switch off the power at the mains or the adapter socket.
> The thing is 5" wide and weighs 1.5 lbs, it's not exactly a burden to lift it a little.
It's the difference between being able to hit the button one-handed or needing two hands. My Mac Mini is sitting at the back of my desk, and the power button is toward the rear of the Mac, so I definitely find it a bit clumsy to reach back with two hands, flip it over (disturbing any wires/peripherals that might be plugged in), find the button, and press it.
> And there are highly practical workarounds
Not as practical as putting the button on the front or top.
It's certainly not a deal breaker, but I do find it mildly annoying. The ideal for me would be to have the button easily accessible on the front or top, and have it behave like other devices I use: a short press to sleep/wake, and a long press to initiate shutdown. And when I'm getting up from my desk, I could give it a quick tap to put it to sleep and lock it.
My workaround is to use a keyboard shortcut to put it to sleep, which works fine and is not a big deal. But I still think Apple deserves a bit of mockery for this decision.
And I mean, it uses 4 watts idle. If you power it off overnight for 12 hours you've saved... half a glass of orange juice worth of energy.
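Back-of-the-envelope numbers behind that quip; the 4 W idle figure is from this thread, while the calorie count for a glass of juice is a rough assumption of my own, so treat it as ballpark only.

```python
IDLE_WATTS = 4            # idle draw quoted in this thread
HOURS_OFF = 12            # powered off overnight
KCAL_PER_GLASS_OJ = 110   # assumed for a ~250 ml glass (not from the thread)
WH_PER_KCAL = 1.163       # 1 kcal = 4184 J = ~1.163 Wh

saved_wh = IDLE_WATTS * HOURS_OFF           # 48 Wh saved overnight
glass_wh = KCAL_PER_GLASS_OJ * WH_PER_KCAL  # ~128 Wh in one glass
print(f"saved overnight: {saved_wh} Wh, roughly {saved_wh / glass_wh:.2f} glasses of OJ")
```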
As with all things regarding power efficiency, you have to consider the wide use of these devices, not just the individual use.
If moving the power button there changes the behavior of thousands of people that would typically shut their computer down when they're not using it, that half glass of orange juice turns into thousands of gallons.
4 watts idle, how many watts on sleep?
I also wonder how often people are actually turning them off. It's generally a rare event to push the power button on a mac in my experience
I turn my Mac off every day
Why?
I only reboot my Mac laptop when I’m forced to due to os updates. With a Mac mini? That thing would never get shut down.
I just got one (M4Pro model).
It's pretty zippy.
I have pressed the power button exactly once, since Friday (the day I got it). All other restarts were "soft" (including a couple of crashes). The keyboard and trackpad do fine, starting a shut-down computer.
It's replacing a docked MBP. That power button was a lot more difficult to reach, and I needed to hit it more often than this.
I spent a few minutes looking up whether a Mac could be booted from a Bluetooth keyboard but couldn't find any documentation of that. Back in the day some(?) Mac models could be booted by a USB keyboard, see https://www.projectgus.com/2023/04/griffin-imate/ for technical details.
> Back in the day some(?) Mac models could be booted by a USB keyboard
Heck, way back in the day, Macs all had power buttons on the keyboard.
I just bang on the spacebar, and it starts up. It's a Bluetooth keyboard, so I guess the system is listening to BT. I did that with the laptop, forever.
The Mini starts up a lot faster than the laptop.
That said, I should actually do a test, to make sure that the system is in real shutdown...
Nah. I'm wrong. The laptop started that way, probably because the keyboard is attached to a CalDigit dock, and tapping on the keyboard probably sent power to the device, which starts it.
That doesn't happen with the Mini, if I actually do a shutdown from the menu.
Apple has something I think they call "Deep Sleep," which is basically a shutdown, and that wakes from the keyboard.
That said, it's not a big deal to reach under the left side, and tap the button. The laptop was a pain, because I had to open it up.
But I've only had this thing a few days, and haven't had a chance to really torture it, yet.
It's probably easier to find and press than the old 27" iMacs. I always had a brief moment of trouble feeling around the back to find that darn button (part of the reason is that you need to press it very infrequently).
I can't imagine it's anything but a silly comment. Macs have the equivalent of wake-on-LAN, plus you can configure them trivially to restart after power loss. The idea that you'd have to press the button often is just silly.
> can't imagine it's anything but a silly comment
Given my cat, after learning to press a button on his automated feeder, now presses anything that looks like a button with the curious expectation of food, I can only presume he got out while I was in Cupertino.
Button on the bottom isn't a design mistake. It's an opinionated choice.
Or rotate it 180 degrees. There's nothing on the top (now bottom) anyway that can't stand being covered up.
Sounds vaguely like "you're holding it wrong". Maybe it was always supposed to be placed on its side? Apple should clarify why the button is on the bottom.
this sounds like a "big-endian vs little-endian" kind of quarrel, to be honest
power button on the bottom means kids/cats won't accidentally press it
Steering wheel in the footwell means the front passenger won't sneeze on it.
Come on, it's a dumb idea. Apple has them sometimes - really!
No really, imagine a surge protector with the switch recessed on the bottom. Only plugs on the top, nothing to step on and bring down your desktop.
one of my favorite conspiracy theories i've picked up recently is that apple purposely puts these kinds of annoyances in their products.
things like the button on the bottom of the new mac mini or the dumb notch on the macbooks.
according to the theory, such things:
1. they catch people's attention and give them something to talk about (and to fight about)
2. they might steal attention from other flaws
it makes perfect sense: the notch, this idiotic decision about the button, the charge port on the mouse.
Because minis are very commonly racked in bulk, and it's both very irritating for that use case and entirely unnecessary.
Minis are rarely racked in bulk unless you're running a server farm, which is not the use case they're designed for. The Mac Mini is first and foremost a desktop computer for non-professionals, or at least not sysadmins. If people want to rack them, go ahead, but in that case how often are you hard rebooting a machine vs. soft rebooting anyway? Macs aren't known for freezing up much.
Either way, it works for the use case its designed for.
Some of the rackmount kits for previous generations already reroute the power button and connectors to the front, like this https://racknex.com/wp-content/uploads/2023/04/with-power-bu.... (Though why not just install it backwards?) I guess they will be able to run a little lever under the M4 model the same way.
Actually, the M4 model is a little taller so it no longer fits in a 1U rack mount. Whereas before you could fit 2 horizontally in 1U, now you'd possibly fit 8 or 9 vertically in 3U. (Edit: This company says 10 per 2U https://www.racksolutions.com/m4-mac-mini-apple-hypershelf.h....)
I think the airflow for more than 3 per 1.33u, or 8-9 per 3u will necessarily suck.
I have designed for both, I think both have great use cases. 2 x 8 in 6u is really neat and tidy, I just don't love the concept of sitting the fans on their side, though I think they'll still last 5 years.
In that case wouldn't you rather have a managed power switch / iDRAC / restart over ssh, than send someone to go press buttons in person?
(I've only ever racked things remotely, so don't know if this is common.)
Can't do idrac of course but you can do managed power port, kvm and ssh.
The most efficient is managed power + serial + ssh and probably the best to go for.
Why not just rack them upside down then?
In that case then they should be on switched pdu ports and plugged into permanent kvm.
I'm sorry but this is nonsense, if you're really racking them in bulk the above is obvious.
> it's not exactly a burden to lift it a little.
Human dexterity is not constant. Some people have injuries which compromise them.
> And there are highly practical workarounds:
Apple. A consumer product company where every _single_ product has some massive defect which must be apologized around.
Which is fine.. but I'm not sure how that justifies their typical price point.
I recently upgraded from a 2020 Intel MBP to an M3 Pro. I've been blown away. Better still, I have yet to hear the fans turn on.
I just bought an ASRock Deskmini x600 with a Ryzen 7700 to run as low-power Linux server / workstation. Given the trouble I had with this thing due to (I believe) buggy amdgpu drivers and/or buggy firmware, I'm inclined to throw it out and just buy this Mac Mini.
Well, I wouldn't consider 7700 low-power in any sense, maybe 7640U or 7840U on a mini PC. You are running Linux the entire time?
True, it was low-power in my mind compared to my previous gaming rig. Yup, running Linux the entire time.
I may have to break down after the holidays.
I have a 2015 iMac and I've been holding off (and haven't really been using my Apple Silicon MacBook as intended) so it may be time to do the upgrade.
This seems like one of the best times to do it as long as you get your own monitor. You will probably be set for 10 more years with this thing.
I have a 5K monitor for the Apple Silicon MacBook which I could easily switch to a MacMini and would be ergonomically better. My other MacBook is also about 10 years old and it's a question of how long I stretch everything out while it's all basically working even if not on the current OS.
Stupid question but why can't it (non Pro particularly) be powered over USB C like a MBP, if it's so power efficient?
Does it have extra performance that makes full use of the 155W input?
What would it have taken for Apple to give us the option of powering it over 100 W USB-C (e.g. the ability to scale power usage down to match the power input)?
I think most of the 155W is for USB-C power delivery, not the machine itself.
Blog post mentions that power usage during benchmark was at 42W.
I guess the MBP's built-in battery allows drawing a higher peak current than a USB-C power supply can provide?
What I'm wondering right now is how far we can push the M4 chip toward the maximum of its power. I've seen a lot of people using it for LLMs, making clusters with ExoLab. It's amazing how it performs with such efficiency.
It would really be nice to run VPP (https://s3-docs.fd.io/vpp/25.02/) on this machine; packet forwarding at picojoules per packet would be excellent.
Noteworthy because related to energy efficiency across the entire product lifecycle:
> Mac mini is:
> Designed with more than 50% recycled content. Made with electricity sourced from 100% renewables. Shipped 50% or more with low-carbon methods.
the mac mini is the first carbon neutral mac:
* https://www.macrumors.com/2024/10/29/apple-m4-mac-mini-carbo...
* https://www.apple.com/environment/pdf/products/desktops/Mac_...
I’m contemplating whether this can handle editing 8K video. Does anyone have any idea?
My goal is to venture into 180-degree VR production. With the Canon R5C and the RF 5.2mm f/2.8 Dual Fisheye Lens, I want to produce stereoscopic video at 8K. However, rendering such high-resolution footage demands substantial processing power, and my current setup definitely isn't enough.
Base version, probably not. If you went with one with the M4 Pro CPU and 48 GB or more RAM you should be OK. Storage isn't as much of an issue, with Thunderbolt 4/5 drives easily matching the speeds of the internal storage.
I've done some 8K fisheye footage and converted it for Vision Pro / Quest with the prior-gen hardware kit, and was able to edit and process it on an M2 Max with 96 GB RAM.
Super cool yeah I'm mainly looking to create 8K fisheye footage for the Vision Pro so that's really good to hear!
Wondering if it's worth spending that much money on the M4 Pro or just building a PC tbh.
I rarely press the power button on my laptop.
I suspect Apple tracks/measures the usage of the button and took the decision to tuck it away.
Given these efficient numbers, I wouldn’t be surprised if Apple were racking Minis for serving the new ML models for Siri.
They've publicly disclosed that they built custom Apple Silicon servers to power Private Cloud Compute.
"The root of trust for Private Cloud Compute is our compute node: custom-built server hardware that brings the power and security of Apple silicon to the data center, with the same hardware security technologies used in iPhone, including the Secure Enclave and Secure Boot."
Mac minis probably don't have enough RAM to hold Siri's models. It's more likely that Apple is using (modified) Mac Studios.
Why use Macs at all? It's well within Apples abilities to make a completely custom server motherboard built around Apple Silicon rather than hacking something together out of Mac parts. That would allow for much better rack density, and they could add proper server amenities like a BMC for remote management.
You should check these guys out. They actually do it (rack, not hosting ML models) and it seems to me to be pretty elegant: https://www.macstadium.com/blog/m4-mac-mini-review
> If only they didn't put the power button on the bottom.
Question for HN: How would you redesign the power button, assuming you work at Apple and the final design should align with Apple's design ethos?
I turn off my Mac almost every night, just like I turn off my television, lights, and other stuff when I don't use them. To me, personally, it makes no sense to waste electricity for about 15 hrs * 365 days per year.
I would put the power button on any vertical side of the mini if I were designing it for my use.
I really hope you mean you unplug the power cable from the TV, cause none of the modern TVs turn off when you ask them to. The TV bootup takes too long for it to be "off" off.
No, but I most often use the hardwired button on the side of the TV to shut it down, not just the remote into standby. It takes about 15 seconds to turn on, nothing to worry about.
Any feedback on how it holds up for regular software development (backend stuff) and as a common home PC? Looking at the benchmarks it seems there shouldn't be any issues, but any first-hand experience would be greatly appreciated.
IMHO, developing with Node.js, Java, Python, Go, etc.. within MacOS is more convenient compared to Windows machines.
Also I can highly recommend using version managers (e.g. nvm, jenv, pyenv, gvm, etc..) for these languages to quickly install and manage different versions.
Idk about this one, but my daily driver is a Mini M1, 16 GB, 1 TB.
The regular backend stack on Docker is smooth. VS Code / JetBrains is smooth. Nothing to complain about.
I could use more ports, however. That is easily fixed with a USB hub.
I'm pretty happy with my M1 MacBook Pro with 16 GB. I'd expect this to be faster. I typically have IntelliJ, VS Code, Slack, a bunch of Docker containers, etc. running. All fine. Get more memory maybe.
The M4 Pro is less efficient than the M3 though.
https://www.notebookcheck.net/Apple-M4-Pro-analysis-Extremel...
> The M4 Pro continues to be manufactured using a 3-nm process and on the old M3 Pro (27-28 watts), we measured a lower consumption than on the M2 Pro models (~36 watts), despite its improved performance. In contrast [...] the new M4 Pro can consume up to 46 watts, settling at around 40 watts during the further course—so at its peak, it consumes 60% more.
Assuming they refer to the full chips rather than the binned ones, each generation of pro chips has the following number of p and e cores:
| Model | # p-cores | # e-cores |
|--------|-----------|-----------|
| M2 Pro | 8 | 4 |
| M3 Pro | 6 | 6 |
| M4 Pro | 10 | 4 |
Thus the M3 Pro has more e-cores and fewer p-cores than the M2 Pro, hence the big increase in efficiency, while the M4 Pro has more p-cores than the M2 Pro, hence the increase in consumption. It is all about tradeoffs and, honestly, the result is pretty much expected when you count the cores. I assume there is some improvement per generation, but if the number of cores is not constant, the core counts are going to drive most of the variance generation to generation.
You're only considering the CPU though.
The numbers in the article cited by the above commenter are about the CPU.
You mean the link I posted?
They compare both CPU and GPU.
It makes sense that the Pro model is less efficient since it's more focused on performance.
Wait wait wait.
That's comparing an M4 Pro (middle level) to an M3 Air. The Air is a lower-power machine with the low-spec processor.
There is no M4 Pro Air. They have to be using a MacBook Pro. That likely has a bigger display, a display capable of getting way brighter, showing more colors, better speakers, all sorts of other stuff.
That’s not a very valid comparison.
If anything, the fact that the M4 Pro gets so close to the M3 is impressive.
The M3 was on a process that was known to run hot. I strongly suspect that every M4 chip is more efficient than the equivalent M3 chip.
There are multiple comparisons vs multiple chips on both CPU and GPU. Keep scrolling.
I am not sure that is true. Doesn't it use more peak power but get more work done, leading to less energy overall for, say, exporting a hundred photos, because it finishes quicker?
I dream of Linux on this Mac mini.
VMs work and you get amazing perf. Dream granted
Is there any noticeable latency in graphics using a VM? As far as I remember VMs were a disaster for latency some years ago (Windows or Linux host).
What's your best choice for hypervisor? Virtualbox? Parallels? Or what do you recommend for this unit?
UTM works quite well. As does parallels.
Worth noting the base model only has 1Gb ethernet, not 10Gb as mentioned in the blog.
> And the system I bought includes 10 Gigabit Ethernet and 32 GB of RAM
I thought it was pretty clear from "the system I bought" that he was not talking about the base model. And I think $100 for 10 GbE is surprisingly reasonable for an upgrade from Apple. For comparison, 10 GbE Thunderbolt adapters typically cost about $200 - and while 10 GbE PCIe cards can be bought for less, they tend to be much less power efficient and generate a surprising amount of heat.
I actually think it's very commendable that Apple even gives the option to upgrade to 10 GbE on a mass market desktop. I was recently looking to buy a non-Apple Mini PC, and while 2.5 GbE is very common now, 10 GbE is still relatively rare. The options I found were to go with a Minisforum MS-01, which is considerably more expensive than the base M4 Mac Mini w/10 GbE upgrade, or to order something slightly sketchy from Aliexpress. So as soon as Apple announced the new M4 Mac Mini, I went with that instead.
I know everyone is complaining about the power button, but why not just flip it upside down?
I haven’t powered down my iMac once all year
How often are these people powering down their iMacs? Why!?
People love complaining. Apple doubles the base ram and keeps the price the same, people complain that base storage is too low. If that doubled then they would find something new.
The thermal design assumes the normal orientation. Flipping it upside down might lead to more thermal throttling.
Very very cool, but only makes it more disappointing that you can't actually use this for anything innovative, except in the Apple-approved format & use cases.
Can't upgrade any of the internals, doesn't run Linux easily, no way to use any of the internal components separately, or rebuild them into a different form factor. Imagine being able to directly mount these to VESA behind a dashboard. I have an old M1 Mac Mini I'd love to use as a NAS, but the disk is slightly too small and you can't upgrade it, so it's just useless to me instead.
Impressive to see Apple match the Pi for idle power & efficiency, but deeply frustrating to see them take the exact opposite design philosophy.
> In 1.25U of rack space, you could run three Mac minis
I mean, that's 1.25U at 5" deep. Lots of cabinets are 35+" deep, if memory serves, so you could go seven rows deep instead of one: three across times seven deep is 21 Mac Minis in 1.25U of space, so it's more like almost 6 teraflops. Again, button-on-the-bottom and wiring and thermals aside.
Heh, can't wait to see what contraption @Merocle comes up with for densely-packing these things: https://x.com/Merocle/status/1848975509603934478
I just sold an M1 Mini and grabbed a Beelink. It's wonderful being able to run whatever OS/distro I want on the Beelink, and it's plenty strong enough for whatever. I love this adoration for Apple, primarily for my investment accounts, but in a cloud world I have no idea why there's such a demand for the M4.
I also have a Beelink, but it skeeves me out a bit. It's probably going to drive me more towards an Apple M{X} for my next home server. Even though I know those parts are also made in China, I trust Apple's sourcing more. Beelink's stuff is so affordable that it makes me wonder why that is.
The globe continuously inching towards war makes me quite paranoid, unfortunately.
Beelink's $299 base model of their compact desktops is enough for most regular home users. Not as powerful as the Mac Mini M4 but half the price and double the storage. Plus you can upgrade your own storage to 2TB for about $100 and it'll be faster than the Mac Mini's.
Yeah. I scratch my head when there is a ton of new models and competition happening in the mini PC market that provides good value for end users, but nobody notices. When Apple releases a new mini PC, suddenly it's like a breakthrough or something. I even see people say "the entry 8 core Ryzen you can get on a desktop PC is 7700" when discussing this new Mac mini on tech forum, as if giant towers are the only kind of desktop PCs and 7840U/7840H(S) doesn't exist. (You would expect these people to know better)
I guess marketing matters.
Exactly as described. I had an 8/256 Mini, and the Beelink solved more than a few problems.
I've been lurking here for more updates on the Mac Mini M4 since I haven't bought mine yet. I also shared some thoughts in previous comments [1][2], as I'm not only impressed by the technical achievements and form factor but also interested in seeing how Apple's business evolves over the next few quarters. I'm curious whether Apple will increase its market share on the desktop side while continuing to dominate in mobile.
I use Linux, but I think the cheapest M4 Mini offers an incredible value and efficiency per €. With education discount, it's around €650, including VAT. It's pretty hard to find such a silent and powerful machine for that little. Any comparable options?
A good fanless build with a i3-14100T is more expensive and 40-50% slower on Geekbench. An i5 is a bit closer. Some 2024 Ryzen CPUs can match or exceed its multicore performance, but these are also more expensive and much less energy efficient. Pricewise, things start favoring PCs if you need more RAM, as Mac upgrades are costly.
One can potentially use Nix on a Mac Mini to keep similar development environments to those used in Linux, but AFAIK some packages are not supported on ARM. Any experiences using Nix and nix-darwin as a daily driver?
> With education discount
I don't understand why so many people use the discounted price as reference. Surely very few of us on HN are still in college? So let's use the actual price when making comparisons.
>I don't understand why so many people use the discounted price as reference.
Or when they only use it to make the Apple pricing seem more favorable and ignore it when it comes to PC pricing. Most PC manufacturers also have educational pricing, whether directly or through some portal provided by your institution. I know my son's college had a deal and also had a list of the tax free days in the state so that you could pre-order and then pay and pick up on the day the tax didn't apply.
> tax free days in the state
i'm sorry, tax free days?!? am i too european to understand this? does this apply to everything, like groceries, tech, flowers, wood etc., or just corporate transactions?
I can't say for all states but here in Massachusetts we have an annual tax free weekend where sales tax (6.25%) is not applied for "most retail items of up to $2,500, purchased in Massachusetts for personal use" (https://www.mass.gov/info-details/massachusetts-sales-tax-ho...).
Also groceries never have a sales tax in Massachusetts but, again, that varies by state.
> tax free weekend where sales tax (6.25%) is not applied
that's such a strange concept for me. i wonder what the historic reasoning there is for it, as it seems like one of those legacy things which were started to increase sales during difficult market times :D
> Also groceries never have a sales tax in Massachusetts
also interesting :) what i knew was that some or most states display the prices without tax, so you'll only know the total of your grocery trip at the checkout. never seen this here over the pond, prices always include taxes.
what's common is that different things are taxed differently. food and beverages have lower tax than non-essential things, except of course if the beverages contain alcohol, etc. yada yada blabla.
>also interesting :) what i knew was that some or most states display the prices without tax, so you'll only know the total of your grocery trip at the checkout.
It comes up on /r/askamericans all the time, but it's not realistic to include tax on the prices because there are so many different taxing zones. A large city may have multiple. Most places you can figure it's going to be ~10% and might be pleasantly surprised when it's less. Everyone knows to figure roughly 10% extra, so it's not a chore or anything, even children figure it out.
We don't have tax free weekends in Australia but fresh produce is also exempt from GST (our version of VAT). Anything that has had any "processing" done on it incurs GST though, so oranges are tax free but orange juice is not.
Education and Health are also exempt from GST.
Some places it's anything with sales tax, or maybe just goods in general if they already have low or no tax on food. Other places have it on specific goods that would be considered 'school supplies'. I think where my son is, it's a week or weekend where it's all sales tax is waived. Definitely not a corporate thing, it's to give parents and residents a break and to help stimulate the economy with spending.
And are they representation-free too?
It may be a local thing in Bay area, but usually there’s some way to get some discount when making a purchase with Apple - be it via education, or via a corporate discount (just show your badge from another company), or via a friend who works at Apple, or some big retailers start selling at good discount (eg Amazon easily gets 5-10% lower price over time).
Anecdotally, last week I visited a local Apple Store with my son who is in middle school. Without any prompting from us, the Apple rep asked my son if he is planning to go to college some day, and applied the college discount to our purchase without my son saying much…
All you need is a .edu address, if I recall correctly. You can buy them with alumni addresses.
That said, a fair chunk of HN never completed college, like myself, and lost access to any email accounts of this sort, which only further supports your argument directly, as the EDU discount isn't universally attainable.
Regardless, if people start to abuse this by getting the discount while not actually being a student or teacher, we can say goodbye to that discount, and real students and teachers will suffer for it.
It's not like they're taking a loss from educational purchases. It's just price discrimination. You might as well say "if we all started using newspaper coupons we can say goodbye to those discounts."
But that IS actually true, that if everyone used a coupon, the coupon would eventually go away or become weaker.
The expected percentage of people using a coupon is often part of the calculation (however vague) when deciding the coupon.
Likewise, the expected percentage of people using the education discount is part of Apple's calculation. Even if they lose money on the device (which I personally think is unlikely), they'll make it back from subscriptions and their cut of app store purchases.
It is also an investment.
Teachers promote/showcase the device for students.
Students who buy the device earlier, might influence the choices in their future jobs.
They almost certainly won’t get rid of it because people are abusing it. If they do it’s because they don’t want to offer it.
If they want to end the abuse they will simply toughen the verification procedure.
> They almost certainly won’t get rid of it because people are abusing it.
It always depends on the ratio (valid cases vs. abusers): if the number of abusers gets too high, then the discount is not correctly fulfilling its purpose.
> If they want to end the abuse they will simply toughen the verification procedure.
It also depends on how expensive or difficult it is to maintain such verification procedure. At some point it is not justified anymore.
I just personally don't like the current attitude which seems to be going on. If you can "cheat" on getting the discount, people just keep finding reasons why they are justified to cheat. "They should toughen the verification procedure if cheating is possible".
It happens everywhere. People get praised for finding such cheats. Even in universities, people are encouraged to cheat their way to better grades with less work. Oh, clever boy! He used different LLMs with good context that made the output look like his own writing.
Not much different than saying "get a better lawyer", if you are getting punished for breaking the law. Opposite applies and that is why lawyers can be really expensive.
Or, not much different than big tech doing morally questionable things because the law is lagging behind. "Nobody is enforcing the law, so it is perfectly okay. Worst case is that we need to pay some fines."
It's not cheating if they intend for you to do it (but just can't explicitly say it's allowed because then everyone would do it and that would collapse their self-assortative price-discrimination strategy.)
> If you can "cheat" on getting the discount, people just keep finding reasons why they are justified to cheat. "They should toughen the verification procedure if cheating is possible".
You seem to misunderstand the argument. It's not "they should toughen the verification procedure if cheating is possible." The argument is that they would toughen the verification procedure if cheating were possible and they cared; which proves, at the very least, that they don't care (and potentially proves that they in fact want you to do it, at least sometimes.)
To be clear, this argument doesn't apply to bureaucracies — governmental, academic, or Enterprise — where there's so much red tape in the way of making changes that it's almost impossible to fix issues like this even if several people care quite a lot.
But this argument very much does apply to a relatively-agile, not-so-Enterprise-y-for-its-size corporation like Apple. In fact, it applies especially to Apple, who has an almost Disney-like obsession with micromanaging all customer interactions as an extended customer-lifecycle marketing opportunity. (For example: you'll never find a rotting out-of-date page on an Apple-owned website.)
Apple know exactly who they're giving this discount out to. They've almost assuredly sat down at least once and done a hand-analysis of one or more months' purchases, to determine the proportion of education-store purchases that are from genuine education customers. (Heck, they probably have gone far beyond this; far lazier corporations than Apple set up heuristics for this kind of "promotion fraud", run continuous analyses on them, and spit out weekly reports to mull over in marketing-KPI progress meetings!)
If Apple's education store gives discounts to group XYZ, then you can assume that that's the intended outcome. At least under the Apple marketing department's current paradigm of thought.
> It's not cheating if they intend for you to do it
It feels like you are proving my point about people finding excuses to buy the Mac with the educational discount when they don't meet the requirements :)
The intent is clearly an educational setting, for students and teachers. You dishonor that intent if you still try to claim the discount, whether you are punished or not.
I think you might be suffering from a categorical blindness to a certain type of thing humans do.
Let's say I own a private beach. I want to allow my beach to be enjoyed freely and responsibly by a reasonable number of people, whether friends or strangers. I don't want to constantly be cleaning up garbage on my beach. And I don't want the beach to be overcrowded when I myself use it.
So what do I do? Well, I'm sure not going to hire a bouncer to guard my beach. (How would I even tell them who's allowed in, anyway? Can you recognize "irresponsible people" on sight?)
No, instead, I will probably post a sign outside my beach, saying "NO TRESPASSING".
But I won't enforce it! And if anyone (e.g. my few direct friends who I invite to hang with me at my beach) asks, I'll tell them I won't enforce it! They can bring people to my beach if they like!
Access to the beach is now an open secret. It's something that people can freely tell those they trust about. The number of people visiting the beach will rise slowly over time. Maybe it'll eventually increase to be too much; or maybe it'll level off, due to churn in the population near the beach. (Mostly depends on how hard the beach is to access, and the demographics that live nearby.)
If some tour company tries to drop off a whole busload of tourists at my beach, though, I will most certainly kick them out, pointing at the "NO TRESPASSING" sign. (Since I don't have a bouncer, probably what I would actually do is call the cops on them.)
The cops would ask me about the people already on the beach, of course. To which I would say:
> Those people on the beach right now? They're my "friends." No, I don't exactly know them... but I know people who know them! They're "on the guest list." But these people standing by the bus over here — these are not my friends. These are people brought here by a guy trying to profit off of providing others access to my beach, which I have not granted. They are not allowed in. Nobody brought here by this bus company will ever be allowed in.
This is every underground party ever. This is every travel destination for the rich. Open secrets, with guardians who actively lie by exaggerating the restrictions or conditions in place, to keep a lid on the spread of the secret.
And this is a thing companies do constantly.
• Every store discount code given out to some YouTuber to give to people who watch their thing? Open secret. (Consider: is it "legitimate" for a discount app like Honey to find and publish those audience-targeted codes? No, probably not; Honey would be acting like the tour-bus operator above. But would the online store mind if you personally found the code and used it, despite not being a member of that Youtuber's audience? No, they'd be happy to have your business. Would they even mind if you told three friends, and you all immediately bought something? No. In fact, they'd be overjoyed!)
• The unmentioned (and implied to the contrary!) never-ending-ness of the free trial period for WinRAR? Open secret. (If WinRAR never implied you had to buy it at some point, nobody would have ever bought it; they'd just consider it freeware. But you don't "have" to buy it. It goes on working forever. Some people feel guilty or pressured, and do buy it. Others eventually discover the bomb is a dud. This is WinRAR's intended business model.)
• The CPU binning lottery? Open secret. (Did you know you can keep RMAing retail-purchased CPUs until you get a really highly overclockable one? You do now! And people have been doing this for decades! CPU vendors don't care—in fact, they want these few super-enthusiasts to get their hands on their best CPUs, since they'll probably publish some really nice benchmarks with them. Free advertising! They certainly don't want a company doing this in bulk though. That'd be way more trouble than it's worth; and then what would they do with a huge pile of RMAed known-below-average-binned CPUs?)
• How easy Photoshop was to pirate in the pre-Creative-Cloud era? Open secret. (See my sibling post.)
You can exploit any/all of these if you know about them (and you're not in a situation legally preventing you from doing so — e.g. corporations can't pirate things).
And some people know; but most people don't.
This equilibrium state is exactly the point aimed for by the corporations that create these open secrets. They don't want these secrets known by everyone. (If enough people do it, then it's no longer a marketing expense, but a hole in their business model.) But they don't want these secrets known by nobody, either.
The creators of any open secret want some deserving people to take advantage of it; otherwise they wouldn't have made it an open secret. (In almost all cases, you have to actually do extra work to make something an open secret. It's extra work to carefully design and manage the "virality coefficient" of an open secret so that it'll hit equilibrium, rather than spreading to fixation or dying out. The outbound word-of-mouth advertising required to get an underground party to happen, for example, is way more work than just putting up posters! It would almost always have been easier to just have no secret at all!)
I hope you will agree with me that this dynamic exists in general.
If you do: what then leads you to believe that what Apple has here is a dumb unenforced mistake, rather than an open secret?
---
One extra point, that doesn't have a clean place to insert above: corporations are really careful with the way they structure the wording of the exaggerated-restriction "wards" shrouding their open secrets.
For a person, a "TRESPASSING A-OK" sign would just be a sign. But for a corporation, any positive criteria they give implying that a group does qualify for a certain promotion can be taken as a legal promise on their part.
If Apple offered an obscure promotion to "anyone who can find it" — some secondary secret version of their online store that just happens to have lower prices, say — and then some bigcorp found it... and if Apple then attempted to refuse to apply those promotional prices to that bigcorp's 100k-seat volume purchase of Mac Studios or whatever they were trying to get away with — then the bigcorp could actually be within their rights to sue Apple for breaking the promise they were making by having such a store available without qualification! (a.k.a. promissory estoppel.)
(To be clear, to win such a case, the bigcorp would have to also prove that they then went out and did something under the assumption that they could get those 100k Mac Studios at that price — bought 100k Mac Studio-shaped desk nooks, say — and that by being refused the promotion, this contingent action has resulted in a financial loss for them — e.g. if it turns out the 100k nooks have zero resale value, so they're out the cost of the nooks, and also have a huge pile of useless plastic it'll probably cost money to dispose of. But that's not too uncommon of a problem to have, in a big-enough corp with many async/concurrent/pipelined corporate purchasing negotiations going on. So it's something the legal departments of vendors like Apple are always wary of accidentally getting tangled up in.)
"Students and teachers" is a particularly nice/"safe" wording for open-secret shrouding language for a corporate promotion, because there is no case in which a corporation qualifies as a student or a teacher. And yet literally anyone else can become a student at any time, just by signing up for a zero-tuition-until-you-take-courses online university program and nabbing the resulting .edu email. (By the premise of continuous education/lifelong learning, we are always students!) "Students and teachers" is a group that any price-conscious motivated individual can join trivially (just like clipping a coupon!), but which keeps the corporate-buyer discount-loophole-hunters out.
That is a great write up. But I think this proves even more my point that people do anything to make an excuse for cheating :)
I agree that there might be some open secrets, but this particular case is not comparable, simply because it does not make sense. Apple is in the hardware business. They are already giving the discount to the correct user base, where the discount is an actual investment:
* Students, who might then pick the same hardware in the future at work, at a company they found, etc.
* Teachers, who promote the same hardware to students
For others, why would this be an open secret? The correct user base already gets the discount. There is no benefit to giving the discount to others as well, even in secret. It is just a loss. These same people would likely buy the hardware anyway. I bet this price difference does not stop them from buying the product.
The story you are telling is not comparable to this case. A comparable scenario would be one where you allow onto the beach some random people you don't trust, but because their number is so small, it does not matter.
However, people then start posting about your beach on social media, or even on Hacker News. Friends of friends of friends tell their friends too. Now the beach is crowded and the randoms are there all the time! What would you do? Get a bouncer or put up a "real" trespassing sign? And even your friends can't enjoy the beach anymore.
It is all about statistics and in what direction we let these things go.
> That is a great write up. But I think this proves even more my point that people do anything to make an excuse for cheating :)
You would have an argument (not a good argument) if I ever actually took advantage of the education discount. But I don't!
(I get all my Apple computers as business-lease equipment from my employer, within which I have arbitrary IT equipment purchasing authority. And then, once they've fully depreciated, I buy those computers from my employer for a trivial sum to become my personal computer(s), and also order new current-gen work computer(s). Is this "cheating?" No, Apple loves this — my employer is paying full price, and never gets any sort of discount. And my employer also loves this — they just want me to be productive, and paying a few thousand dollars to buy whatever arbitrary equipment I requisition every two years, is extremely cheap for how much my added productivity will make them over that period. Given the different things each party in this relationship values, this is a win-win-win.)
> For others, why would this be an open secret? The correct user base already gets the discount. There is no benefit to giving the discount to others as well, even in secret.
As roughly seven other people have replied to you: price discrimination. The user base Apple would like to help out are "individual buyers who just barely cannot afford Apple products, with a $100 discount being enough of a difference to prevent them from falling out of the funnel."
Students tend to be central members of this group; but Apple, in practice, seems to actually want to help this group as a whole.
(And why wouldn't they? It's not like they're making a loss on education-discounted sales. They're making money and getting people into the Apple ecosystem, where they'll hopefully dive deeper once they have more money!)
But there's no way to openly offer "anyone who needs a $100 discount to be convinced to buy an Apple product" that $100 discount, without either:
• sounding like you're literally calling people poor (open "means-adjusted pricing"? It's been tried; people hate it! Only ever gets aired out as a TAM-expansion tactic in markets for extremely-inelastic-demand goods with zero competition, e.g. on-patent medications.)
• or leaving a loophole for rich people to find that results in Apple not being able to milk them.
And the one thing that goes against every strand of a luxury consumer product company's DNA, is the thought of letting a rich buyer with high willingness-to-pay get away with a low-margin purchase. In Apple's business, milking one rich customer can give you the net profit of dozens of low-margin customers. (Think: convincing some Mr. Moneybags who walks into an Apple Store thinking they want a Mac Mini, that what they really need is a fully-upgraded Mac Studio.)
> However, people then start posting about your beach on social media, or even on Hacker News. Friends of friends of friends tell their friends too. Now the beach is crowded and the randoms are there all the time! What would you do? Get a bouncer or put up a "real" trespassing sign? And even your friends can't enjoy the beach anymore.
How is this comparable? As other sibling replies state, the open secret of the Apple education discount has been widely dispersed for at least a decade now. It is at equilibrium — it clearly isn't spreading to the point that "the beach is overcrowded." Ask a random person off the street — heck, ask the average person on HN five minutes before this thread started — and they would not know that Apple offers an education discount but doesn't verify academic status.
You want to know what an open secret reaching fixation looks like? Picture it being discussed in "money-saving tips" listicle videos put out by popular [i.e. tens-of-millions-of-subs] vloggers. Not even tech vloggers, either — I'm talking gaming vloggers, art vloggers, beauty vloggers, etc.
Some open secrets do run away like this — and yes, this does cause their creators to pull the plug! The Apple Store education-discount open secret is not like this.
Great write-up!
> This equilibrium state is exactly the point aimed for by the corporations that create these open secrets.
Not necessarily. Just as biological evolution is blind, thriving in a market environment doesn't require companies to know why what they are doing is successful.
So eg Photoshop (and Windows) used to be really easy to 'pirate' by individuals. And you can argue that this was good for Adobe (and Microsoft), because it's like an informal education discount: youngsters get used to the software at home and train themselves, so that later on it becomes the obvious choice for the office.
But for the mechanism to work, Adobe doesn't have to understand the mechanism. They could just not know at all about the pirating, or conclude that it's too much hassle to chase the pirates (but be completely unaware of the positive effects). Or on the contrary, they could over-estimate the positive effects of piracy etc.
Microsoft products are trivial to pirate thanks to Microsoft Activation Scripts [1] which is on GitHub. It is inconceivable that they aren't aware of it with 102k stars. That can only be deliberate.
[1]https://github.com/massgravel/Microsoft-Activation-Scripts
I agree: I am sure that people at Microsoft are aware these days.
The first commit in the Microsoft Activation Scripts repository is from 2020. For Microsoft the dynamic I describe goes back all the way to the 1980s (and perhaps even earlier.)
Back in the 1970s and 1980s people at Microsoft might or might not have been aware. (I don't know for sure either way.) But it already worked in their favour.
My point is that the dynamic works whether or not anyone is aware of it.
> It is inconceivable that they aren't aware of it with 102k stars. That can only be deliberate.
Or it is one of the reasons why Microsoft is pushing hard to require a cloud account for everything, including local Windows.
Having a cloud account is entirely disconnected from the activation state of Windows, and always will be. The activation state of Windows is a property of a Windows installation, because Windows installations — all the ones Microsoft cares about, at least — are managed (including license management!) by the IT departments of organizations; while Windows logins are managed by individual users.
Microsoft would be breaking their own business model in half if they forced each user to have a "Windows subscription" bound to their personal cloud account, instead of being able to just sign a $10MM/yr contract with Oracle or EY or whomever for a 100K-seat volume license.
Remember also that many large-scale deployments of Windows machines aren't of personal computers at all, but of:
1. workstations with non-cloud Active Directory-managed user accounts, with the accounts and data on the machine being backed up to corporate servers and thus the machine itself able to be drop-in replaced overnight without the user even noticing the change;
2. workstations with roaming user profiles configured, where many different people log in and out of the same computer throughout the day (think: computer labs, internet cafes, etc)
3. shared workstations where many employees log in and out of the same computer throughout the day (may overlap with 1) — think of the computers behind the desks at the customer-service wickets at a bank
4. machines with no logged-in users, only an AD administrator remote-managing them through domain privileges — think e.g. digital signage
If licensing status attaches to the logged-in user, then none of these use-cases work! And together, these use-cases form 80+% of how Microsoft makes money from Windows!
In Switzerland (Europe?) we can easily get a cheaper price from official resellers than the edu price. The Apple Store won't match it.
Pretty sure people already are abusing it.
I think this is a lot like the situation with oldschool Photoshop: for a long time, people pirated Photoshop, and Adobe really didn't care — didn't bother to do anything to make piracy the least bit challenging.
This was seemingly because they considered the amount of money they could make off of sales to individuals, to be relatively trivial next to the amount of money they could make off of corporate volume licensing; and they knew that corporations wouldn't be pirating Photoshop even if it was trivial (because corporations always have the thought of an acquisition-time assets audit on their minds.)
Apple likely thinks the same way about this education discount: all their material income comes from volume purchases or alternate distribution channels (e.g. cellular carriers for phones), or in-store sales; with online retail sales being a relatively-trivial fraction. So it doesn't really matter if they're "losing" part of their margin on these online retail sales.
(Or, if you think about it another way: this is essentially customer-driven price discrimination. Like coupons are for grocery stores. The discounted price is Apple's true price — the price that builds in a profit margin they're happy with. The higher price is pure gravy if they can convince people to part with it. They put the higher price front-and-center, and make the lower-priced offer a bit obscure. People "spending someone else's money" don't care about hunting for deals; they just want to get the thing and get out. So you can milk the gravy from them. People who hold their bank balance more dearly, hunt for the deal, and find it. Still fine; still made a profit from them!)
> Apple likely thinks the same way about this education discount: all their material income comes from volume purchases or alternate distribution channels (e.g. cellular carriers for phones), or in-store sales; with online retail sales being a relatively-trivial fraction. So it doesn't really matter if they're "losing" part of their margin on these online retail sales.
Exactly. At the point when the number of abusers gets too high (because this becomes mainstream knowledge and people think it is generally acceptable to dishonor the intention), this will end. Or if they are able to improve the verification process at negligible cost.
So, the more people talk about the "educational price", and the more people think it is acceptable to "cheat", the more likely it is that the number of abusers reaches that threshold and the good thing ends.
> (Or, if you think about it another way: this is essentially customer-driven price discrimination. Like coupons are for grocery stores. The discounted price is Apple's true price — the price that builds in a profit margin they're happy with. The higher price is pure gravy if they can convince people to part with it. They put the higher price front-and-center, and make the lower-priced offer a bit obscure. People "spending someone else's money" don't care about hunting for deals; they just want to get the thing and get out. So you can milk the gravy from them. People who hold their bank balance more dearly, hunt for the deal, and find it. Still fine; still made a profit from them!)
You are once again finding an excuse to cheat. It is perfectly okay to take advantage of a discount if you are eligible for it. But that was not the case here.
> So, the more people talk
It’s been decades at this point, I suspect we’re at an equilibrium by now.
SheerID already exists and can differentiate between alumni and current students. Apple just needs to decide it's worth it (thus far, they haven't).
I've seen this FUD repeated for a long time. Hasn't happened yet. Probably the worst that will happen is they'll start requiring some type of verification again.
20 years ago I worked Apple retail and this was well known then.
Tbh, it's not that much compared to Apple sales.
You don't even need a .edu email address. I logged in with my regular Apple account and made a purchase on the education store expecting them to ask for that or some other verification, and they never did.
They claim the right to audit purchases through the edu store and charge you the difference if you don't qualify, but I've never read anyone online reporting they've been audited/charged.
https://www.apple.com/us-edu/shop/browse/open/salespolicies/....
I can't confirm, but I had an Apple Store employee tell me the same. And then I forgot to use it when I ordered my Mac Mini.
>which only further supports your argument directly, as the EDU discount isn't universally attainable
Pay someone with an edu account to complete the purchase for you. Also, they are commonly available for community college students, including those taking free classes.
I'm going to have to check this out. I no longer have direct access to my old college email, but it still forwards to my Gmail over a decade later!
Does that mean the discount is only for Americans? I don't see anything about it on the Dutch version of the Apple site.
Still looks pretty affordable. Until you look at the upgrades. € 230,00 for +8 MB RAM?! There are places you can get that for a tenth of that price.
I suppose doing your own upgrades isn't an option anymore? (My last Mac was a 2011 unibody.)
One thing that will potentially future-proof the new Mac Mini is that the SSD is on a removable board. It's a custom Apple design but someone's already hand made their own upgrade. Wouldn't be surprised if there will be 3rd party upgrades commercially available within a year.
Since the first ARM systems (maybe before) you can't upgrade things on your own. I had an Air whose SSD could be upgraded but not its memory; memory can only be upgraded at the factory.
Unless you are dosdude on Youtube
"MacBook Air 16GB RAM Upgrade"
You could just solder on a new ram set
The amount of people for whom this is a "just" kind of task is very very low. I don't think "just" should be in that sentence ;)
"just" at home: https://www.youtube.com/watch?v=KRRNR4HyYaw
Previous discussion: https://news.ycombinator.com/item?id=41631130
They have variations of the program in some European countries. It's been a long time for me, but in the UK they used to just whitelist university domains (we didn't use .edu TLDs either).
We use .ac.uk though (and much of the non-USA world uses .ac.ccTLD similarly) so no need to whitelist individual university domains. I don't know about Apple, but that's a common approach. (And does irritate some where they don't use either and get missed, Canada for example.)
Some countries place no restrictions on registering domains under their .ac second level domain though.
AFAIK, in most of the EU they validate you are a student or an academic through UNiDAYS.
€23 for 8 MB of RAM sounds a bit too expensive, to be honest.
> € 230,00 for +8 MB RAM?! There are places you can get that for a tenth of that price.
"Comparing our memory to other system's memory actually isn't equivalent [...] because of the fact that we have such an efficient use of memory, and we use memory compression, and we have a unified memory architecture."
- Bob Borchers, Apple vice president of worldwide product marketing (who apparently never heard of zram)
But what's the point of Borchers' comment? Because the software makes efficient use of memory, it's legitimate to put a tenfold price markup on the hardware?
Yeah, that doesn't explain why the Intel Mac Pro cheesegrater wanted $3,000 for 160GB of socketed RAM that OWC would sell from the same manufacturer, same speeds, for $1,000 for 192GB.
Sorry Bob, architecture may be different now, but Apple has always been egregious.
Windows also uses memory compression.
Because when one configures it with reasonable 32GB RAM/2TB SSD and EU prices, it suddenly becomes £1800 and it's harder to convince anyone of its price superiority.
Take a pottery course at community college. Pay $65 in tuition, get your education discount, and um, relive the scene from Ghost? :)
They use it because they have an inherent bias and want to put the Mac in a position to compare favorably.
A lot of people have a relative or someone still in education; just buy it through them. It's not like this is a government subsidy, just a promotion to increase sales and maybe the hope of gaining a long-term customer by hooking them at a younger age. Probably much less immoral than blocking ads on YouTube.
I think you are missing the point. The person who mentioned educational pricing was asking if there are any machines with comparable performance and silence for that little a price, and said that the educational price is €650.
Suppose I know of a non-Mac that has similar performance and silence for €1000 non-educational. To decide if that meets the requirement I'd need to either look up the non-educational price of the Mac to compare to €1000 or I'd need to look up the educational price of the €1000 machine (if it has one) to compare with €650.
They are more likely to get useful answers if they post the non-educational price so that people don't have to do extra work to figure out if they should respond.
Failing to block ads is immoral.
Failing to block ads is also bad security practice.
It seems plausible that doing so increases the price for those of us who don't.
The typical .edu discount from Apple on largish purchases is about $100, regardless of whether that's a $600 final tag or a $2000 final tag. So, somewhere between 14% and 5%.
If Apple sells 50% of Macs to the .edu discount market, that's a difference to you of somewhere between 2.5% and 7%.
Or, you can accept that Apple's prices are not set by the market so much as by their marketing department.
I believe build-to-order upgrades are also discounted, so it may not be a fixed discount.
So is paying the full price and signalling to Apple "we can afford it just fine, don't sweat about cutting margins or lowering extra disk/RAM pricing", but I don't see you complaining about that :)
Why would that be?
Companies tend to focus on the overall % profit margin for a product. If a higher percentage of sales are for a discounted (edu) SKU with lower margins, they will tend to raise the price of the product to hit their desired profit margin.
e.g. If a company was selling a product at $1000 and wanted to offer a 20% EDU discount that would be used by 50% of the market, they would need to raise the price by about 11% to keep the same margin. If only 20% of the market bought the discounted SKU, they could keep the same margin with only about a 4% increase in the price of the non-discounted SKUs.
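A quick check of that arithmetic (a rough sketch, assuming the goal is to keep the blended average selling price at the original $1000):

    50% edu share: 0.5*P + 0.5*(0.8*P) = 0.90*P = $1000  ->  P ~ $1111  (about 11% higher list price)
    20% edu share: 0.8*P + 0.2*(0.8*P) = 0.96*P = $1000  ->  P ~ $1042  (about 4% higher list price)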
You are free to purchase as many Apple products as you want to offset any perceived revenue losses from promotional discounts. I'm not so sure why you would want to do this but I keep hearing that behavioral economics is a thing, maybe paying more is your definition of rationality.
It's basic market economics. More discounted purchases tend to lead to an increase in the non-discounted price. Of course, that's baked in at the outset. Apple knows x% of sales come with an edu discount, so the non-edu price is offset to account for the edu discount. I don't have any problem with a vendor doing that. It's how they forecast a profit margin. Apple, apparently, has allowed "people who know someone in education can also claim an edu discount" to be part of their pricing model, which ultimately leads to increased prices for those who do not know someone with an edu email.
It’s trivially easy to obtain the discount. Anyone working in education, or a student at any level, k-12, higher ed, graduates with access to uni email can get it. Apple doesn’t ask any questions or for verification.
They also go on sale at a similar price to the general public relatively frequently.
We're all students of Internet University.
If you continue reading the sentence, it gets even more bizarre:
> it's around €650, including VAT.
Whatever taxes and discounts apply to the commenter’s own idiosyncratic situation have nothing to do with the price of the product.
A couple of years ago, I might have cared what the price of an M2 with Pasadena sales taxes was since I lived there at the time, but I sure wouldn’t have included them when talking about Apple prices here.
Similarly, VAT costs are between you and whatever jurisdiction you live in that’s levying them. Apple isn’t the one to thank or complain to about them.
Outside of North America, it would be bizarre to not include VAT. Anytime you see the € or £ symbol, you can assume it's VAT included.
I am outside of North America and have been for about 3/4 of my adult life.
The issue with adding VAT to prices on a forum with people living in a lot of different places is that VAT rates vary greatly from place to place.
To get an idea what an Apple product costs, it's more helpful to look at the price charged prior to taxes, tax deductions, educational discounts and other factors that will depend entirely on the specific cases of each reader.
It might be worth noting that outside North America, we all see and think about prices post-tax.
The fact tax was mentioned at all was likely for your benefit, with the knowledge there are a lot of North Americans here…
What is the point of not including VAT? You still have to pay it. It is not like you'll move across the continent just to pay a few percent less.
You're saying this to someone who twice took a 24-hour train from Beijing to Hong Kong to buy an Apple computer for 27% less due to HK not having an additional electronics tax.
A lot of my friends in Taiwan used to buy macs in HK for the same reason.
Except for burgerland, EU prices include taxes and are pretty much same in all jurisdictions, especially for electronics.
This is most likely because OP used Euros. In Europe, prices are listed including VAT. So in day to day life, you only see prices with VAT for your country included.
I would never compare prices without VAT.
Shush ;)
But also, a ton of people are absolutely in college, at any age, and new people are coming to HN every day. I'd think HN is an easier place to discuss and explore vaguely tech/startup-related topics than school.
Educational discount is a pinky swear that you are a student or buying it for a student.
So it is in practice available to everyone if you want to ruin nice things for students.
Don't you usually need to verify somehow, like an .edu address?
Canada also does not verify, don't ask me how I know. However, where I live now (Spain) they do have some 3rd party service verifying edu status.
I am actually currently a college student, but I can confirm that Apple did nothing to verify that this was the case when I bought mine.
That may be so in America, but in Germany I had to log in through my university's system in order to verify my student status.
Wow, really? Thanks for the tip.
Even Apple Store employees will freely give you the discount. Apple doesn't discount because they aren't a discount brand, but they will give you this discount if you ask.
But a ton of us have a friend/kid/relative in HS or college ...
The cheating/fraud encouragement in this thread is disgusting. You guys are not stealing a pencil but several hundred dollars. Paying part of it, or Apple being filthy rich, doesn't make it more honest.
I don't think I've ever encountered here such collective encouragement to bypass a law (OK, maybe except for jailbreaking, which is not fraud). Not sure if the demographics changed, societal culture changed, or it's just luck.
Edit: Oh, and yes, I never completed college, don't own a .edu address, and am maybe just subconsciously jealous.
Corporate perks brokers and Amazon also give 15% off.
But anyways, 15% is pretty much in the error-fuzz range of "which platform do I like better" and "dominated by RAM upgrade price".
very few on HN pay full price for Mac products ;)
> Surely very few of us on HN are still in college?
What makes you think that? I'm back in school getting a MEng degree in my 30s.
I mean... surely that's not very common among HNers.
> Pricewise, things start favoring PCs if you need more RAM, as Mac upgrades are costly.
That's the position I'm in, along with some other people I've talked to recently, too.
For our situations, the M4 would likely offer more than enough processing power, and the efficiency and physical size are attractive, but a maximum of 32 GB of RAM definitely isn't sufficient.
The M4 Pro's 64 GB of RAM is somewhat better, but the cost of those upgrades is very hard to justify.
I'd also prefer to use the system for at least 5 years, and likely up to 10 years, if not longer. Even if 64 GB is tolerable now, I can easily see it becoming insufficient for my needs before then.
The lack of reasonably-priced internal storage, while easier to work around than the lack of sufficient and reasonably-priced RAM, doesn't help matters, too.
Even if future Studio models, for example, might allow for a more ideal amount of RAM, I have to expect that unjustifiable upgrade costs will likely still be an issue, and then there's the wait on top of that.
I can easily see myself and the others I've talked to settling for PCs, rather than making unjustifiably-expensive Mac purchases.
In the same boat: I have a 5950X with 64GB of memory running PopOS, and there are times I'm hitting swap a lot more than I'd like. 16 or 32GB of memory is just not feasible, and even 2TB of storage would likely cause headaches; I have a 4TB and a 2TB NVMe at the moment, which will come with me on the next upgrade.
I'm leaning towards an upgrade next year to the 9950x3d if reviews pan out. Sure, it's going to be a bigger machine with louder components, but the upgrade will likely be half the cost of anything close from Apple since I can take my existing GPU, PSU and storage at the very least along with me.
And "upgrade costs" is highly misleading for most of the components. You are buying a different machine config that you can't change, up or down, later on. I get that most people don't want to bother opening up a PC to swap out components, but the easier they made it, the more people will do it, and Apple is running the other way.
Does your existing motherboard allow for more than 64GB ram? The 5950x itself supports up to 128GB: https://en.wikichip.org/wiki/amd/ryzen_9/5950x
Mine doesn't but yes I could move to a mATX or bigger board to unlock that extending its life. I tend to go for the 'smaller' ITX cases and boards, so currently have a x570-i setup maxing out at 64GB.
For storage at least, you can pop your existing nvme drive into a thunderbolt enclosure and use it on a mac mini. Over TB4, it should run at the drive's full speed (so long as you get a decent enclosure).
It won't help the RAM situation, but storage at least is upgradable like that.
Be careful to check support for larger RAM amounts on the motherboard as well as the CPU - I've got an AM5 setup with 128GB of RAM, but it had to be downclocked to even POST.
Memory usage is not comparable across Linux and Mac. MacOS is much better at avoiding swap, uses memory compression, shared frameworks etc. At the same time it tries to use all the memory available which makes direct system-wide comparisons not accurate. A good rule of thumb is that 8GB on Mac == 16GB on Windows/Linux.
Compared to Linux that's flat not true. Someone has been blowing marketing up your ...
As a person who uses both daily, it's kinda true.
macOS does seem to "use all the RAM" but never falls over itself.
I think the kernel is likely genuinely better in low-memory conditions (it's hard to be worse than Linux here, to be honest) - and that's combined with being aggressive about opportunistically using as much of the RAM as is available (not fully unloading applications when closing them, for example).
"WindowServer" uses 2-3G of RAM, and Electron apps use lots too; but truthfully my MacBook is able to sustain significantly more open programs than my Linux laptop, despite my Linux laptop actually having more memory (32G vs 24G for the Mac).
I can't explain it and I am genuinely curious how this is the case, but at least anecdotally, the parent is more correct than not.
You can also compress memory on Linux with zswap or zram.
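On NixOS, for example, zram is a couple of lines of configuration (a minimal sketch using the stock zramSwap module; double-check the option names and defaults against the manual):

    {
      # Compressed swap device backed by RAM
      zramSwap.enable = true;
      zramSwap.memoryPercent = 50;   # size the zram device at 50% of physical RAM
      zramSwap.algorithm = "zstd";   # compression algorithm
    }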
For what it's worth, the apple silicon machines are much more efficient on RAM than most - a 16gb m1 absolutely mops the floor with the 32gb of ram I have in my thinkpad with an i7. It's not really even close.
Your comment might win you the argument on a random non tech forum but not here.
much more efficient in what? mops the floor by what? which year's i7?
Don't get me wrong, I 100% believe that's what happened, but if you mean "my MacBook is faster than my i7 ThinkPad" you should use those exact words and not bring RAM into this discussion. If you want to make a point about RAM, you need to be clear about what workload you were measuring, the methodology you were using, and what the exact result was. Otherwise your words have no meaning.
Repeating what I just commented elsewhere, but Mac uses several advanced memory management features: apps can share read-only memory for common frameworks, it will compress memory instead of paging out, better memory allocation, less fragmentation.
Bandwidth for copying things into memory is also vastly faster than what you get on Intel/AMD, for example on the Max chips you get 800GB/s which is the rough equivalent of 16 channels of DDR5-6400, something simply not available in consumer hardware. You can get 8 channels with AMD Epyc, but the motherboard for that alone will cost more than a Mac mini.
Sharing read-only/executable memory and compressed memory are also done on Windows 10+ and modern Linux distributions. No idea what "better memory allocation" and "less fragmentation" are.
800GB/s is a theoretical maximum but you wouldn't be able to use all of it from the CPU or even the GPU.
https://www.anandtech.com/show/17024/apple-m1-max-performanc...
>apps can share read-only memory for common frameworks
How is that different from plain shared libraries?
System design and stability. On MacOS a lot is shared between applications compared to the average Linux app. Dynamic linking has fallen out of favor in Linux recently [1], and the fragmentation in the ecosystem means apps have to deal with different GUI libraries, system lib versions etc, whereas on Mac you can safely target a minimum OS version when using system frameworks. Apps will also rarely use third party replacements as the provides libraries cover everything [2], from audio to image manipulation and ML.
[1] https://lore.kernel.org/lkml/CAHk-=whs8QZf3YnifdLv57+FhBi5_W...
[2] https://developer.apple.com/library/archive/documentation/Ma...
People who need 64GB+ RAM are not running 1000 instances of native Apple apps. They run docker, VMs, they run AI models, compile huge projects, they run demanding graphics applications or IntelliJ on huge projects. Rich system libraries are irrelevant in these cases.
This thread started as question on how MacOS is more efficient, not the usefulness of more RAM. In any case, you might still benefit from the substantial increase in bandwidth and lower system / built-in apps memory usage, plus memory compression, making 16GB on Mac more useful than it seems.
I can run apps with 4 distinct toolkits on Linux and memory usage will barely go past the memory usage of opening one Facebook or Instagram tab in a browser. Compare that to compiling a single semi-large source file with -fsanitize=address, which can cause a single instance of GCC or Clang to easily go past 5G of memory usage no matter the operating system...
I'm talking about memory bandwidth - maybe your workloads don't take advantage of that but most do and that's why apple designed their new chips to take advantage of it.
Bandwidth doesn't replace size, these are orthogonal parameters.
I never said it did.
So why are you commenting in a thread about size?
With what kind of workload? If you need 32GB of RAM then the computer that actually has that much RAM should be almost guaranteed to be quicker.
Video Editing. Backend and Frontend development utilizing docker containers. Just browsing the web with tons of tabs. Streaming video while doing other stuff in the background. Honestly most things I'd rather do on my M1.
So probably nothing that actually needs more than 16GB of RAM then. And realistically comparing M1 to an i7 several years older than it.
Having more RAM doesn't increase memory bandwidth and having more memory bandwidth doesn't necessarily mean better performance. You aren't even able to make use of all of the bandwidth your M1 is capable of in the real world [1].
Apple Silicon has good perf/watt but the gap probably isn't as big as you're thinking.
[1] https://www.anandtech.com/show/17024/apple-m1-max-performanc...
When did I say having more RAM increased memory bandwidth? Are you having a separate conversation with yourself right now? I feel like you might have misinterpreted what I originally said and just ran with it.
There is a lot of misleading information about Apple's unified memory out there so I mentioned it just to be clear.
I have a M2 pro with 32G and I hit memory limits just from web browsing.
Not sure what you mean by 'efficient'; they are faster for sure (amazing memory bandwidth thanks to on-package memory), but to my knowledge they would be the same for the amount of data stored. So that same ThinkPad will likely be faster at tasks that need 24GB, for example; it highly depends on the use case, as always.
Memory requirements for general-purpose desktop usage usually don't come down to a single task with a large working set that needs to fit in RAM in its entirety. It's more often a matter of the aggregate memory usage of many tasks, which means that in practice there's a wide gray area where the OS can make a difference, depending on the effectiveness of its memory compression, swap, signalling memory pressure to applications, suspending background tasks or simply having fewer of them in the first place.
Good points well made, macOS likely is more 'memory efficient' in that regards.
I just struggle to believe this. What OS is the ThinkPad running, and did you set it up yourself (i.e., no corporate crud)?
I run Ubuntu on my Thinkpad - I generally notice the biggest difference with video editing, but really multitasking anything is night and day because of the memory bandwidth. I use the same software on both machines for video editing, Davinci Resolve.
Nix works well on Mac, very similar to Nix on Linux for the most part. There are some missing packages, but the common ones tend to be fine. It's worth using the Determinate Systems installer to avoid reinstalling Nix on every macOS update, though.
Nix-darwin is good, and I use it, but it is nowhere close to NixOS. I think there are some options I've set through it that macOS keeps overriding, so the declarative configuration drifts from the real one eventually
I think the only real issue with Nix on macOS is that Nix can eat through storage quite quickly, and storage upgrades are pretty expensive on Macs. This might push the balance back to a fanless Ryzen build.
> Its worth using the Determinate Systems installer to avoid reinstalling Nix on every macOS update though.
I've had Nix installed from the traditional installer since the very first M1 and never had to reinstall nixpkgs across OS updates.
I guess some people have been addressing the clobbering of /etc/bashrc with full reinstalls?
> I think the only real issue with Nix on macOS is that Nix can eat through storage quite quickly, and storage upgrades are pretty expensive on Macs. This might push the balance back to an fanless ryzen build
Only if you want to be able to roll back multiple versions. Otherwise, I think it is fine.
Indeed I've been liberally using nix-collect-garbage -d after every darwin-rebuild switch without issues for years.
I've been using Nix on macOS for almost a year. The good (and bad) thing about Nix is that it supports many different use cases, so you have to spend some time understanding the options before you can even figure out which flavor to install.
A good way to get started is to start using Nix to replace/supplement Homebrew. You can install Nix in addition to Homebrew and have some packages installed by one and some by the other. You can uninstall a Homebrew package and then reinstall it with Nix. You can even remove it with Nix and go back to Homebrew if you like.
I would generally recommend the following:
1. Use the Determinate Systems Nix installer, see https://zero-to-nix.com/start/install
2. Use "Flakes" (unfortunately the core documentation isn't updated for flakes)
3. Use "Home Manager" -- I would recommend the Flakes-based "Standalone setup": https://nix-community.github.io/home-manager/index.xhtml#sec...
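For what it's worth, here's roughly what the Flakes + standalone Home Manager combination can look like (a minimal sketch; the username, home directory, and packages are placeholders):

    {
      inputs = {
        nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
        home-manager = {
          url = "github:nix-community/home-manager";
          inputs.nixpkgs.follows = "nixpkgs";
        };
      };
      outputs = { nixpkgs, home-manager, ... }: {
        # "alice" is a placeholder username
        homeConfigurations."alice" = home-manager.lib.homeManagerConfiguration {
          pkgs = nixpkgs.legacyPackages.aarch64-darwin;
          modules = [{
            home.username = "alice";
            home.homeDirectory = "/Users/alice";
            home.stateVersion = "24.05";
            # CLI tools come from Nix...
            home.packages = [ nixpkgs.legacyPackages.aarch64-darwin.ripgrep ];
            # ...and dotfiles are managed declaratively
            programs.git.enable = true;
          }];
        };
      };
    }

Activate it with something like `home-manager switch --flake .#alice`.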
I would wait on nix-darwin until you are sure you need/want it. (I have recently started using it for its support of the `linux-builder` feature, but not everyone needs that.)
As a software developer who uses macOS to develop for Linux, it is a great tool and I cautiously recommend it to those who are willing to deal with some learning curve and frustration.
I haven't yet used nix-darwin enough to make a recommendation one way or another. (But the `linux-builder` feature is compelling if you need it: https://nixcademy.com/posts/macos-linux-builder/)
A comparable option in my opinion would be Minisforum 790S7. They also have a separate mini-ITX motherboard from that one if you want to DIY.
The CPU in it is faster in raw multi-thread performance, single-threaded it's a bit slower, but still quite impressive.
The only problem I had with Minisforum is that they couldn't supply the exact hardware I ordered, and their suggested solution was either to wait 1+ month or get a slightly different configuration. Two times out of two.
Quality-wise they're pretty good though, no complaints there.
I can't find a Minisforum 790S7 for anywhere near the price of the base model mac mini. I am seeing $459.00 USD and that is "BAREBONE (NO OS/RAM/SSD)" [1]. I am comparing this to the M4 Mac Mini base model, that does indeed come with an OS, RAM, and an SSD[2] at $499 USD.
[1] https://store.minisforum.com/products/minisforum-mini-itx-pc...
[2] https://www.apple.com/us-edu/shop/buy-mac/mac-mini/apple-m4-...
Welp. You're definitely correct. But that's the only machine in my opinion that comes close (and offers some advantage like a whole PCIe 5.0 x16 slot). There are other mini PCs that are cheaper, some other commenter suggested Beelinks which are also quite popular among enthusiasts, SER8 for example: Ryzen 8745HS, 24 GB RAM and 1 TB SSD for 467 Euro. Seems competitive enough.
Maybe it's not performance-comparable but $284 (BF35 coupon discount from $319 list) for Ryzen 5, 16GB RAM and 1TB SSD [1] in my mind is a good value trade-off versus the Mac Mini. The only thing that gives me pause is the concern expressed by some that Chinese MiniPCs are susceptible to Bios malware. I've looked into Coreboot, Libreboot and System 76 open firmware to mitigate the risk of infected Minisforum firmware but there's always the possibility of it crippling the device which would be a big time-loss more than anything.
Other flavors of malware are easily removed with a quick Windows reinstall before use but potential firmware infections are a good reason to pay more for mainstream PCs.
[1] https://store.minisforum.com/products/minisforum-um760-slim
That's a valid concern but I personally avoid going into this rabbit hole just for the sake of my (already fragile) sanity.
Speaking of issues, the other one with these low cost mini PCs is low-quality SSDs. The one that my UN100D was supplied with was pure garboleum in terms of speed so it had to be replaced.
I've recently bought two minisforum PCs on Amazon. I fully expected the SSD to be garbage and to throw them out. To my surprise, they were decent-ish Kingston TLC PCIE 4.0 SSDs. Definitely not the cheapest SSD on the market.
The Minisforum needs an external power brick, which probably almost doubles the size.
The Mac Mini does not need any external power adapter which is quite amazing.
Beelink EQR6 has an internal PSU and is also quite small, a bit smaller in footprint actually. It even comes with two full-size m.2 slots and expandable RAM.
Mini is great, exceptionally so, I actually just got a rather souped-up one (that's the reason I'm in this thread) but x86 vendors are catching up and there's a certain possibility that more established brands will pick up.
An external power adapter means a DC power input, which can be upgraded with a LiIon battery. It’s more expensive to do for AC.
That’s a very nice upgrade in many places, and for many scenarios. A power brick, meanwhile, is easy to hide out of sight.
For the base model, but any upgrade on the Mac will kill that advantage instantly. For those keen enough to solder better parts on it the Mac Mini base model is the worst kind of barebone, filled with components that shouldn't even be produced anymore.
> components that shouldn't even be produced anymore
That's kinda harsh. For what it's worth, the base model isn't that bad and the storage can be (theoretically) upgraded down the road, even though it might cost a fair bit more than a standard m.2 SSD.
Sure, in raw compute it's slower than competition, but objectively it's still plenty fast and more trustworthy in terms of reliability.
That's a good suggestion. How are the cooling systems supplied with Minisforum in terms of noise and efficiency?
I can only speak of experience with their UN100D mini PC which I use as a home server in a fairly constrained closet. Not a hot machine by any means, far from that, but the cooling seems pretty decent even for this low power CPU.
BD790i that I just received (didn't even install anywhere) has a rather substantial heatsink over the CPU similar to high wattage GPUs. The chip itself is rated at 55W TDP so it shouldn't be that much of a problem cooling it. The motherboard doesn't come with a fan and I'm definitely not going to spend extra on high-end ones at least just yet.
I don't know anything about the thermals of 790S7's case though. It looks like they gave it at least some thought judging by the duct over the CPU, but how it actually performs I have no idea.
> One can potentially use Nix on a Mac Mini to keep similar development environments to those used in Linux, but AFAIK some packages are not supported on ARM. Any experiences using Nix and nix-darwin as a daily driver?
Been using that ever since M1 became a thing; nothing worth mentioning, "not supported" is vanishingly rare in practice.
Seconding this.
Made the switch back to MacOS with the M series after being on x86 and NixOS prior.
The experience was refreshingly tepid.
The configuration being tested by OP is a $1499 model.
Nix support on Darwin is really good in my experience. Though it does happen that some packages are not supported.
Next to that I run a NixOS VM on https://getutm.app/ using Virtualization Framework. Performance is great.
You're one setting (`virtualisation.rosetta.enable = true;`) away from also using that VM for x86_64 packages and builds.
I have a WIP PR for Rosetta AOT caching on NixOS as well: https://github.com/NixOS/nixpkgs/pull/330829
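For reference, the guest-side configuration is small (a sketch of how I'd expect the NixOS side to look; I'm least sure the extra-platforms line is strictly required, so treat that part as an assumption):

    {
      # Expose macOS's Rosetta translation to the NixOS guest
      # (only works when the guest runs under Apple's Virtualization framework, e.g. UTM's "Apple" backend)
      virtualisation.rosetta.enable = true;

      # Let Nix schedule x86_64-linux builds on this aarch64-linux guest
      nix.settings.extra-platforms = [ "x86_64-linux" ];
    }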
How is the graphics/desktop performance in VMs, do you get accelerated graphics with NixOS?
Beelink has dropped the price for a roughly comparable unit with an Intel Core Ultra from 800 US Dollars to $700.
This is the best answer.
The Beelink (and other mini-PC brands) offer comparable performance.
The fact they offer lots of different configurations lets you choose your own trade-offs.
Assuming you don't have an operating system preference, the base model Mac Mini is tough to beat outright, but as you upgrade it there are other options that get interesting.
> The Beelink (and other mini-PC brands) offer comparable performance.
[0] has the new (base!) M4 at 3859 (single) and 14837 (multi) whilst [1] has the Ultra 5 125H 4500 at ~2200 (single) and ~11500 (multi). "comparable" is doing a lot of heavy lifting in your sentence.
[0] https://arstechnica.com/apple/2024/11/review-the-fastest-of-...
> new (base!) M4 at 3859 (single) and 14837 (multi)
I think that's the M4 in the base iMac (8 core), which is different (and less powerful!) than the M4 in the base Mac Mini (10 core).
The Beelink is 32GB RAM + 1TB storage though.
Like I said: the base Mac Mini is tough to beat!
Beelink are awesome little machines!
After experiencing NixOS it's hard to settle for anything less. Nix is only good for running home-manager; nix-darwin is mostly a joke (I don't mean disrespect, I appreciate the work devs are doing, but limitations of the platform cripple the entire experience).
FWIW, Mitchell Hashimoto runs NixOS VM on his Mac for development. And that's the option I'm gonna implement once I get my MBP from repairs.
There was a comparison mentioned on last week’s episode of the Accidental Tech Podcast, I don’t remember if it was pointed out to them or they noticed it themselves.
Base Mac mini: $599, 16/256 GB
Double storage and ram: $600 upgrade.
Price of 32/512 config: $1199
Two 16/256 machines: $1198.
The RAM is (Apple-)reasonable at $200, but $400 for the storage doubling is insane.
While I am sure there are performance differences, I picked up 128GB of DDR4 a few months ago for ~$200. That is some wild margin.
The differences actually are quite huge. As far as I know, the M series chips all use LPDDR5 RAM, which is indeed more expensive than the DIMM/SO-DIMM modules you would add to your diy build.
Still, you can easily get a good kit of DDR5 DIMMs for 110€/32GB from a retailer. So while LPDDR5 RAM is more expensive, it is most certainly not expensive enough to justify a 230€/8GB price as being driven by BOM costs
Also, the 256 uses 2x128 drives while the 512 uses a single drive, so you even get slightly slower storage with the upgrade. The base model is a great deal.
This is true for the Mac Studio but not for the M4 Mac Mini -- they all have a single storage slot and the only difference between the 256 and 512 models is the model of the NAND chips.
You're right thanks. The change was to 2x128 NAND for the 256 drive from the previous generation, not 2xSSD.
Do you have a source on that? It's not what I've read elsewhere, but I'm having trouble finding anything specific.
I had to look it up, and I had gotten that slightly wrong. The 256 did switch to 2x128 NAND while the previous model used 1x256, but the 512 is also multiple. https://www.macrumors.com/2024/11/08/m4-mac-mini-modular-sto...
I think you have it the wrong way round? RAM is 400 (200/8GB), storage is 200.
Oh do I? Well somehow that seems more insane to me.
At this point maybe the main benefit of an x86 machine is it has a spot to plug in an Nvidia GPU :-)
That and Linux support is better, if you need it.
And you can pick up a used one for less than 1/10 the cost of a Mac Mini and it will be perfectly adequate for most people.
I appreciate the technical achievement of the M4 Mac Mini for what it is, but an old x86 machine is more than adequate for my home or even work needs.
I use nix-darwin on an M2 as a daily driver. It works great! For a few quirks you need to go with Brew (mostly graphical applications), but my setup is almost identical between my NixOS and my nix-darwin setups other than that (and some OS toggles).
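In case it's useful, here's roughly what that split looks like as a flake-based nix-darwin config (just a sketch; the hostname, packages, and casks are placeholders, so adapt to taste):

    {
      inputs = {
        nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
        nix-darwin = {
          url = "github:LnL7/nix-darwin";
          inputs.nixpkgs.follows = "nixpkgs";
        };
      };
      outputs = { nixpkgs, nix-darwin, ... }: {
        # "m2-mini" is a placeholder hostname
        darwinConfigurations."m2-mini" = nix-darwin.lib.darwinSystem {
          modules = [{
            nixpkgs.hostPlatform = "aarch64-darwin";
            # CLI tools via Nix
            environment.systemPackages = [ nixpkgs.legacyPackages.aarch64-darwin.ripgrep ];
            # Graphical apps still come from Homebrew, but the cask list lives in the Nix config
            homebrew.enable = true;
            homebrew.casks = [ "firefox" ];
            system.stateVersion = 5;
          }];
        };
      };
    }

Applied with `darwin-rebuild switch --flake .#m2-mini`.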
While I suspect it is silent for most tasks, the M4 mini isn’t fanless.
For my music studio I’ve enjoyed the M1 mini since it is totally silent and am eager to read some noise tests on the new M4 mini.
In Germany you can get the cheapest (base version, 16GB RAM/256GB SSD) M4 Mini for 579€ via Unidays edu discount (also including VAT).
I picked mine up from the post office yesterday, it's 50% faster in Geekbench single/multi-core CPU benchmarks than my M1 Pro Macbook Pro and about as fast in GPU performance. Impressive.
Daily nix user across Mac and Linux, though I use Mac for actual development. No problems here moving between the two with my dev env defined on GitHub [0]
How do you handle the different keyboard layouts (cmd and ctrl on Mac, ctrl and superkey on Linux)?
I'm using a Mac at work and Linux at home with a programmable keyboard but I didn't find a solution to "merge" cmd and ctrl on Mac, so I still need to use both on Mac (not a big drama, but slightly annoying).
My half solution is to use a keyboard that physically feels quite different to help my brain use a different mode. The Linux keyboard is a big heavy mechanical keyboard while on the macbook I just use the built-in keyboard.
It's not a perfect solution and I still make mistakes, but it helps.
>One can potentially use Nix on a Mac Mini to keep similar development environments to those used in Linux, but AFAIK some packages are not supported on ARM. Any experiences using Nix and nix-darwin as a daily driver?
Would this run anything Docker/ARM?
My entire home server setup is Linux/Dockerized and the Mac Mini hardware looks so good, but the more I read about MacOS as a server OS the worse it seems to get.
Maybe for a little server or something, but with the hard-to-upgrade 256GB of storage... I don't get the appeal. Also, 16GB of memory is extremely limited these days. Again, perfect for a little server, but not for a daily driver.
For someone who wants to develop and build apple ecosystem apps or cross-platform an existing game this is a steal.
It's a desktop computer. What's keeping you from using an external SSD?
For large media, sure, but I really don't want to pay a premium and then need to manage where my everyday apps are installed because I only have 256GB of storage I can't upgrade.
The storage is easily upgradable on the new Mac mini as it’s using a standard M2 SSD (a short version)
The video is desoldering chips, getting new compatible ones and soldering them back (exactly like Mac Studio).
I wouldn't call that easy. Most people would call it the exact opposite.
Except you have to use proprietary SSDs otherwise soldering is involved. 1/10
>With education discount, it's around €650,
It is €579 including VAT. The educational-discount price in the US is only $499. €650 sounds wrong. Where in the EU is that?
Data point: base model Mac mini is 719€ in Spain, 599€ with the educational discount.
There is a Linux for Apple silicon called Asahi Linux.
Pretty sure only M1/M2 are supported, so none of Apple's new offerings will fly... yet.
Shame, I'd love to use Linux on Apple's latest and greatest MacBooks, but will stay with tried and true Dell Precision series until the year of the Linux Apple laptop becomes a reality.
I can at least attest to Nix on ARM Linux working pretty well, as I use NixOS on Asahi Linux. Unfortunately, M4 support is not there yet.
Asahi M4 support and 3rd-party SSD upgrades[1] will result in me buying an M4 Mini. Glad to hear that Nix works well on Asahi!
[1] https://wccftech.com/m4-mac-mini-ssd-already-modded-to-2tb-s...
650$-€ for 16GB and 256GB? No. Hell no.
> Any experiences using Nix and nix-darwin as a daily driver?
Not positive ones, even on x86 Darwin. Homebrew feels a lot more stable, which is a decidedly concerning thing for the average Nix enjoyer.
What kind of issues are you having?
Well, missing packages for one. Nix prides itself on having one of the most complete package catalogs of any Linux package manager, and on Mac it leaves quite a bit to be desired. A lot of functionality has to be hooked in via home-manager scripts that are a lot less stable than NixOS modules, and since your system isn't built/upgraded by Nix you can't write overlays for a lot of Mac software. If you only need the versioned Flake management then it might be an okay option, but I found myself pretty frustrated by it in places that aren't an issue on Linux. I can't comfortably use it as a Homebrew replacement.
Also, my Mac is 256gb which feels far too cramped for Nix. I'd really only recommend it if you're working with 512gb or more.
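For anyone wondering what the "versioned flake management" use case looks like in practice, it's roughly a flake like this shared between the Linux box and the Mac (a minimal sketch, not the poster's actual setup; the pinned release and package names are just examples, and anything not packaged for aarch64-darwin is exactly where the missing-package pain shows up):

    # flake.nix — one pinned dev environment for both machines
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";
      inputs.flake-utils.url = "github:numtide/flake-utils";

      outputs = { self, nixpkgs, flake-utils }:
        flake-utils.lib.eachSystem [ "x86_64-linux" "aarch64-darwin" ] (system:
          let pkgs = nixpkgs.legacyPackages.${system};
          in {
            # `nix develop` drops you into the same pinned toolchain on either OS
            devShells.default = pkgs.mkShell {
              packages = [ pkgs.go pkgs.nodejs pkgs.postgresql ];
            };
          });
    }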
Yeah, I eventually realized that nix-darwin is only good for managing the list of homebrew casks.
home-manager works better than I'd expect. At least I don't have to spend time configuring vim, git and shell prompt.
But overall, Nix on macOS feels like a hack, behaves like a hack... because it's essentially a hack...
I'm waiting for the first reports of Lenovo's t14s gen 6 with Snapdragon X Elite.
> Yeah, I eventually realized that nix-darwin is only good for managing the list of homebrew casks.
If that's really all you use it for, and you already use Home Manager, the Homebrew module for Nix-Darwin works fine in Home Manager with some small tweaks. I use that with a custom, per-user Homebrew prefix to keep Homebrew out of the way except for the task of installing casks.
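Concretely, the two modules being talked about here look something like this in a nix-darwin / Home Manager config (a sketch only; the option paths are the stock modules, while the casks and git identity are placeholder values):

    # nix-darwin: declare the casks, Homebrew does the actual installing
    homebrew = {
      enable = true;
      casks = [ "firefox" "rectangle" ];
    };

    # Home Manager: the dotfile-ish bits that carry over from Linux unchanged
    programs.git = {
      enable = true;
      userName = "Example Name";      # placeholder
      userEmail = "me@example.com";   # placeholder
    };
    programs.vim.enable = true;
    programs.starship.enable = true;

(The custom per-user Homebrew prefix mentioned above is extra plumbing on top of this and isn't shown.)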
Well, Asahi is going to bring M4 support eventually, and there is https://github.com/tpwrules/nixos-apple-silicon
Also, you could just Containerize All The Things on macOS, idk.
There's no container support on macOS, unfortunately, except via frontends for Linux virtual machines.
(Well, they do have a thing called containers, but it's a desktop app security/sandboxing feature that does something conceptually different.)
There's an extremely experimental/feature limited 3rd party implementation of macOS native containers. It requires disabling all sorts of security features, though.
Ya, like, I get the love for Linux, but seriously give macOS a try for a while, it's pretty damn good lol
macOS simply doesn't work if I want to run this as a home server, which is my primary use case for an Apple Silicon Mac. Most server applications are first-class citizens on Linux, like Docker, Kubernetes, and Caddy/nginx (I know ports exist, but there's more documentation and experience on Linux). Furthermore, systemd is much better documented than launchd, and generally speaking it's easier to do things like upgrading headless, setting up NFS, and the like. I wish Apple offered these machines with official Linux support, but that's antithetical to their philosophy.
For most developer things it's slightly worse, so having to pay extra for it is a bit silly.
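On the launchd-vs-systemd point, the same toy service declared through NixOS and through nix-darwin looks roughly like this (a sketch; "my-web-app" and the Caddyfile path are made up, and the nix-darwin options map straight onto launchd plist keys, so you still end up reading launchd documentation either way):

    # NixOS (Linux): systemd unit
    systemd.services.my-web-app = {
      description = "toy web app";
      wantedBy = [ "multi-user.target" ];
      serviceConfig = {
        ExecStart = "${pkgs.caddy}/bin/caddy run --config /etc/caddy/Caddyfile";
        Restart = "on-failure";
      };
    };

    # nix-darwin (macOS): launchd daemon for the same thing
    launchd.daemons.my-web-app = {
      serviceConfig = {
        ProgramArguments = [ "${pkgs.caddy}/bin/caddy" "run" "--config" "/etc/caddy/Caddyfile" ];
        RunAtLoad = true;
        KeepAlive = true;
      };
    };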
Just put Asahi on there, or any Linux you like. I'm sure the support will be there soon if it's not already.
Asahi Linux doesn’t currently support M4 but it’s planned for next year.
> or any Linux you like
That's not a thing. Apple silicon doesn't use EFI so you need a completely custom ROM to satisfy the boot process, hence Asahi. And Asahi doesn't support M4 and likely won't for a while.
ARM64 support for GUI apps (via Flatpak in the Fedora Asahi Remix) is also pretty poor, though your standard fare of CLI apps is present.
It is a thing. Sure, it's not easy. You can't just download the Debian build for M1, but it can be built.
M4 support will come.
NUC 14th gen with i3 is around 400 EUR with VAT, with no RAM or storage. For the other 250 EUR, surely you can get more RAM than 16 GB and more storage than 256 GB.
I use a NUC as a daily driver. The problem with NUCs is that cooling is suboptimal: the fan is small and thus noisy. It can be fixed with a third-party case, but that's at least €60-100 more for a much slower machine. Plus, you may void the warranty by transplanting the motherboard.
It’s a shame that ASUS cancelled the NUC Extreme line. I know it’s quite a bit bigger than other NUCs. But the 13 Extreme had expandability, good cooling, and fast CPU options.
I have an i7 NUC13 mounted on the back of a monitor and I can barely hear it. It's not that bad; previously (in the NUC7 era) it was much worse.
They're usually the same 30-50 watt processors used in x86 laptops; once we got past the "dual-core" ceiling they became extremely capable.
> NUC 14th gen with i3 is around 400 EUR with VAT
4 cores instead of 10 cores, 69W TDP instead of 22W, UHD Graphics 730 versus Apple's 10 core GPU (0.5 TFLOPs vs about 4.3), 23% worse single core performance, 45% worse multicore, and much louder cooling.
It's not a fair comparison.
So get a stronger one. There will be always something that is better on one or another side.
I have 13th gen i7, with 64 GB RAM and 2 TB ssd (and 2.5 GbE). It was 800 EUR + VAT, last year. How much would similar Mac Mini cost?
Not a fair comparison either.
Edit: 69W for the NUC is not the TDP. It's the 69W power brick that ships with the machine.
What do you get in https://browserbench.org/Speedometer3.0 ? The M4 mini is 45, it seems.
16.6 in Firefox, 19.8 in Chrome (Ubuntu 24.04, both browsers in flatpak).
How noisy is it?
I made a fanless NUC 7 (!) years ago with a special case, and it's perhaps due for replacement.
It can still be heard, but it's not as annoying as it was. The fan noise is, how to describe it, softer?
Comparing a NUC7i7DNKE to a NUC13ANKi7.
It's almost half the price.
RAM is completely fair.
Apple is certainly out of their mind on storage. But on a desktop it's trivial to plug in an external disk that you can buy at an absolutely reasonable price instead of the insane Apple one.
$200 for 8GB of extra RAM is not fair ($200 to go to 24GB, another $200 for 32GB). 64GB DDR5 kits can be had for less than $200.
And while you can plug in storage, I think it does kind of ruin the appeal of having such a small device when you end up with a bunch of spaghetti cables. And if the boot drive goes, it's not easily replaceable, and the machine is a brick until it is.
I bought a really small no-brand mini PC with an older Intel CPU. It runs Debian quite well, but it's noisy and slows down a lot, noticeably once it gets hot. But it's cheap: around $100-150 on Chinese e-commerce websites, with memory and disk included.
Apple does amazing stuff. But it's very pricey in most markets and unaffordable to those on a budget.
I see a lot of reviews that say things like this but seem to be written by people who aren't testing against commonly available mini PCs that are built on efficiency architectures.
How different is the efficiency of this compared to something like an Intel N100/200/300 or a Ryzen 7 7735HS that you can get in cheap mini PCs from manufacturers like Beelink?
I am not doubting that Apple's processors are class-leading but at the same time it seems like I see a lot of people impressed that a mini PC can idle under 10 watts. That's been common for a long time now.
I have an N100 in that listing on the linked GitHub project, it's the best Intel system I've tested, and it gets around 2.5 Gflops/W (which is a little less than a Raspberry Pi 5, which is not known for being the most efficient Arm system).
These auto-generated comparison pages are not useful.
All his tests are published here: https://github.com/geerlingguy/top500-benchmark?tab=readme-o... He does test some N100 machines.
TL;DR: The M4 blows everything away, but isn't a general-purpose machine. The N100 gets slightly better than 50% of the power efficiency of the remainder. It's also slightly faster, despite having fewer cores. Single-core speed looks to be twice that of the ARM options.
An N100 box with 16GB of RAM and 1TB of NVMe is around the US$200 mark, which is far cheaper than the Mac or Ampere, but in line with the other ARM options. It comes in more form factors, with more customisability options than you can poke a pointed stick at.
All in all, it doesn't fare too badly. The low price, fast single-core speed, and compatibility with everything make up for a lot of sins.
I'm pondering the downvotes. My current theory is that they are over this:
> The M4 blows everything away, but isn't a general purpose machine.
Perhaps that upset Apple fans.
My definition of a "general purpose machine" is one that I can realistically write software for. I've written operating systems (with a custom network and GUI stack) in the past. Doing that requires hardware that isn't locked down and, somewhat less obviously, is well documented. As in "Intel Architecture Manual"-style documented, or failing that, at least open-source drivers like the Linux i915 driver. Apple doesn't come close to meeting those criteria.
That's a shame, because their consumer hardware is stellar. If it were open, I would choose it over anything else.
A few of the x86 CPUs used are 4-7 generations old. Not sure what to make of that.
I do indeed appear to be the idiot who missed the link within the article that had exactly what I was looking for.
How does one achieve "idling" like this? "In 1.25U of rack space, you could run three Mac minis, idling around 10W, giving almost a teraflop of CPU performance."
A Faustian deal. Worth the pound of flesh?
But you have to realize this computer only “partially” belongs to you.
How so? They don’t prevent you installing other operating systems.
Please elaborate.
Best quote from the post: “If only they didn't put the power button on the bottom“