There are a ton of products on the market that are vastly more dangerous than computers: guns, cars, motorcycles, bicycles, chainsaws, table saws, cigarettes, alcohol, junk food. Yes, consumers do sometimes harm themselves by using these products. That's the price of freedom. I think it's bizarre that we treat computers as if they were the most dangerous products in the world, somehow demanding paternalism, when none of these other products are locked down by the vendor.
The reason that computers are locked down by the vendors is not that computers are somehow more dangerous than other things we buy. The reason is simply that it's technically possible to lock down computers, and vendors have found that it's massively, MASSIVELY profitable to do so. It's all about protecting their profits, not protecting us. We know that the crApp Store is full of scams that steal literally millions of dollars from consumers, and we know that the computer vendors violate our privacy by phoning home with "analytics" covering everything we do on the devices. This is not intended for our benefit but rather for theirs.
Not only profits, but control. Remember the whole CSAM scanning debacle from Apple?
There's a difference between being able to buy something dangerous and being forced to do so
Forced? I'm not sure I understand.
My guess is that you're assuming, wrongly, that vendor locked devices are "safe" and unlocked devices are "unsafe".
All computers that are connected to the internet are unsafe in some ways. The most dangerous apps on your computer are the vendor's own built-in web browser and messaging app.
Also, the vendor-controlled software stores are unsafe cesspools. You will never find a more wretched hive of scum and villainy. Moreover, the vendors deliberately make it impossible for you to protect yourself. For example, iOS makes it difficult or impossible to inspect the file system directly, and you can't install software such as Little Snitch on iOS that stops 3rd party apps—as well as 1st party apps!—from phoning home.
In any case, most computers, including Apple computers, have parental controls and the like, so you can lock down your own device to your heart's content if you don't trust yourself, or you don't trust the family member that you're gifting the device.
Today, yes, I can lock down the iPhone I give to my son, but if it can be unlocked to run arbitrary software then he can in theory unlock it. Yes, it is on me to continue to monitor the device to make sure he hasn't done it, but the point stands
And the assumption you refer to, there are varying definitions for "safe". Is a device with a locked bootloader 100% safe in all use cases and all circumstances? Of course not. But me being able to reasonably trust that someone hasn't put a compromised version of the OS on the device, or, won't be able to put a different firmware on the device to brute force my encrypted contents is a bit of safety in a certain set of circumstances that I want in my device
If Apple, or anyone else, were precluded from locking the boot loader yes, I would be forced to buy a device that the FBI or anyone else could in theory poke around on enough to try to get at my data
> Today, yes, I can lock down the iPhone I give to my son, but if it can be unlocked to run arbitrary software then he can in theory unlock it. Yes, it is on me to continue to monitor the device to make sure he hasn't done it, but the point stands
You're scared of the wrong thing. The greater danger isn't arbitrary software but rather your son running up massive App Store charges on IAP of exploitative games and other scams. And if you think Apple will refund you, think again. Locking the device to the crApp Store isn't the solution. To the contrary, the solution is to enable parental controls to prevent access to the crApp Store.
> But me being able to reasonably trust that someone hasn't put a compromised version of the OS on the device, or, won't be able to put a different firmware on the device to brute force my encrypted contents is a bit of safety in a certain set of circumstances that I want in my device
These are possible without vendor lockdown. Devices can be and are designed so that the consumer can lock the device down and prevent modification, etc. Of course you can't constrain yourself, if you have the credentials to unlock the device, but you can constrain everyone else, whether they're children on the one hand or thieves/attackers on the other.
This is a very popular HN opinion; but not a very popular real world opinion.
The average customer wants a device that works consistently, every day, that is easy to use, with a collection of 3rd party apps who won’t steal their life savings.
Windows failed to deliver this; the average customer never downloads an .exe from a new publisher without terror. The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
We can also see this in the console market. Windows exists; old gaming PCs exist; the locked down console market will be with us forever because even Windows can’t deliver a simple experience that reliably works.
The average customer wants a car that doesn't explode because you installed a sketchy spark plug. Does that mean the manufacturers should install locks on the hood of every new car, with the threat of jail time if you pick the lock and look underneath?
A sketchy spark plug does not have the ability to make a car explode, so the analogy is pointless.
On that note, even if someone stole your car, at least your car does not have access to your bank account, your passwords, your messages, and even your sexual history. The personal and reputational cost of losing a car is not comparable.
Many people would actually probably prefer their car to be stolen than the contents of their phone be public.
I think a more accurate comparison would be to an electrician. In Australia, doing your own electrical work is a crime even for the homeowner, because it can cause physical death and is too likely to be done wrong. Yes, you will possibly go to jail for replacing a $2 light switch. I assure you that most people’s phones contain things they would prefer physical death over having publicly distributed.
> On that note, even if someone stole your car, at least your car does not have access to your bank account, your passwords, your messages, and even your sexual history. The personal and reputational cost of losing a car is not comparable.
You're conflating vendor lockdown with device encryption. The latter does not require the former.
And consumers can have that. That doesn't mean I should be unable to unlock my phone and do whatever I want with it.
The problem is not the ability to unlock your phone.
The problem is that 90% of people unlocking their phones will either be for piracy (against the company’s interests), or against the customer's own interests (stalkerware, data extraction, sale of stolen devices).
There is a reason malware is over 50 times as prevalent on Android.
Having worked on catching Android malware, I can assure you that Android malware does not proliferate because people can unlock their phones.
Given that the vast majority of Android devices aren't rooted, bootloader unlocked, or even installing apps from outside the store(s) that they ship with, what exactly do you think is the reason for more malware on Android? (Taking the claim at face value)
Why do I give a shit about the company? I bought the phone, it's mine, I should be able to unlock it. If I catch malware, I'm an adult and I'll live with my choices.
> There is a reason malware is over 50 times as prevalent on Android.
What's the reason for that bogus-sounding statistic?
> with a collection of 3rd party apps who won’t steal their life savings.
This is blatant unempirical scare mongering. How many desktop computer users have had their life savings stolen by 3rd party apps? Citation needed.
> The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
This is a false dichotomy. Almost all desktop computer users have a smartphone too. The people who have enough disposable income buy both smartphones and desktop computers. There's no inherent conflict between the two.
> the locked down console market will be with us forever because even Windows can’t deliver a simple experience that reliably works.
That's a completely ahistorical interpretation. Originally, the gaming consoles had no third-party games: the games were all written by the vendors. The first third-party game development company was Activision, a group of former Atari programmers who learned that their games were responsible for most of Atari's revenue, but Atari refused to give them a cut, so they left and formed their own company. There was a lawsuit, and it was ultimately settled, allowing Atari to get a cut of Activision's sales while allowing Activision to otherwise continue developing console games. It had nothing to do with "reliability" or "security" or any kind of made-up excuse like that.
> This is blatant unempirical scare mongering. How many desktop computer users have had their life savings stolen by 3rd party apps?
You’re kidding, right? You seem to have completely forgotten, or put on the drunk glasses about, what living in the 2000s was like. Also, I don’t care that you made it; don’t let your survivorship bias hit you on the way out.
> Originally, the gaming consoles had no third-party games: the games were all written by the vendors
What a stereotypical HN comment. Cite something that only applied to the 2nd generation of consoles to prove me wrong, even though my point spans almost all console generations.
> You’re kidding, right? You seem to have completely forgotten, or put the drunk glasses, on what living in the 2000s was like.
Again, citation needed. I made it through the 2000s just fine, thank you.
> What a stereotypical HN comment. Cite something that only applied to the 2nd generation of consoles to prove me wrong, even though my point spans almost all console generations.
No, I was explaining the historical origin of the game console business model. Of course the business model continued, as these things usually do, through a combination of monetary incentives and inertia.
Of course. As we all know here, any business that gets started will go on forever regardless of market fit.
This is a silly criticism. After all, as we all know here (right?), Atari itself fell on hard times. I was talking about the business model, not a specific business. Vendor lockdown and taking a cut of 3rd party software is clearly quite lucrative for vendors, and that's why they do it. There's of course no guarantee of success, but it's obvious why other vendors have emulated that business model.
It may be only for historical reasons that desktop computers aren't completely locked down too. It's a lot easier to lock down a new device class, like smartphones, than it is to lock down an existing open device class, without causing consumer outrage and rebellion.
that's a, frankly, stupid argument. the conclusion doesn't follow from the premise.
then don't root your phone or download an .exe. having the ability to do something doesn't mean you are forced to do it.
not safe enough for you? fine! make the current comfortable walled-garden-of-illusory-fake-safety status quo the default. for example, there's no reason windows needs to allow unsigned code to run by default. hell, even make it really annoying to turn off.
but the "safety" and "easy to use" arguments against right-to-repair, digital rights, ownership, etc. is simply nonsense. there is literally ZERO negative safety or usability impact to anyone else's device because i'd like to own mine.
it's also an insulting and disingenuous argument to hear anyone on this forum make: our careers and an entire segment of the economy would not exist if it were not for open systems. and it's insulting to basically say "bubba/granny is too dumb to be trusted with owning their own device".
I detest Google, but I do think they made the right call with Android devices and Chromebooks. You can unlock either as long as you are willing to totally wipe the device first and start over as a new device under a new security context.
This removes the risk of unlocking being abused to compromise the data on stolen devices, or to mount evil-maid attacks, unless a user who knows what they are doing has explicitly opted into that risk.
I contacted Google through the BBB. I made the statement that the inability to install and configure a kernel-level firewall, edit the HOSTS file, and remove unwanted bloatware reduces the security of the product. Google agreed their actions do this and said they find the lack of security acceptable. Having a firewall like Little Snitch should be acceptable, to know where the phone is communicating, with whom, and how to prevent it.
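To illustrate, blocking a tracker via the HOSTS file is only a few lines; these are placeholder domains, not a real blocklist:

    # Hypothetical HOSTS entries: send unwanted tracking/C2 hosts to a local sinkhole
    0.0.0.0  tracker.example.com
    0.0.0.0  analytics.example.net
    0.0.0.0  c2.example.org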
Re-imaging with a rooted image is not acceptable, because this also reduces the device's security by preventing OTA updates!
The gated community is broken when the end user cannot improve the security of the device above and beyond the lax policies of Google and Apple. For instance, there should be no reason my device ever communicates with organizations I do not support, such as Facebook or X-Twitter. X-Twitter is often used as a command-and-control service in plain sight.
It is not just outward communication that needs monitoring but inward too. I've used ZoneAlarm in the past at an international company to help find the infected servers and computers that were serving up viruses and other malware.
I would argue that the "gated community" analogy is flawed. A real-world gated community still allows the homeowner to improve security by installing cameras, a security system, and guards. Apple & Google prevent such actions.
This, or even sell "dev units" with the bootloader unlocked so that you explicitly have to accept the risk before purchasing the device.
The problem though is that rooting by itself is not that useful when a lot of apps use remote attestation to deny you service if you're rooted.
We don't just need root access, we need undetectable root access.
> We don't just need root access, we need undetectable root access.
At some point the argument morphs from 'I should be able to do whatever I want with my device' to 'I should be able to access your service/device with whatever I want'.
The fact that Google allows this shows that
1. Apple could do it with zero security impact on anyone who doesn't opt in
2. They could keep any service-based profit source intact
But they still would never do it. Because it's not only service based profit they want to protect. They want to restrict customers from running competitor's software on their hardware, to ensure they get their cut.
> At some point the argument morphs from 'I should be able to do whatever I want with my device' to 'I should be able to access your service/device with whatever I want'.
I'm not demanding to be able to log in to your service/device and replace IIS with Apache on it. I'm just demanding to be able to access it as a normal user with Firefox instead of Chrome.
I'm not saying you shouldn't be able to access from unlocked devices. I'm just saying it's a different argument.
Agreed, that's a good solution. I can root my phone immediately when I buy it, or I can leave it locked if that's my choice. That's the best of both worlds.
I would argue that the best of both worlds is being able to add your own keys and then relock the bootloader, which Pixel devices also support. :) Not sure about Chromebooks; I kinda think you could maybe reflash the firmware and then put back the write-protect screw?
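From memory, the Pixel flow is roughly this (a sketch only; it assumes you've already generated a signing key and built an OS image signed with it, and exact file names vary):

    fastboot flashing unlock                    # wipes the device and opens the bootloader
    fastboot flash avb_custom_key avb_pkmd.bin  # enroll your own verified-boot public key
    # ...flash the OS build signed with that key...
    fastboot flashing lock                      # relock; verified boot now chains to your key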
Fully agreed. I was thinking of something similar, only I was calling it "Right to execute", similar to the "Right to repair". I'm buying a general computing device; it's ridiculous that I'm artificially limited in how I can use it, purely to make shareholders rich.
Ideally I'd add a mandatory toolchain to that. At least a C compiler which should be able to target a device I own.
If locking the bootloader and comparing signatures against keys burned into a secure enclave allow Apple to make certain security guarantees that helps them sell products, I'm all for their freedom to do so.
Why doesn't OP merely champion competition, instead of encouraging regulation of what software others can write, what hardware others can ship?
I too am afraid of general purpose computing going by the wayside, and I have the Precursor phone and Raptor Talos PowerPC machines on my wishlist, just as soon as I wrap my head around secure boot chains in general before having to implement one myself. But niche hardware is expensive to produce, so we're likely left with what AMD, Intel and Apple provides us.
I guess one quirk that IMO is fair to criticize is that it's not necessarily consumers who are demanding to be locked out of their administrator privileges (the average computer user is of course not aware of the distinction of signed vs unsigned binaries), so I don't know where the pressure for secure enclaves really comes from. Is it the data centers buying thousands of chips that don't want to be pwned? government customers who refuse to buy a single die if they can't verify the bootloader? Or just patriotic engineers sensitive to a cybersecurity regime that demands we keep our guard up against enemies, foreign and domestic?
> so I don't know where the pressure for secure enclaves really comes from.
The pressure comes from shareholders. User control means users can use their device in a way that benefits them, e.g. blocking invasive tracking. This benefit provides zero or negative shareholder value.
> Why doesn't OP merely champion competition, instead of encouraging regulation of what software others can write, what hardware others can ship?
This is false dichotomy. Why not both?
We need individual consumer rights, and we also need healthy market competition.
I used to think this way but then I saw how non-techy people use their devices.
Something like this would inevitably be abused and result in wave of malware so massive that it would render the internet too hostile for all but the most careful, knowledgable and paranoid users.
"everyone is too stupid to be trusted with general purpose computers" is a pretty grim position to hold
You left out the vital clause. "... if they have total unfettered control". Also, not everyone. Obviously.
It's a position I came to rather regretfully and sadly.
A grim position that is completely accurate for over 90% of people.
As the author mentions, desktop computers have always been (mostly) unlocked. This comment is just over the top scaremongering.
Even with access controls, people do things like download chrome from random web sites, then do their banking with the result. If the fake-chrome requested admin access then you'd never be able to trust anything on that computer ever again. Even re-installing the OS wouldn't fix it.
It would no longer be your computer.
So because it would no longer be our computer, we should buy one that's not ours from the start?
I guess, pretty much. For the vast majority of computer users, all we can do is buy it from someone we mostly trust (to be competent and trustworthy).
Pretty sad state of affairs, huh?
No, I don't think that's true. We got used to insecure systems, and then accepted Big Brother as a security model. We can have secure devices that aren't owned by a corporation, but judging by the comments section here, nobody knows that.
How might that work? You personally have the keys to the TPM? Then some confidence trickster will tell a naive user that to make big$buck$ on the internet you'll need to hand over your TPM key. And people will.
Why would you need keys to a TPM? It's meant to store keys without ever getting them out.
> If the fake-chrome requested admin access
If the user is doing banking on fake-chrome then admin is pointless; https://xkcd.com/1200/
> then you'd never be able trust anything on that computer ever again. Even re-installing the OS wouldn't fix it.
Why not? If the hardware is under user control, they just reimage the firmware, reimage the OS, and then it's clean. (Or, in practice, perhaps they hire the local computer shop to do so, but I don't think that changes anything.)
Yep, and if you connected a Windows 95 machine to the internet it would be compromised within 15 minutes.
Stupidest take I've seen today.
There are already myriad unlockable devices.
What a bizarre fantasy you have constructed.
Let me clarify... if you have an unlocked device, then software vendors should be able to ensure that their software is non-functional on such a device.
Given that, then anything very useful would be rendered non-functional, resulting in the device probably being useless.
Why should a software vendor be allowed to say what I can and can't run on my machine?
not sure if you're in the US, but we can't even get net neutrality. Unfortunately the likelihood of this is a hell-freezes-over situation.
I would start with: laws should be logical and informed, and go from there... the number of prerequisite changes required to come even mildly close to this is unreal, including but not limited to: copyright law, insurance law, patents, contract law, federal vs. state law, an agency competent enough to enforce this, lobbying from the most powerful companies in the world, and more.
In dream land I support you though.
California is typically pro consumer, tenant, employee.
If it could be passed in California it would trickle down elsewhere.
e.g. California emissions mandates leading to less emissions in the entire country because it makes more sense at scale to build 1 SKU.
I agree states can change laws more quickly, but that's a far cry from what's proposed here. Can you imagine enforcement of such a law? Can you imagine a state employee checking for root access? You break at least five California laws an hour there just walking around, while the bureaucratic machine consumes any reasonable enforcement. This would be considered absolutely niche, and no politician would likely care because it won't get them reelected. Call me cynical, I guess, but our track record of change for the better is pretty poor in this arena.
In my dream we abolish copyright altogether, but alas, we live in a profit-oriented society, not a knowledge-oriented one.
> I believe consumers, as a right, should be able to install software of their choosing to any computing device that is owned outright.
Manufacturers will then claim that people don't own devices, merely a perpetual license to use it.
> Manufacturers will then claim that people don't own devices, merely a perpetual license to use it.
It would be refreshing for them to be honest about their monetary greed instead of telling false stories about "security" and what's "good for users".
That's a slippery slope, because they list devices as sold, not as rented. So they can't claim that. Some still try, especially using copyright on software on the devices as leverage.
... plus the license to smash those devices with a hammer :)
Contrarian take: you bought the device, that you knew already did not provide that, from a company who has priced in not having to support rooted devices, and who had priced in your future revenue from extras. The company can't complain if you find a way to root it (and they don't), but they're under no obligation to add in this extra feature you're asking for. If you want a mostly-open handheld device, they're for sale, you should buy one of those.
Every time this issue comes up, an army of people who've never unlocked a phone comes out of the woodwork to talk about their theoretical fears of malware and piracy and rain falling from the sky if you dare to own your device.
If you've never unlocked a phone, please educate yourself on how the process works before opining. It's really not as terrifying as you imagine.
Author seems to want 'hardware-level locks' regulated, i.e., allow the government to get into your encrypted devices.
That is absolutely not what the author is saying here, just that users should have the ability to install their own software on their own hardware, and that locked bootloaders and the like should not be allowed.
Yes, that is exactly what the author is saying, wanting government regulation. For one, the government won't simply say "keep the device open for the end user" (or a "smart" government wouldn't), but even if they did, you've now opened that device to any attacker, law enforcement included.
So because I can sudo on my computer, my computer is open to any attacker? What's wrong with this comments section? Has anyone here used a computer before?
I want to install my own software on my smartphone. However, I don't want others with physical access to be able to do this... and here we hit a problem, because if the device allows extracting the data, brute force becomes feasible.
Also, I don't want others to be able to use my phone after stealing it. Here the FRP (Factory Reset Protection) lock helps me, but in order for it to work it must also limit how I can use the phone.
I wish we stopped falling for this technical trap and finally focus on the substance - hardware/software companies get away with anti-user features that run on the user's device. This shouldn't be allowed.
I don't need an unlocked bootloader, I just don't want the preinstalled Google spyware. Google should not be allowed to hold my device hostage like this.
> Root access refers to the highest level of privileges a user can be granted to a computer system.
This is no longer true. You might have root access on your smartphone, but you still don't have access to the TEE (Trusted Execution Environment), which on ARM is implemented using the "TrustZone" feature.
Also, AVF (the Android Virtualization Framework) is coming to Android, and protected VMs won't work with an unlocked bootloader, so expect the situation to deteriorate further once manufacturers make use of pVMs.
Yeah, I mean there's not even any way to really know what you're communicating with, short of taking the thing apart and following the traces with an oscilloscope. If it were worth obfuscating to a determined company, it would be hell to figure out.
Your "root" could just be root inside a sandbox, mapped to an unprivileged guest user in a higher namespace.
The TEE is where both the device encryption keys (not DRM) and the Widevine keys (DRM) are stored.
So basically a TDM?
There's not nearly enough awareness of this. Even with root access, on modern Android you have to set up a virtual USB connection just to get at the files in the data folders of android apps if you want to, for example, sync the savegames between your mobile emulators and your desktop emulators. It's fucking disgusting. With every new edition they shave off a little more user agency.
What do you mean "a virtual USB connection"? With root access I can see all the files on my Android phone.
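For example, with a rooted phone and ADB from a desktop, something like the following works (the package path is just an illustration):

    adb shell su -c 'tar -cf /sdcard/saves.tar /data/data/com.example.emulator/files'
    adb pull /sdcard/saves.tar ./saves.tar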
It's sad how Google has perverted what Android was meant to be.
For some, the absolute locked down-ness is a selling point. Why should those who want to buy something that can't be messed with not be able to?
If you don't want to buy something you can't install whatever you want onto, don't buy it. 100% the ability or inability to modify the firmware of a device should be disclosed, but if it's disclosed the seller should be able to set the policy to whatever they want
This is an extremely weak argument, and I'd like to stop seeing it perpetuated. If you don't want an unlocked bootloader, just don't unlock your bootloader. Why should we remove the ability to unlock the bootloader entirely just because some people don't want to use it?
Because the fact that it can't be unlocked makes me reasonably reassured that I can trust the software running on it comes from the vendor of the device
It's the same reason I don't want "the good guys" to have decryption keys to my messaging service, because even if I did trust the FBI, the fact that there is a backdoor at all means it could be exploited by someone I don't trust
Again, if you don't want to use a device that has a locked bootloader, don't buy it. I fail to see how this business model should be legally foreclosed upon. You'll always have the option to buy a device that can be unlocked, someone will always sell such a device. But if you can't lock them, then I can't buy one even if I want to
Phones with unlockable bootloaders aren't going to be sold for much longer just like dumb TVs aren't sold anymore. There's just too much profit to be earned by corporations locking devices, plus banks and governments want to lock down phones. And once they lock down phones they'll go for desktops as well.
Dumb TVs are still sold, they just cost more. Same will probably be true for the low volume, no-stolen-data (or no-apple-tax) unlockable phones, too
Maybe in the US, but not in my country. I tried looking for "signage displays" but all I could find was Samsung professional monitors that still had the smart stuff
Yeah, this is just a fundamental misunderstanding of how bootloader unlocking works. The people repeating this argument seem to think that their bootloader will unlock if they look at their phone wrong, when in reality the bootloader unlock process can be made such that the user must consent. If some malware can bypass that, then it could bypass your bootloader in the first place.
It's not just about malware you might accidentally download, it's also about adversaries that may have physical access to your device and can provide that consent
No matter how convoluted you make the Rube Goldberg machine to bypass the cryptography, if there's a way to bypass it, it will be bypassed.
There are ways to do it so that 'bypass' means you effectively wipe the device. If that's not good enough, how do you protect against them just replacing your device with a compromised one that looks similar?
So because we can't eliminate every possibility, we should just give up on all protections?
I think you are misunderstanding my point. You aren't giving anything up by enabling unlocking if the act of unlocking wipes the device.
Please detail this attack vector where someone can compromise a phone with an unlockable bootloader but not one you can't unlock.
That's not what I claimed?
> it's also about adversaries that may have physical access to your device and can provide that consent. No matter how convoluted you make the rube goldberg machine to bypass the cryptography, if there's a way to bypass it it will be bypassed
You claimed that an adversary with physical access to your device can compromise your unlockable phone, but presumably this won't happen with a phone that can't be unlocked. Is that not what you claim? If so, please detail how.
I was talking about a device with an unlockable bootloader, not one that cannot be unlocked
Wanting an uncompromisable bootloader is about more than just protection against malware that might modify the software on the device; it's about protecting a phone that can be unlocked from having its software modified by someone with the ability to provide the consent that the end user would normally give, for example when I hand my phone over at customs, or if it's seized by the police. If my bootloader is not unlockable, I haven't provided them with the keys to unlock the software, and those keys are reasonably strong, then I can be reasonably confident they haven't compromised my device.
But, if they can unlock the bootloader for whatever reason, I have no idea now what is running on the device or what was run on it even if they restore it back to a locked condition
Every device I've ever unlocked warns you on boot that it's unlocked. So if that's your threat model, just reboot the phone after the maid hands it back to you and see if you get a scary warning.
> If you don't want an unlocked bootloader, just don't unlock your bootloader.
That kind of logic cuts both ways: "If you don't want a device with a locked boot loader, just don't buy a device with a locked bootloader".
Unfortunately, as consumers, we're trapped between a rock and a hard place. On the one hand, I would want 100% freedom to use my device exactly as I see fit and run any software I want, without any form of curation from the manufacturer. On the other hand, there are plenty of software companies who do shitty things when given absolute freedom over what to do in a user's device (tracking / spying / etc) and I welcome buying a device where the manufacturer helps me fight some of that.
So I can absolutely see both arguments. And I think both types can coexist. I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there". I would not be happy with those restrictions on my desktop.
> I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there".
I think the inverse is a much more credible threat, though. "Sorry, you can't sign in to your bank because you are using Linux. Please try again on Windows 11 with Secure Boot turned on" doesn't seem far-fetched at all.
I would also be happy with those restrictions on a traditional PC-class computing device (laptop or desktop). Would I personally buy one? Probably not, but I'd feel a whole hell of a lot better if my non-techie wife or mother or brother were using one and they were no more susceptible to some kind of exploit on their PC device than they were on their phone
That's the whole thing--there should be choice
I could see Microsoft saying "we're only allowing apps installed through our store, for safety/security reasons," unless you opt out (gated by some scary warning that doing so is unsafe).
Even if they never charged a fee for running the store, I bet this would raise a lot of eyebrows.
Microsoft has been going in the opposite direction. Nowadays you can post any Win32 app to the store; they are loosening, not tightening.
I haven't used it in ages, but last I looked their store was entirely pointless. I wouldn't be surprised if they just got rid of it altogether.
My point was more about how people would react if MS did such a thing (i.e. installs to come from the store by default).
> I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there". I would not be happy with those restrictions on my desktop.
You fix that by making root access inconvenient enough that companies can't rely on the average random user having it enabled.
For example force you to wipe the device to unlock it as another person said in another comment. Or make it so that if you don't unlock it within 7 days of the device purchase and first boot, you cannot unlock it anymore.
Those restrictions aren't on your desktop, where you do have root. Why would they be on your phone if you had root on that?
> And I think both types can coexist.
E.g., macOS.