• metadat 2 days ago

    How is a fine of $100m USD calculated when there were no actual reported damages?

    I'm not sympathetic to Meta making security mistakes; I'm more curious how the punishment was decided in the absence of any actual harm.

    I wonder if it was a poorly thought out request log line or what.

    • rsynnott 2 days ago

      It’s a GDPR fine, so the max would be 4% of global revenue, or a little over 5bn for Facebook. So this is on the minor end of things, really, scaled for Facebook; Facebook has had much bigger GDPR fines.
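
      (Rough math, assuming Meta's 2023 revenue of roughly $135bn: 0.04 × $135bn ≈ $5.4bn.)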

      If there were damages, that’d be dealt with separately by the courts (EU countries don’t do punitive damages, mostly; this sort of regulator action is used for punishment instead).

    • Rygian 2 days ago

      I really agree with the backdrop painted by this article:

      - Meta discovered the issue internally.

      - Meta fixed the issue without delay.

      - Meta took steps to show "absence of evidence" of abuse. (Does not mean "evidence of absence" though.)

      - The Reuters article says "Issue was disclosed voluntarily to the regulator." but the actual source [1] announces a breach of GDPR Article 33(1), for failing to notify.

      - Meta was still fined 91 M€ for failing to build "data protection by design and by default" (my understanding of the fine, Articles 5 and 32 of the GDPR).

      This is a positive step for security: companies being fined for being sloppy about security, even if they dutifully clean up after they mess up.

      [1] https://www.dataprotection.ie/en/news-media/press-releases/D...

      • kaoD 2 days ago

        I disagree. This only encourages companies not to disclose such issues.

        • organsnyder 2 days ago

          The penalties would undoubtedly have been much higher if they hadn't disclosed it. Of course, perhaps it never would have been discovered, but whistleblower protections/incentives and high enough penalties for covering up issues can strike the right balance.

          It's similar to just about any other violation, really: if I injure someone accidentally—even through negligence—I'm going to get a much more lenient punishment if I don't try to cover it up or run away from it.

          • donatj 2 days ago

            Who in their right mind is going to blow the whistle and risk their entire career over a security flaw that was detected internally, found to be unexploited, and fixed in a timely fashion?

            The fact that such a case even has reporting requirements at all seems nuts to me.

            • orf 2 days ago

              Good - then the person who sees that they didn’t report it can whistleblow on that and get a nice paycheck.

              See how that works out for the person who didn’t report it.

              • xvector 2 days ago

                I am shocked to see the "let's make writing vulnerable code illegal" take be so popular on HN. If you have written any meaningful amount of code, you have written vulnerable code.

                • hifromwork 2 days ago

                  Writing vulnerable code is not illegal, negligence is.

                  • xvector 2 days ago

                    Every case of this I've seen in my career has not been "negligence" but developers not realizing there's some obscure logging middleware or something.
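
                    To illustrate (a hypothetical Python/WSGI sketch, not anything from a real codebase; the middleware and logger names are made up): an innocent-looking debug helper can end up persisting login credentials without the auth team ever knowing it exists.

                        import io
                        import logging

                        logging.basicConfig(level=logging.DEBUG)
                        log = logging.getLogger("http.debug")

                        class RequestLoggingMiddleware:
                            """Innocent-looking debug helper: logs every request body."""

                            def __init__(self, app):
                                self.app = app

                            def __call__(self, environ, start_response):
                                body = environ["wsgi.input"].read()
                                # A /login body like b'{"user": "alice", "password": "hunter2"}'
                                # now sits in plaintext wherever these logs are collected.
                                log.debug("%s %s body=%r", environ["REQUEST_METHOD"],
                                          environ["PATH_INFO"], body)
                                environ["wsgi.input"] = io.BytesIO(body)  # replay the body for the real app
                                return self.app(environ, start_response)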

                    • pwnna 2 days ago

                      Devil's advocate: "my bridge fell down because I didn't know the concrete didn't meet spec" still seems like negligence, doesn't it?

                  • orf 2 days ago

                    Cool story I guess, but that’s not related to anything I said.

                • undefined 2 days ago
                  [deleted]
                  • s_dev 2 days ago

                    An employee who left for another job or simply retired and who feels this was wrong. Plenty of lads in Meta earn enough to buy a house and build up investments, so there is little leverage over them to ruin their careers.

                  • xvector 2 days ago

                    SWE in security here. Why the heck would I "whistleblow" in a scenario where a vulnerability was internally found, unused, reported to legal, and fixed? That is part of any healthy SDLC.

                    The EU is implying that it is illegal to accidentally write vulnerable code. Pure insanity, nearly every software company would go out of business overnight if this was a stance they actually enforced.

                    • ahoka 2 days ago

                      “nearly every software company would go out of business overnight if this was a stance they actually enforced”

                      For the better, if your attitude is the “healthy SDLC”.

                      • xvector 2 days ago

                        I'm sure we've literally never written a vulnerable line of code in our lives, right?

                        Security reviews are part of a healthy SDLC. You catch vulnerabilities during security reviews; those reviews would be totally unnecessary if people simply wrote perfect code to begin with.

                      • Buttons840 2 days ago

                        Ideally because the law requires reporting vulnerabilities, and includes criminal penalties for those who knowingly hide vulnerabilities.

                        • xvector 2 days ago

                          When I worked in tech, we reported the vulnerabilities internally and passed them off to legal. Taking that to the government was legal's job.

                          I am not gonna go out of my way to "whistleblow on vulnerabilities to the EU" after I have done my job and reported everything to legal.

                    • randoomed 2 days ago

                      There needs to be some kind of punishment for failing to take basic security practice into account. A system where a simple disclosure is enough will probably result in companies ignoring security; then, when there is a problem, they disclose and carry on without change.

                      But it is also important for fines to be reduced when a company takes the right steps to improve. Balancing this will probably be quite difficult.

                      • Buttons840 2 days ago

                        Make a law protecting whistleblowers. Include criminal penalties for the worst cases.

                        What executive is going to brush something under the rug when they know their employees can blow the whistle and, if they do, the executive will go to jail?

                        • xvector 2 days ago

                          Executives can already go to jail for not reporting vulnerabilities. The risk of personal criminal liability is one of the reasons people choose not to move into Director roles in big tech security careers.

                          • donalhunt 2 days ago

                            They mostly just avoid the countries where they could get arrested in my experience.

                            • agsnu a day ago

                              Like the USA?

                      • thirdtruck 2 days ago

                        This is why we need regular government inspections that will make such disclosures inevitable. It's kind of insane that we don't already have an EPA (Environmental Protection Agency) of citizen data management.

                        • OptionX 2 days ago

                          Only true if the punishment for both is the same.

                          If they come forward they should be punished more lightly; if they aren't punished at all, it only encourages a "we'll just apologize later" sort of thinking.

                          • im3w1l 2 days ago

                            There is also no real benefit to storing passwords in plaintext so I don't think your fear is realistic at all. If you are going to fine this at all, then 10k-100k would be an appropriate amount.

                            • Timon3 2 days ago

                              Why 10k-100k, how did you arrive at that amount?

                              Meta put their customers at risk through negligent actions. A fine in the range you propose would be lower than any investment required to improve security (e.g. by hiring a single additional person). What company in their right mind would do anything to improve security in that case?

                              • im3w1l 2 days ago

                                That's close to how I arrived at the number - I used programmer wages. Reading about best practices around password storage and implementing them is fairly quick and easy, so a fine of that size will still be sufficient incentive.
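
                                For reference, the "quick and easy" version looks roughly like this (a minimal sketch assuming Python and the bcrypt library; the function names are just for illustration):

                                    import bcrypt

                                    def hash_password(password: str) -> bytes:
                                        # bcrypt generates a per-password salt and embeds it in the
                                        # resulting hash, so only the hash needs to be stored.
                                        return bcrypt.hashpw(password.encode("utf-8"), bcrypt.gensalt())

                                    def verify_password(password: str, stored_hash: bytes) -> bool:
                                        # Compares against the stored hash; the plaintext never has to
                                        # be written to the database or anywhere else.
                                        return bcrypt.checkpw(password.encode("utf-8"), stored_hash)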

                                • Timon3 2 days ago

                                  But why would that be an appropriate amount? By setting the fine around the lowest possible investment needed to tackle the issue, you're literally incentivizing companies not to take security seriously. After all, you can save money early in development and just fix it whenever you have time - you'll still come out ahead compared to doing things right immediately.

                                  So again, why would this make sense?

                              • rsynnott 2 days ago

                                That seems like pretty much a license to have practices as poor as you want. For a company of that size, a fine that size would just not be material at all.

                              • rsynnott 2 days ago

                                They would have been punished more lightly if they’d come forward within 72 hours; as it is it appears to have taken two months.

                              • Rygian 2 days ago

                                That's where certification (Articles 42 and 43) will come in useful.

                                • undefined 2 days ago
                                  [deleted]
                                • s_dev 2 days ago

                                  >for inadvertently storing some users' passwords without protection or encryption.

                                  The egregious nature of the issue seems to undermine any measures they may have taken to retroactively fix it.

                                  A first-year CS student could tell you this is a fundamentally bad idea. For a FANG company it is inexcusable. The fine is justified.

                                  A company of that size has to disclose it, purely because if an employee left and blew the whistle, the fine would have been much larger. They cut their losses and accept responsibility.

                                  • Buttons840 2 days ago

                                    I complained at a US-based company I worked for after discovering plain text passwords; nobody seemed to care, including the other programmers. I complained louder and we half-fixed it by removing the plain text passwords from the test database that every person in the company had access to, but the plain text passwords were still used in production. There were millions of them, all US customers; if you're someone who eats fast food in the US, there's a chance your password was in there.

                                    Everyone was really busy working on the new layout our UI designer had come up with, so nobody gave a shit about the plain text passwords.

                                    I can only guess they're still doing this, but I don't know for sure because I was fired a little while later for being a poor culture fit. They don't do business in the EU.

                                    • spongebobstoes 2 days ago

                                      It was probably accidentally logged somewhere in a rare circumstance, not by design in the actual password database. These companies are not quite that incompetent.
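
                                      A common mitigation (a hypothetical sketch using Python's standard logging; the field names are illustrative, not anything Meta-specific) is to redact known-sensitive keys before anything reaches a log sink, so even a "log everything" debug path can't persist credentials:

                                          import logging

                                          logging.basicConfig(level=logging.DEBUG)
                                          log = logging.getLogger("http.debug")

                                          SENSITIVE_KEYS = {"password", "passwd", "token", "secret", "authorization"}

                                          def redact(payload: dict) -> dict:
                                              # Replace values of known-sensitive keys before logging.
                                              return {k: ("[REDACTED]" if k.lower() in SENSITIVE_KEYS else v)
                                                      for k, v in payload.items()}

                                          log.debug("login payload=%s", redact({"user": "alice", "password": "hunter2"}))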

                                  • lmkg 2 days ago

                                    > - The Reuters article says "Issue was disclosed voluntarily to the regulator." but the actual source [1] announces a breach of GDPR Article 33(1), for failing to notify.

                                    GDPR mandates the regulator be informed within 72 hours of the breach being discovered.

                                    The official link you provided confirms that Meta informed the regulator voluntarily, in March of 2019. That page also includes a link to Facebook's press release, which says they discovered the issue in January. That's a time lag of around two months, which is around two months longer than the law permits. So yes, failure to follow the law mandating notification.

                                    All things considered, this is a small fine: an order of magnitude smaller than their largest GDPR fines.

                                    • alex1138 2 days ago

                                      My general problem with all of this is Zuckerberg's reputation and how it fits the character of FB (sigh, okay, fine, "Meta"): the "if you need any info on people just ask, they trust me, dumb fucks" quote, the password storage mess (https://news.ycombinator.com/item?id=19453359), and buying up the competition and then lying about it (violating the stipulation from WhatsApp not to share data with Facebook).