I think the argument is interesting, but the specific example of Prop 65 doesn't really work, on a few levels. The argument in the post is that Prop 65's warnings are legitimate in some sense, but only apply in specific contexts.
However, Prop 65 is much broader than that. To qualify, a chemical just needs to show up on one of maybe half a dozen lists that show some association with cancer, but all these lists show is that in some study, at some quantity, the association existed. The amount linked to cancer could be far beyond what is ever present in a consumer good, and the link may only have been shown in non-humans.
The lists aren't the ones government agencies like the FDA use to regulate product safety; they're lists far upstream of that, which research institutions use to inform further study. The typical starting point is a mouse study with a huge dosage. It's not a useless study, but it's not meant to inform what a human should or should not consume; it's just the start of an investigation.
I don't think this actually has any bearing on the substance of the broader argument, but Prop 65 is not the best example.
I don't know if I would even call this clickbait, but this is not an argument against transparency. It's an argument against poor regulations. I would argue Prop 65 is the opposite of transparency: the warning appears on just about everything, so people have learned to ignore it. It was a law passed at a time when we didn't have as much information as we do now, and it should be updated and made more specific.
> You know what would be better than a privacy policy? A privacy law.
I agree, but I wouldn't call privacy policies transparent. They are made of vague legalese like "we may or may not share your information with advertisers and partners." There are good arguments in here, but they are framed against the wrong target.
The framing being used is that what we currently do is "pro-transparency": we make laws to "inform" consumers and then trust that the market will sort the rest out. Cory rejects this as a workable tactic because transparency, even real, full transparency, just becomes noise that people filter out when making decisions. He argues that if you want good outcomes, you need legislation that does more than force transparency.
The flip side of that argument is that one way to get good legislation is to have some level of transparency in place, so that people can form informed opinions about what is good.
I don't think I disagree with the conclusion, but my point is that we don't have real transparency, and a lot of these transparency laws actually obscure information in ways that confuse the consumer. So I guess the issue I'm taking here is that the laws he is attacking aren't real transparency.
"Privacy policy: we don't collect or retain any data at all ever period."
You don't keep server logs? Cool and all, but it sounds like you'll have a hard time debugging if something ever goes wonky.
If your server logs contain personal information, you are doing something horribly wrong, and I hope you don't operate in the EU.
Don't log sensitive data. You don't need that for debugging.
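Agreed, and scrubbing is cheap to do at the logging layer. Here's a minimal sketch in Python using the standard logging module; the `REDACTIONS` patterns and the "app" logger name are made up for illustration, not anyone's actual setup:

```python
import logging
import re

# Hypothetical patterns for data we never want on disk: email
# addresses and bearer tokens. Adjust to what your app handles.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<email>"),
    (re.compile(r"Bearer\s+\S+"), "Bearer <token>"),
]

class RedactingFilter(logging.Filter):
    """Scrub known-sensitive patterns out of every log record."""

    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()  # fully formatted message
        for pattern, replacement in REDACTIONS:
            msg = pattern.sub(replacement, msg)
        record.msg = msg
        record.args = ()  # already formatted, drop the args
        return True       # keep the record, just scrubbed

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("app")
logger.addFilter(RedactingFilter())

# Logs "User <email> hit /login" instead of the real address.
logger.info("User %s hit /login", "alice@example.com")
```

Doing it as a filter on the logger means it runs before any handler writes anything, so even a stray debug statement gets scrubbed.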
The problem is that such a policy probably translates to the following: "Privacy policy: we're just gonna lie about it because our lawyers don't think there are consequences."
No mention of the GDPR.