• CraigJPerry an hour ago

    Should this be facilitated? Should this work really be done? I’m thinking not.

    If I have a special case and I need to do this today, I’m not blocked from doing so (I’ll vendor one of the dependencies and change the name). It's certainly a pain, especially if I have to go change imports in a C module as part of the dep, but it's achievable and not a blocker for a source-available dependency.

    However, if this becomes easily possible, well why shouldn’t I use it?

    The net result is MORE complexity in Python packaging, and more overhead for infra tools to accommodate.

    • baq 14 minutes ago

      You’ve never found yourself in a dependency resolution situation where there is no solution that satisfies the requirements. You need multiple versions of the same package in such cases.

      The alternative is just cheating: ignore the requirements, install the packages, and hope for the best. Alas, hope is not a process.

    • andrewchambers 3 hours ago

      I feel like most of these problems would just disappear if people followed the naming scheme sqlite3 uses: put the major version in the name, and most tools work out of the box with multiple versions.
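
      Something like this (foolib is a hypothetical package; the major version lives in the distribution and module name, so two majors install side by side without any resolver tricks):

          # pip install foolib2 foolib3   (hypothetical packages; both install at once)
          import foolib2   # old major, kept alive for legacy call sites
          import foolib3   # new major, used by new code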

      • the_mitsuhiko 3 hours ago

        That’s why I called the library “jinja2”. It was a new major version. But people over the years really did not like it and it did not catch on much.

        • rbanffy 31 minutes ago

          Really, people should just update their software to use the newer libraries and fix whatever breaks. If you want to use functions from version 2, you should port the rest of the code to version 2.

          • TZubiri 2 hours ago

            Agreed. The issue was breaking backwards compatibility by releasing a new major version under the same name.

            • smitty1e 2 hours ago

              Once more, the wisdom of "Explicit is better than implicit" shines.

              Instead, we jump through hoops with our hair on fire to manage complexity.

              People.

            • benrutter 2 hours ago

              I think this works ok if your library is something like Django or Pandas that people are building their project around. But it makes things exponentially more complex for libraries like pyarrow or fsspec that are normally subdependencies.

              Imagine trying to do things like import pyarrow14, then, if that failed, pyarrow13, and so on. Additionally, Python doesn't have a good way of saying "I need any one of the following libraries as a dependency".
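
              Something like this, roughly (pyarrow14/pyarrow13 are hypothetical names; pyarrow isn't actually published this way):

                  try:
                      import pyarrow14 as pyarrow          # hypothetical versioned name
                  except ImportError:
                      try:
                          import pyarrow13 as pyarrow      # hypothetical older major
                      except ImportError:
                          import pyarrow                   # plain name as a last resort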

              • andrewchambers an hour ago

                I like the way python handled this situation - python3 for when you care, python for when you don't.

              • bobnamob 2 hours ago

                Of course this comes with the _major_ caveat that the interface and behaviour[1] of the dependency must stay absolutely stable for the entirety of the "major version".

                I'm an advocate for this style of library/dependency development; unfortunately, in my experience the average dependency doesn't have the discipline to pull it off.

                [1] https://www.hyrumslaw.com

                • greener_grass 3 hours ago

                  This is Rich Hickey's suggestion too

                  https://www.youtube.com/watch?v=oyLBGkS5ICk

                • rbanffy 28 minutes ago

                  One of the reasons I made pip-chill was to create an incentive not to bother with version numbers and just make your software always run against the latest versions. If something breaks, fix it. If that's too hard, maybe you depend on too many things. Leftpad feelings here.

                  Having your software depend on two different versions of a library is just asking for more pain.

                  BTW, I still need to fix it to run on 3.12+ in a neat way. For now, it runs, but I don't like it.

                  • orf 3 hours ago

                    This comes up every now and again, and there are two fairly simple examples that I think show the complexities:

                    One:

                    Library A takes a callback function, and catches “request.HttpError” when invoking that callback.

                    The callback throws an exception from a differing version of “request”, which is missing an attribute that the exception handler requires.

                    What happens? How?
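
                    A rough, runnable simulation of the problem (the two HttpError classes below stand in for the two installed copies of "request"; the names are illustrative):

                        # Two distinct classes with the same name simulate the two installed
                        # versions; except matches by class identity, not by name.
                        HttpErrorV1 = type("HttpError", (Exception,), {})
                        HttpErrorV2 = type("HttpError", (Exception,), {})

                        def library_a(callback):
                            try:
                                callback()
                            except HttpErrorV1:            # library A catches *its* copy
                                print("handled inside library A")

                        def callback():
                            raise HttpErrorV2("raised from the other copy")

                        try:
                            library_a(callback)
                        except Exception as exc:
                            print(f"escaped library A's handler: {exc!r}")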

                    Two:

                    Library A has a function that returns a “request.Response” object.

                    Library B has a function that accepts a “request.Response” object, and performs “isinstance”/type equality on the object.

                    Library A and library B have differing and incompatible dependencies on “request”.

                    What version of the request object is sent to library B from library A, and how does “isinstance”/“type” interact with it?
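
                    Again a rough simulation (ResponseV1/ResponseV2 stand in for the two versions' request.Response classes):

                        # isinstance compares class identities, so the check fails even
                        # though both classes are named "Response".
                        ResponseV1 = type("Response", (), {})    # what library A constructs
                        ResponseV2 = type("Response", (), {})    # what library B checks against

                        def library_a_fetch():
                            return ResponseV1()

                        def library_b_consume(resp):
                            if not isinstance(resp, ResponseV2):
                                raise TypeError(f"expected Response, got {type(resp)!r}")

                        library_b_consume(library_a_fetch())     # TypeError despite matching names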

                    Both of these revolve around class identities. In Python they are intrinsically linked to the defining module. Either you break this invariant and have two incompatible/different types share the same identity and introduce all kinds of bugs, or you don't and also introduce all kinds of bugs: "yes, this is a request.Response object, but this method doesn't exist on this request.Response object", or "yes, this is someone's request.Response object, but it's not your request.Response object".

                    Getting different module imports to succeed is more than possible, getting them to work together is another thing entirely.

                    One solution to this is the concept of visibility, which in Python is famously "not really a thing". It's safe to use incompatible versions of a library as long as the types are not visible, i.e. no method returns a request.Response object, so the module is essentially a completely private implementation detail. This is how Rust handles this, I think.

                    However, this is obviously fucked by exceptions, so it seems pretty intractable.

                    • the_mitsuhiko 2 hours ago

                      That is not any different in Python than it is in Rust, Go or JavaScript. Yes: it's a problem if those different dependency versions shine through via public APIs. However there are plenty of cases where the dependency you are using in a specific version is isolated within your own internal API.

                      I think if Python were to go down this path it should be isolated to explicit migration cases for specific libraries that opt themselves into multi-version resolution. I think it would make it much smoother than it is today to move pretty core libraries in the ecosystem in backwards-incompatible ways.

                      • kelnos an hour ago

                        I think the issue is more that in Python you could get confusing runtime failures. In Rust, it will fail to compile if you're trying to mix two different major versions of a dependency like that. I'm fine with the latter, but the former is unacceptable.

                        • dwattttt 11 minutes ago

                          You can have multiple incompatible dependency versions imported by the one crate; you have to give them different names when declaring them, but it works fine (I just tested it).

                          It follows the approach of "objects from one version of the library are not compatible with objects from the other version" mentioned above, and results in a compile-time error (a potentially confusing type error, although the error message might call out that there are multiple library versions involved).

                        • orf 2 hours ago

                          The problem with this is exceptions: they easily allow dependencies to escape, be that via uncaught exceptions or wrapped ones.

                          Go and JavaScript have type systems and idioms far more amenable to this kind of thing (interfaces for Go, no real type identity + reliance on structural typing for JS) and rely a lot less on the kind of reflection common in Python (identity, class, etc).

                          I guess there are some use cases for this, I just feel that the lack of ability to enforce visibility combined with the “rock and a hard place” identity trade-off limits the practical usefulness.

                          • simiones 2 hours ago

                            Exceptions are no different from Go's error types (and in general interface types in any language) from this point of view. If moduleA is doing something like `errors.Is(err, ModuleBError)` on an error that was returned from moduleC, which uses a different version of moduleB, you'll get the same issue.

                            • orf an hour ago

                              That’s interesting - is it common to do this instead of casting to an interface?

                              It seems a lot more impactful with Python due to type equality being core to how exceptions are handled, even if there are similarities.

                              • simiones an hour ago

                                Well, the most common is of course `if err != nil`, which is unaffected. But on the rare occasions that someone is actually handling errors in Go, `errors.Is` and `errors.As` are recommended over plain casts since they correctly handle wrapped errors.

                                Say a function returns `fmt.Errorf("Error while doing intermediate operation: %w", lowerLevelErr)`, where `lowerLevelErr` is `ModuleBError`. Then, if you do `if _, ok := err.(ModuleBError); ok {...}`, this will be false; but if you do `if errors.Is(err, ModuleBError)`, you will get the expected true.

                                Regardless, the core problem would be the same: if your code can handle moduleB v1.5 errors but it's receiving moduleB v.17 errors, then it may not be able to handle them. This same thing happens with error values, Exceptions, and in fact any other case of two different implementations returned under the same interface.

                                You even have this problem with C-style integer error codes: say in version 1.5, whenever you try to open a path that is not recognized, you return the int 404. But in 1.7, you return 404 for a missing file, but 407 if it's a missing dir. Any code that is checking for err > 0 will keep working exactly as well, but code which was checking for code 404 to fix missing dir paths is now broken, even though the types are all exactly the same.

                        • snatchpiesinger 2 hours ago

                          You can have private and public dependencies. Private dependencies are the ones that don't show up in your interface at all: you don't return them, you don't throw or catch them (other than passing through), you don't take callbacks that have them in their signatures, etc. You can use private dependencies for the implementation.

                          It should be safe to use multiple versions of the same library, as long as they are used as private dependencies of unrelated dependencies. It would require some tooling support to do it safely:

                          1. Being able to declare dependencies are "private" or "public".

                          2. Tooling to check that you don't use private dependencies in your interfaces. This requires type annotations to gain some confidence, but even then, exceptions are a problem that is hard to check for (in Python that is).

                          In compiled languages there are additional complications, like exported symbols. It is solvable in some controlled circumstances, but it's best to just not have this problem.
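
                          A rough sketch of the idea, with a hypothetical parser_lib as the private dependency:

                              import parser_lib                            # hypothetical private dependency

                              def load_config(path: str) -> dict:
                                  # Public API exposes only builtin types; parser_lib never leaks.
                                  try:
                                      parsed = parser_lib.parse_file(path)  # hypothetical call
                                  except parser_lib.ParseError as exc:      # hypothetical exception type
                                      # translate to a type we own, so callers never handle parser_lib types
                                      raise ValueError(f"invalid config {path!r}") from exc
                                  return dict(parsed)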

                          • orf 2 hours ago

                            > you don't throw it or catch it

                            Herein lies the issue: in this context exceptions can be thought of as the same as returns. So you actually need to catch/handle all possible exceptions in order to not leak private types.

                            Also, what does "except requests.HttpError" do in an outer context? It checks the class of an exception, so either it doesn't catch some other module's version of requests.HttpError (confusion, invariants broken) or it does (confusion, invariants broken).

                        • dikei an hour ago

                          One way is to publish two variants of your package: one with the major version number appended to the package name and one without. Users who need to install multiple versions can use the first variant, while users who just want to follow the latest can use the second.
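
                          For example (hypothetical names), the same source tree could be published as both foolib3 and foolib, and consumers choose in their requirements:

                              # requirements.txt: two distributions built from one source tree (hypothetical)
                              foolib3==3.*    # versioned name; can coexist with foolib2 in the same environment
                              # ...or...
                              foolib>=3       # unversioned name; always tracks the latest major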

                          • pid-1 an hour ago

                            I have done the following in the past:

                            1. pip install libfoo==1.x.x

                            2. pip install libfoo==2.x.x --target ~/libs/libfoo_v2 # vendor libfoo v2

                            3.

                            import os
                            import sys

                            import libfoo                          # v1 from site-packages

                            original_sys_path = sys.path.copy()
                            libfoo_v1 = sys.modules.pop('libfoo')  # evict v1 from the import cache
                            sys.path.insert(0, os.path.expanduser('~/libs/libfoo_v2'))
                            import libfoo as libfoo_v2             # re-imported from the vendored path
                            sys.path = original_sys_path
                            sys.modules['libfoo'] = libfoo_v1      # later "import libfoo" still gives v1

                            There are caveats, of course, but it works for simple cases.

                            • physicsguy 3 hours ago

                              Does this not fall over when linking against a C library (not specifically a Python extension), as many Python libraries do?

                              For example, I write "library", of which v1 depends on somelib.so.1.0.0 and v2 depends on somelib.so.2.0.0.

                              If somelib has clashing symbol names, this can cause real problems!

                              • the_mitsuhiko 3 hours ago

                                Usually they don't show up in the global symbol namespace, so it's fine. It's only an issue for some libraries that load globally so that one C library can reference another.

                                • snatchpiesinger 2 hours ago

                                  Python dlopens binary wheels with RTLD_LOCAL on Linux, and I assume it does the equivalent on Windows.

                                  There were issues relatively recently with -ffast-math binary wheels in Python packages, as some versions of gcc generate a global constructor with that option that messes with the floating-point environment, affecting the whole process regardless of symbol namespaces. It's mostly just an insanity of this option and gcc behavior though.

                                • TZubiri 2 hours ago

                                  pip, venv, Poetry, Pipenv, now uv

                                  If you are still struggling with this in 2024, you are missing the actual challenges of the world.

                                  • dikei an hour ago

                                    Just like JS has npm, yarn, pnpm, bun...

                                    It seems the more users a language has, the more dev tools get written for it.

                                    • TZubiri an hour ago

                                      Maybe if you didn't import stuff like left-pad and actually wrote some code, you wouldn't have to write a dissertation on package management.

                                      • dikei an hour ago

                                        Actually, I prefer spending my time writing right-pad

                                    • simiones an hour ago

                                      So the fact that there are 5 different tools that (attempt to?) fix this problem is a sign that it is not a problem, from your point of view?

                                      • TZubiri an hour ago

                                        It's a sign developers get stuck in a paper bag. Same thing with text editors, ORMs, frameworks.

                                        Imagine 2050: flying cars, talking robots, teleportation, and John Developer is releasing solution 57 to a problem that was solved in the 90s.

                                      • globular-toast an hour ago

                                        Agreed, but who is struggling with it? Do you think tool development should just stop because it's not an "actual challenge"?

                                      • thingsgg 3 hours ago

                                        It's one thing to see if something like this is possible from a technical standpoint, but whether this is desirable for the ecosystem as a whole is a different question. I would argue that allowing multiple versions of packages in the dependency tree is bad. It removes incentives for maintainers to adhere to sane versioning standards like semver, and also the incentive to keep dependencies updated, because resolution will never be impossible for package users. Instead, they will get subtle bugs due to unknowingly relying on multiple versions of the same package in their application that are incompatible with each other.

                                        For lack of a better word, the single package version forces the ecosystem to keep "moving": if you want your package to be continued to be used, you better make sure it works with reasonably recent other packages from the ecosystem.

                                        • the_mitsuhiko 2 hours ago

                                          > It removes incentives for maintainers to adhere to sane versioning standards like semver

                                          Semver does not matter in this way. The issue with having a singular resolution, semver or not, is that you can only move your entire dependency tree at once. If you have a very core library then you are locked in unless you can move the entire ecosystem up, which is incredibly hard.

                                          • greatgib an hour ago

                                            Indeed, in my opinion it is the best way to end up in a cluster mess like the Node.js/npm ecosystem...

                                            And a very real issue is that young developers no longer know how to develop while keeping dependencies to a strict minimum. You have projects with hundreds of dependencies for no real reason other than laziness or always chasing the new shiny thing.