Update pip package #667
We should remove this package indeed, or update it with a message suggesting to install via conda instead.
Please update instead of removing it ;)
It would indeed be nice to update the PyPI package instead of removing it :-)
@jf--- @tpaviot Friendly ping. 😇 Do not know how hard it would be, but would you accept a PR to fix this? Or is it simply that you do not want to support non-Conda installations at all? If so, is there a technical reason/limitation? It seems to me like Python wheels could help nowadays distributing the package through PyPI, but I do not know much about them, so I may be missing something.
Hello, does Python OCC work on Python 3.7 or 3.8? Why hasn't it been updated?
I'm not a conda user, is there any benefit to making something a conda package instead of a pip package? I've honestly never seen that done before.
Related to Azure/azure-functions-core-tools#796
@traverseda a bit of history. When I started pythonocc, in 2008 (12 years ago!), pip was only able to install 100% pure Python packages. It was not able to install precompiled binary packages that contain C or C++ code. At that time, pip used to download the source code and launch the build process on your machine. That was perfect on Linux, where a C/C++ compiler is installed by default, but not for Windows users, who usually don't have a C/C++ compiler installed. Most of the questions I received were: "How do I install mingw32 on my machine?". A nightmare.

In 2010, when the project started to spread, the question was not "pip" or "conda" or whatever, but: "what's the best way to distribute precompiled binaries for each platform?". conda was born to solve this issue. It saved many package maintainers' lives. Later, Python eggs/wheels provided the same solution. It became possible to easily install pythonocc using conda, and everybody was OK with that. Nobody asked for another solution, since this one does the job.

Now, here is the point: creating/maintaining a pip package is a dozens-of-hours commitment per year. I have serious doubts that this investment would bring any valuable improvement, compared to the same amount of time spent on other issues. Just convince me. Or take the lead on this topic.
@tpaviot In an attempt to do so... 😇

PyPI packages are more accessible, and I think that is a pretty good reason. If you have Conda, you can still install packages with pip; if you only have pip, you cannot install Conda packages. You could argue that those people can install Conda, but sometimes that is not possible. Nowadays many people deploy projects to the cloud, and many cloud services do not offer Conda-based images (at least not in their "free" tiers). Some offer Docker-based images (which allow you to use Conda or whatever you want), but at a different price (usually higher, of course). This can hurt even toy projects. I have been in a situation where I did not deploy a toy project simply because it was going to cost me money if deployed with the Conda requirement (and I was of course not going to make any money out of it).

You do have a maintenance burden when adopting pip, but maybe it is not much more than the burden you have for Conda (I may be wrong here though, since I have no real experience packaging wheels). What I would suggest is not to have both packages, but only the wheels, since those can be installed by both pip and Conda users.
The main argument against wheels is that you can't bundle binary-only packages. Conda allows this, and this is the reason why we started creating conda packages. Also, conda provides a nice build infrastructure to create packages, which is not given otherwise (especially for compiled packages). One other issue is the maintenance overhead @tpaviot already mentioned: building one pythonocc configuration already takes an hour, plus another hour for occt. Multiply this by the number of Python versions, architectures, and operating systems. It could be possible to combine the occt and pythonocc conda packages into a wheel, avoiding recompilation. This is IMHO the only practical way for a small team of maintainers with limited compute resources and spare time.
@rainman110 Thanks for the clarification. Since this is an OSS project, did you have a look at GitHub Actions to maybe take care of that building process for you? Limits seem to be generous.
We're already running on Azure, which is pretty similar to GitHub Actions.
Interesting talk related to conda-press here: https://www.youtube.com/watch?v=ovMqBLspkK4 @scopatz
Hi there. I tried to use conda-press, but could not get it to work.
PRs to conda-press are very welcome!
Another reason for supporting pip: Read the Docs. When you host documentation on Read the Docs, you can choose your build system: either pip or conda, but not both. Locally, on your own system, you can of course use both. But not on RTD.
So what is the problem with conda and RTD that would be solved by pip?
@adam-urbanczyk More packages are available on PyPI. In my project, there are packages that can only be installed through pip.
Well, you can use conda+pip (also on RTD), take a look e.g. here: https://github.com/CadQuery/cadquery/blob/2d721d0ff8a195a0902eb9c3add88d07546c05b1/environment.yml#L23
Thanks! I didn't know about the `pip:` section in `environment.yml`.
So the main reason why you should support pip is so that I can add pythonocc as a dependency in my own project's `setup.py` or `requirements.txt`, and users can install my project without any manual steps. That's a pretty big difference, and it opens up the potential for this to work with projects like python-appimage and pyinstaller.
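(For illustration: if an up-to-date wheel existed on PyPI, depending on it from a downstream project would be a one-liner. The project name and version pin below are hypothetical, not a published package.)

```python
# Hypothetical downstream setup.py: pythonocc declared as an ordinary
# pip dependency, which only works if current wheels exist on PyPI.
from setuptools import setup

setup(
    name="my-cad-app",            # illustrative project name
    version="0.1.0",
    packages=["my_cad_app"],
    install_requires=[
        # Illustrative pin; today's PyPI package is stuck at 0.16.
        "pythonocc-core>=0.18.1",
    ],
)
```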
What's more, the conda package manager is often extremely slow and unsuccessful in resolving dependencies, in which case it does not allow installing the other packages either.
It also doesn't seem to work with existing virtualenvs? Or at least I can't get it to work. So that means no pipenv, no poetry, no pipx, etc. Using conda locks you out of a lot of the Python ecosystem, and I think that's only going to get worse over time.
I don't know about python-appimage, but you can definitely bundle packages with pyinstaller from a conda env; I used it to build a pythonocc-based app with success. If this is so important to you, you should consider contributing to e.g. conda-press (see above).
There are good reasons to support pip, without any doubt! But there are currently technical limitations, or at least difficulties, that make the creation of wheel packages very hard. So instead of focusing on the pip vs conda discussion, the discussion should be more about how to actually make pip packages! @traverseda Feel free to contribute!
Each package management system is very specific, and requires strong technical skills as well as long experience. Nobody here ever decided to drop pip support in favor of conda. Five years ago, it was easier to get it done with conda, that's all. And then we had to maintain the package, and everybody lived happily until @roipoussiere opened this issue 😊 There are plenty of cool package management systems: pip, apt, yum, brew, choco, etc. It is impossible to provide packages for all of them. I hope someday someone will try to standardize this mess, but in the meantime anyone is indeed invited to contribute a pip/anything-else package.
Right, so I don't have a lot of experience with build systems, aside from occasionally changing a few flags here and there. My day job and my failing business keep me pretty busy these days, but it's great to know that's something the maintainers here are interested in.

Looking through the install directions, I notice that the build relies on particular libraries being installed on the host OS? Most of the wheel packages I've seen pull down all the sources and build them; basically they do a static build and don't rely on the OS package manager at all. It's my understanding that this makes the whole thing a lot easier to work with, once you actually get it running. Does that make sense? Having the install download/include all its dependencies? Or is that going to end up being too much?

Basically, I'm imagining that if we can get cmake to build the whole thing as a completely static library, not really linking against any host packages, that should be most of the battle. Anyone following this issue with significant cmake experience? Because that's where I'm going to struggle the most. Other than that, I'm not sure what technical limitations you're expecting; like I say, I don't really have experience with conda or what problems it's solving. I think if we can get the cmake stuff working, I should be able to get it building/packaging from the `setup.py` side.

But like I say, I don't really know what I'm talking about here, so please feel free to correct me if I'm oversimplifying or misunderstanding anything.
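(For concreteness, here is a minimal sketch of that direction, not the project's actual build: a `setup.py` that delegates compilation to CMake, in the style of the well-known pybind11 `cmake_example`. The module name `OCC`, the CMake flags, and the assumption that a top-level `CMakeLists.txt` produces a single extension are all illustrative.)

```python
# Sketch: let setuptools call CMake to build the extension in-tree,
# so "pip install ." or a wheel build compiles everything itself.
import os
import subprocess
import sys

from setuptools import Extension, setup
from setuptools.command.build_ext import build_ext


class CMakeExtension(Extension):
    """A placeholder Extension: the real build is delegated to CMake."""

    def __init__(self, name, sourcedir=""):
        super().__init__(name, sources=[])
        self.sourcedir = os.path.abspath(sourcedir)


class CMakeBuild(build_ext):
    def build_extension(self, ext):
        # Directory where setuptools expects the built module to appear.
        extdir = os.path.abspath(os.path.dirname(self.get_ext_fullpath(ext.name)))
        cmake_args = [
            f"-DCMAKE_LIBRARY_OUTPUT_DIRECTORY={extdir}",
            f"-DPYTHON_EXECUTABLE={sys.executable}",
            "-DCMAKE_BUILD_TYPE=Release",
        ]
        os.makedirs(self.build_temp, exist_ok=True)
        subprocess.check_call(["cmake", ext.sourcedir] + cmake_args, cwd=self.build_temp)
        subprocess.check_call(["cmake", "--build", "."], cwd=self.build_temp)


setup(
    name="pythonocc-core",
    version="0.18.1",
    ext_modules=[CMakeExtension("OCC", sourcedir=".")],
    cmdclass={"build_ext": CMakeBuild},
)
```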
I think pip is probably a good option; I don't think anyone is going to be concerned about you not providing apt or yum packages, that's a job for distro maintainers. I also don't imagine anyone would be too worried about you not providing conda packages if a pip package is available? The goal here would be to try and replace the conda package with a pip package? Maintaining both seems like it would be a pain.
Avoiding monolithic packages like this is exactly why conda was invented. With conda you can pull OCCT from e.g. conda-forge and just use it as your dependency. I imagine some of the current users would be worried, because such a monolithic pip package could possibly break their env.
What do you mean by monolithic? You're saying that we'd need to also make gmesh a package, and link against that?
PythonOCC obviously depends on OCCT, which in itself has nothing to do with Python at all (it is a C++ lib). I assumed that you want to distribute it in your package, hence the word monolithic. That could result in issues if I used another pip package that also distributes a (possibly incompatible) OCCT.
I mean, obviously it wouldn't install the compiled OCCT to the same place on the OS as the native package manager. It would install it into a folder belonging to that particular package, meaning you could have different versions of the library in different virtualenvs and all that stuff. At no point should a pip package install statically-compiled files anywhere but into its own package namespace. My understanding is that the conda package already distributes it that way, and it's only the compile step that relies on the native package manager (apt/yum/whatever) having OCCT/SMESH installed? That's what I understood from going through the conda build recipe, anyway.
I'm not referring to the OS but to a different Python package depending on OCCT. On Linux you can use rpath for isolation, but what do you intend to do on Windows? Static linking, or adding UUIDs to the DLL names? I'm just trying to point out that it will likely become a mess rather fast. Conda, BTW, does not work like that: it has a separate package for OCCT and a separate package for PythonOCC which depends on the former. The compile step (conda build) actually uses conda to fetch the deps, so you can rely on other people providing OCCT builds.
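(For what it's worth, a common pattern in binary wheels, which is what auditwheel does on Linux and delocate on macOS, is to ship the native libraries inside the package and make them findable at import time. A minimal sketch, assuming the wheel bundles its libraries in a `.libs` folder next to the extension; the folder name and module layout are hypothetical.)

```python
# Hypothetical _load_libs.py, run from the package's __init__: make
# bundled native libraries findable before the compiled extension
# module is imported.
import os
import sys

_libdir = os.path.join(os.path.dirname(__file__), ".libs")

if sys.platform == "win32" and hasattr(os, "add_dll_directory"):
    # Python 3.8+ on Windows: extend the DLL search path for this
    # process only, so each env/package resolves its own copies.
    os.add_dll_directory(_libdir)
# On Linux/macOS this step is unnecessary: auditwheel and delocate
# rewrite the extension's RPATH/install names at repair time so it
# finds the bundled copies relative to itself.
```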
I, for one, would be *very* unhappy to have a pip package instead of a conda package.

Conda is far superior to pip, especially in managing complex packages with C, C++, and other libraries, which is the main motivation for developing conda in the first place: scientific and engineering packages often involve complex library dependencies that pip is not good at managing.

If you don't know about that, you should read some of the history of conda -- it would be worth your time, especially if you are going to continue editorializing about it.

Steve
Can you explain a bit more about how not providing a conda package would impact your workflow? Does your environment not use any pip packages? As an end-user of the library, how does not packaging via conda actually change anything? Are you swapping out the shared libraries each package depends on? I get that for a long time it was pretty much the only way to distribute binary packages, but other than that I'm having a hard time finding any advantages for end users, other than it being an alternative/wrapper for virtualenvs.

One of the big advantages of pip for binary packages is that you can have a binary distribution (wheel) and a source distribution (sdist) in one pip package. That means if I want to install a package on a weird CPU architecture, pip will just build the package for that architecture. I've found that very useful when working on projects in the embedded world, particularly using buildroot-based Linux distros with AArch64 numpy.

I mean, before you're too dismissive about the pip ecosystem, you should take a moment to understand *why* a lot of users want it, aside from just not wanting to be forced into the conda ecosystem.
How does this conda package solve that problem? It does allow you to install different versions of the package into different environments, right? Is it bundling its own linker?

CMake is definitely an area I'm weak in, but I'd probably try to follow along with something like this: https://martinopilia.com/posts/2018/09/15/building-python-extension.html. Most of these presume a single shared library per Python package. Not sure if that would be a deal breaker, as I'm still having trouble understanding what advantages the conda package provides, aside from being easier on the maintainers.

A lot of bigger packages do have a conda package *and* a pip package; I think it's a lot easier to go from wheel to conda than the other way around. I agree that for a package like this it would probably be a lot of work. Basically, it would mean maintaining two completely different build systems, which is not something I think anyone would want.
> Can you explain a bit more about how not providing a conda package would impact your workflow? Does your environment not use any pip packages? [...]

I do not use any pip packages; I exclusively use conda. The conda package manager comes with its own "pip", so you *can* do "pip install" inside a conda environment. However, if you want to build a conda package that has dependencies for which conda packages are not available, you will want to create conda packages for them. Fortunately, if they are pure Python packages, that is almost trivially easy: conda has a command `skeleton` that can create a conda "recipe" from a PyPI package, and then you just run `conda build [package]` on the recipe, done. I have done that many times, since I use only conda to build my packages.

> One of the big advantages of pip for binary packages is that you can have a binary distribution (wheel) and a source distribution (sdist) in one pip package. [...]

I don't find that to be an advantage, but then I do not work in the "embedded world", so I can't speak to that. I only deploy my apps on Linux, Windows, and OSX.

> I mean, before you're too dismissive about the pip ecosystem, you should take a moment to understand *why* a lot of users want it [...]

I understand a lot about the pip ecosystem, since I have been programming in Python for more than 20 years, so I used only pip until a few years ago when conda was invented.

> How does this conda package solve that problem? It does allow you to install different versions of the package into different environments, right? Is it bundling its own linker?

It does bundle a lot, but if you are interested in that level of detail you need to consult the docs. Conda is quite powerful and can isolate its virtual environments (which are different from pip's) extremely well.

> A lot of bigger packages do have a conda package *and* a pip package; I think it's a lot easier to go from wheel to conda than the other way around. [...]

Actually, I think it's easier to go from conda package to wheel. Earlier in this thread, Thomas included a link to Anthony Scopatz's excellent presentation on conda-press; I strongly recommend watching it if you are interested in this level of detail! https://www.youtube.com/watch?v=ovMqBLspkK4
I mean, that's just not how 99% of the Python world operates. Are there any end-user features you'd be missing if this was a pip package? Are you dynamically swapping out shared libraries? Are you linking directly to specific shared libraries from C/C++ programs embedded alongside the Python code?
I will give @scopatz's talk a watch when I have a moment later tonight. I'm a big fan, "xonsh" is my primary shell.
I'm not referring to how conda isolates envs; I'm asking how you want to isolate different packages (say NetGen and PythonOCC) that depend on OCCT (a shared lib) installed in the same env. To my understanding there is no good way of doing this with pip, because it is not (does not want to be?) a "general" package manager. Conda, on the contrary, is, and allows you to just install an OCCT package.
In my company we use Databricks as an AI development environment, and in the newer versions of Databricks conda was completely removed and replaced with virtualenv and pip. The only way to work around this is to use custom Docker containers, but that is not a very good solution because some features then don't work anymore. So is there a way to use pythonocc with Databricks without using a custom Docker container?
We use Azure Functions on Python, so we're limited to pip-installable packages only. I tried to build a wheel myself, with no luck so far. How would one approach the task of building a standalone wheel for pythonocc-core from the existing linux-64 packages available on conda-forge? I'm at my wit's end here.
Just want to point out that cadquery's OCCT bindings are already published as a wheel on PyPI. If that can be published and used by the cadquery package, is there a limitation to also publishing pythonocc-core?
The pythonocc-core pip package has not been updated since Dec 4, 2014 and still provides pythonocc-core v0.16. It should be updated to the latest version, v0.18.1.