How to build packages outside the conda-forge public infrastructure?

Hello guys,

I feel a little bit confused: I’m trying to build a set of packages and I don’t really know which tools to use and how to use them. What I’ve tested so far:

  • conda-build: it works, but it is slow
  • conda-build + boa: it’s definitely faster, but it is also stuck on libmamba 1.4.2 (1.5 is available, and multiple minor versions exist in the 1.4 series). Is this project still maintained?
  • conda-build + conda-libmamba-solver + “solver: libmamba” in the “.condarc”: it seems unstable (wrong libraries are installed, leading to “Library not found” errors during tests, and random errors occur during dependency resolution)
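
For context, the solver selection I mean is just this fragment in “.condarc” (a minimal sketch; the rest of my configuration is omitted):

```yaml
# ~/.condarc — select the libmamba solver for conda/conda-build
solver: libmamba
```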

It’s important to note that:

  • I’m not using the public conda-forge infrastructure; I build multiple projects locally on a private network.
  • I use the pinned versions from “conda-forge-pinning-feedstock” as input for conda-build

What’s the best way to build packages in this context? Do you have any advice? Is there any documentation somewhere for this use case?

Thank you for your help.

Hi, I am curious about these incompatibilities. Can you tell me more about which conda/conda-build/conda-libmamba-solver versions you were using? I can probably assist with some fixes.

That aside, regarding your question, your best bet is probably to use the provisioning scripts under .scripts, possibly with some adjustments.

Personally I just use conda build and accept that it’s kind of slow. :slight_smile:

We typically use conda build with the libmamba-solver. That works great for us.

I think that mixing the Anaconda default channels with conda-forge can cause problems in some installations. We just remove the Anaconda defaults and use conda-forge as the default channel. This is what you get if you install using the Miniforge distribution instead of Anaconda or Miniconda.
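
A minimal “.condarc” along these lines does that (a sketch; `channel_priority: strict` is a common additional recommendation, not strictly required):

```yaml
# ~/.condarc — use conda-forge only, drop the Anaconda defaults
channels:
  - conda-forge
channel_priority: strict
```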

Also on this topic: if you are building pure Python packages, I recently wrote the whl2conda tool to convert pure Python wheels directly to conda packages. Because it does not need to build any environments, it only takes seconds. You still need to build an environment if you want to test your conda package, of course.
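
The workflow is roughly along these lines (a sketch with placeholder paths and package names; check the whl2conda documentation for the exact subcommands and flags):

```shell
# Build a pure-python wheel without its dependencies, then convert it
# to a conda package. "mypkg" and the wheel filename are placeholders.
pip wheel --no-deps -w dist/ .
whl2conda convert dist/mypkg-1.0-py3-none-any.whl
```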

Actually there are no incompatibilities, only a bug in one package (libabsl_cord.so.2301.0.0: cannot open shared object file · Issue #274 · conda-forge/rasterio-feedstock · GitHub) that will hopefully be fixed soon.

For the moment these scripts seem to use mambabuild, and that is what we use for our projects.

What’s the best strategy for compiling software that uses native libraries? Do you use the global pinning configuration file (https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml)? Is it actually a good idea to use this file?

We use this file with conda-build to make sure our software can find the packages/libraries it needs, but we still get conflicts when resolving the dependencies and our build gets stuck (e.g. hdf5 causes conflicts with gdal and opencv). Is that normal? How are we supposed to manage a lot of third-party native libraries when packaging multiple pieces of software?
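
For reference, this is essentially how we pass it in (a sketch; the recipe path is a placeholder, and -m/--variant-config-files is the conda-build flag that takes the pinning file as a variant config):

```shell
# Build a recipe against the global conda-forge pinnings
conda build -m conda_build_config.yaml my-recipe/
```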

Awesome! This is super useful. Does the tool create packages for the dependencies recursively if needed?

conda-smithy @ main contains changes for more configurability here and will be released soon. That will allow you to set conda_build_tool: conda-build+conda-libmamba-solver instead of the current mambabuild default.
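
Presumably that will end up as a one-line entry in conda-forge.yml, something like this (a sketch based on the option name above; check the conda-smithy release notes once it is out):

```yaml
# conda-forge.yml
conda_build_tool: conda-build+conda-libmamba-solver
```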

No, whl2conda does not check for the presence of dependencies. Maybe in some future release once it is a little more mature; besides, doing that recursive dependency check is pretty much the same as solving an environment, which I am trying to avoid. Mostly this is intended to speed up construction of conda packages that are under your own control and whose dependencies you understand (and have already converted if necessary).
