Contributing to conda-forge
- Create conda packages of the repos you want to have available.
There are two ways:
A) Use conda skeleton: for things that are already on PyPI.
B) Use conda build with a custom-made meta.yaml: e.g. if you want to build a custom repo that is not on PyPI, or a version of a package that is not yet on PyPI.
Let's start with A).
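Route A can be as short as two commands (a sketch, using pyaaf2 as the example package; requires conda and conda-build installed):

```shell
# Route A: let conda skeleton generate a recipe (meta.yaml) from the
# package's PyPI metadata, into a ./pyaaf2 directory
conda skeleton pypi pyaaf2

# Then build the generated recipe locally
conda build pyaaf2
```

Route B is the same `conda build` step, just pointed at a meta.yaml you wrote by hand.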
Conda packages
I had to make a package for pyaaf2 first, since it wasn't on conda yet. From the recipe directory:
conda build .
package:
  name: "{{ name|lower }}"
  version: "{{ version }}"

source:
  url: "https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/{{ name }}-{{ version }}.tar.gz"
  sha256: 160d3c26c7cfef7176d0bdb0e55772156570435982c3abfa415e89639f76e71b

build:
  noarch: python
  number: 0
  script: "{{ PYTHON }} -m pip install . -vv"

requirements:
  host:
    - pip
    - python >=3
  run:
    - python >=3

test:
  imports:
    - aaf2
    - aaf2.model
    - aaf2.model.ext

about:
  home: "https://github.com/markreidvfx/pyaaf2"
  license: MIT
  license_family: MIT
  license_file: LICENSE
  summary: "A python module for reading and writing advanced authoring format files"
  doc_url: "https://pyaaf.readthedocs.io/en/latest"

extra:
  recipe-maintainers:
    - vvzen
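The sha256 in the source section above is what conda-build verifies the downloaded tarball against before building. A minimal Python sketch of that check (hypothetical helper name, stand-in bytes instead of a real tarball):

```python
import hashlib

def matches_recipe_sha256(data: bytes, expected: str) -> bool:
    """Return True if the payload hashes to the digest pinned in the recipe."""
    return hashlib.sha256(data).hexdigest() == expected

# Stand-in for the downloaded pyaaf2-x.y.z.tar.gz bytes
payload = b"pretend this is the source tarball"
pinned = hashlib.sha256(payload).hexdigest()

print(matches_recipe_sha256(payload, pinned))        # True
print(matches_recipe_sha256(b"corrupted", pinned))   # False
```

If the digest doesn't match, the build aborts, which is exactly what you want when a release artifact changes under you.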
CONFIG=osx64 ./.scripts/run_docker_build.sh
####################################################################################
Resource usage summary:

Total time: 0:21:02.0
CPU usage: sys=0:00:01.5, user=0:00:03.2
Maximum memory usage observed: 60.0M
Total disk usage observed (not including envs): 2.2K
- touch /home/conda/staged-recipes/build_artifacts/conda-forge-build-done
Then I forked the staged-recipes repo, made a branch, added the recipe, opened a PR, waited for the CI to kick in, updated the .yaml based on the initial feedback from the conda-forge linter bot, updated the PR based on the human review, and then had it merged! All of that in less than 1 hour!!
noarch: use it if it's a pure Python package, so a single build works on every platform.
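For contrast, a package with compiled extensions can't be noarch; a hypothetical sketch of what its build/requirements sections look like instead:

```yaml
build:
  number: 0
  # no "noarch: python" here: the package is rebuilt per platform

requirements:
  build:
    - "{{ compiler('c') }}"   # compiled extensions need a toolchain
  host:
    - pip
    - python
  run:
    - python
```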
Then, a new github repo was created:
conda-forge/pyaaf2-feedstock
And you can see that the package is now available on conda-forge: https://anaconda.org/conda-forge/pyaaf2
Now I can move on to OpenTimelineIO. In an ideal world, I could have just used conda skeleton, like this:
conda skeleton pypi opentimelineio
but since the latest PyPI version of OTIO has a tiny typo in its setup.py, I had to go the manual route once again..
conda build . --channel conda-forge
Notice that I added the conda-forge channel, since that's where the pyaaf2 package now lives.
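Instead of passing --channel on every build, the channel can be pinned in ~/.condarc (a minimal sketch):

```yaml
# ~/.condarc: search conda-forge before the defaults channel
channels:
  - conda-forge
  - defaults
```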
####################################################################################
Resource usage summary:

Total time: 0:04:00.6
CPU usage: sys=0:00:04.9, user=0:01:38.5
Maximum memory usage observed: 1.9G
Total disk usage observed (not including envs): 256.5K
package:
  name: "{{ name|lower }}"
  version: "{{ version }}"

source:
  - url: https://github.com/PixarAnimationStudios/OpenTimelineIO/archive/refs/tags/v0.13.tar.gz
    sha256: 33a63891b4656804242512e122b33ed12e35d4038fd78610ccb82b441b9506dd
  - url: https://github.com/pybind/pybind11/archive/refs/tags/v2.6.2.tar.gz
    sha256: 8ff2fff22df038f5cd02cea8af56622bc67f5b64534f1b83b9f133b8366acff2

build:
  number: 0
  script: "{{ PYTHON }} -m pip install --no-deps --ignore-installed . -vv"

requirements:
  build:
    - "{{ compiler('cxx') }}"
    - cmake
  host:
    - pip
    - pyaaf2
    - setuptools
    - python
  run:
    - pip
    - pyaaf2
    - python

test:
  commands:
    - make test

about:
  home: https://github.com/PixarAnimationStudios/OpenTimelineIO
  license: Apache-2.0
  license_family: Apache
  license_file: LICENSE.txt
  summary: Python API for interchange of editorial cut information
  doc_url: "https://opentimelineio.readthedocs.io"

extra:
  recipe-maintainers:
    - yournamehere
The problem was that OpenTimelineIO vendors some of its dependencies, so I had to at least add pybind11 as a second source.
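When a recipe lists multiple sources like this, conda-build downloads and extracts each of them; an optional folder key keeps them from unpacking on top of each other (a sketch, folder name chosen here for illustration):

```yaml
source:
  - url: https://github.com/PixarAnimationStudios/OpenTimelineIO/archive/refs/tags/v0.13.tar.gz
    sha256: 33a63891b4656804242512e122b33ed12e35d4038fd78610ccb82b441b9506dd
  - url: https://github.com/pybind/pybind11/archive/refs/tags/v2.6.2.tar.gz
    sha256: 8ff2fff22df038f5cd02cea8af56622bc67f5b64534f1b83b9f133b8366acff2
    folder: pybind11   # unpack the vendored dep into its own subfolder
```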
conda-forge builds on several CI providers:
- TravisCI: OSX, IBM Power 8+
- CircleCI: Linux, OSX
- Azure Pipelines: OSX, Linux (x86_64, native), Linux (ARMv8, emulated), Linux (IBM Power8+, emulated)
pip install --no-deps: don't install package dependencies.
python setup.py install --single-version-externally-managed: used by system package builders to create 'flat' eggs.
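Those pip flags are the same ones you see in the recipe's build script; a hedged sketch of what an equivalent standalone build.sh could look like (conda-build sets $PYTHON for you):

```shell
#!/usr/bin/env bash
# Sketch of a conda-build build.sh: install with pip, but let conda
# (not pip) resolve and manage the dependencies.
set -euo pipefail
$PYTHON -m pip install . --no-deps --ignore-installed -vv
```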
Use bash! fish is fun, but I was getting additional issues with it..
If you want to test the packaging locally, you'll need Docker; otherwise you can try packaging just for your own system.
https://packaging.python.org/overview/