Using wheels to distribute Python packages
If your SPK uses Python packages you can use the 'wheel' format to distribute the packages together with your SPK. The wheel format is specified in PEP 427 and the Python packaging documentation.
Generally speaking, there are three types of Python packages:

- Pure-python packages. The wheels are platform independent and, for the most part, self-contained in terms of dependencies;
- Packages with (optional) C-extensions. These packages have to be compiled with GCC and require a (cross-)compiled Python to be available, along with a Python `crossenv` being set up;
- Packages with an enforced limited API/ABI, where you may need to limit API compatibility to Python 3.x (`cp3x`) and ABI to Python 3 (`abi3`). Other than that, these are similar in every way to (cross-)compiled wheels using the Python `crossenv`.
For spksrc specifically, we also define a fourth type of package, which meets either of the following criteria:

- Packages with C-extensions that depend on other cross-packages at build time;
- Packages that need patches to be applied in order to create a working wheel.

This fourth type generally requires a new spksrc cross package to be created. Generally speaking, these packages will also require a (cross-)compiled Python to be available.
By default spksrc does not include pure-python wheels in the package but rather downloads them at installation time using `pip`.
Otherwise, spksrc uses three distinct requirement files to handle the wheels added to the `WHEELS` variable in the SPK's Makefile:

- `requirements-crossenv.txt` is used for cross-compiled packages using the Python `crossenv`;
- `requirements-pure.txt` is used for pure-python packages;
- `requirements-abi3.txt` is used for API/ABI limited packages.

Any other filename will be treated as the Python `crossenv` (cross-)compiled type (although the usual Python default is `requirements.txt`). This default behavior of (cross-)compiling using the Python `crossenv` can be changed by setting `WHEEL_DEFAULT_PREFIX=pure` as needed. In order to create reproducible builds, all the required packages should be pinned to a specific version (e.g. `docutils==0.17.1`).
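For illustration only, here is a minimal sketch of what an SPK's `WHEELS` declaration and a pinned requirement file could look like, assuming multiple requirement files can be listed in `WHEELS` as implied by the three-file scheme above (package names and versions are just examples):

```
# spk/<example>/Makefile (illustrative fragment)
WHEELS  = src/requirements-crossenv.txt
WHEELS += src/requirements-pure.txt

# src/requirements-pure.txt (one pinned wheel per line)
docutils==0.17.1
```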
The general building steps are the following:

- spksrc stores a `requirements-crossenv|pure|abi3.txt` in `$(WORK_DIR)/wheelhouse`.
- From there it compiles each of the needed types of wheels and stores the originals in `$(WORK_DIR)/wheelhouse`.
- Finally, spksrc renames each wheel so it always matches the machine name of the target DSM and copies it to `$(INSTALL_DIR)/$(INSTALL_PREFIX)/share/wheelhouse` for later packaging, along with creating a consolidated `requirements.txt` that includes all wheels of any given type. The renaming process is mandatory for arm arches as they need to refer exactly to the device machine hardware name (e.g. `armv5tel` and `armv7l`).
As for the fourth type of package (i.e. with C-extensions that depend on other cross-packages at build time, or that need patches to be applied), a `requirements-cross.txt` is generated in the background and added to the consolidated `requirements.txt` used at installation time.
The rule of thumb is that most packages will only require (cross-)compiled wheels. However, when building a `noarch` package, or if a wheel doesn't compile properly in the `crossenv`, we fall back to pure-python wheels.
By default the spksrc framework doesn't build pure-python wheels but instead downloads them using `pip` at installation time on your NAS device. As such, the `requirements-pure.txt` file is used to provide the list of wheels to be downloaded at installation time.
If needed, it is possible to force building these pure-python wheels by setting `WHEELS_PURE_PYTHON_PACKAGING_ENABLE = 1` in the SPK Makefile. With that set, pure-python wheels are managed exactly the same way as packages with C-extensions below, with the exception of using a `requirements-pure.txt` file.
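For example, a sketch of the SPK Makefile lines to force packaging of pure-python wheels (using the requirement file name described above):

```
# Illustrative fragment: build and package the pure-python wheels
# instead of downloading them with pip at installation time
WHEELS_PURE_PYTHON_PACKAGING_ENABLE = 1
WHEELS = src/requirements-pure.txt
```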
When building pure-python wheels using the `requirements-pure.txt` file, spksrc will clear all the build flags and use the `native/python310` host Python interpreter. The resulting wheels and `requirements-pure.txt` are stored in `$(WORK_DIR)/wheelhouse` for later processing and packaging.
For `noarch` packages the following is needed:

- Add `BUILD_DEPENDS += native/python310` to the SPK's Makefile. This ensures a host native Python is available and that other requirements are met.
- Define the location of `pip` so it points to the native Python, such as: `PIP = $(WORK_DIR)/../../../native/python310/work-native/install/usr/local/bin/pip`
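Putting the two `noarch`-related lines together, the relevant SPK Makefile fragment could look like this sketch:

```
# Illustrative fragment for a noarch SPK
BUILD_DEPENDS += native/python310
# Point pip to the host native Python interpreter
PIP = $(WORK_DIR)/../../../native/python310/work-native/install/usr/local/bin/pip
```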
Building (cross-)compiled wheels requires and does the following:

- Add `BUILD_DEPENDS += cross/python310` to the SPK's Makefile. This ensures that Python is (cross-)compiled and that other requirements are set up correctly to create a `crossenv` that includes `pip`, `wheel`, `setuptools`, `cffi`, `cryptography` and `poetry`. This crossenv is key to generating the (cross-)compiled wheels later on in the build process;
- Add the requirement filename to the `WHEELS` variable in the SPK's Makefile. We suggest using the default `requirements.txt` filename when there are only cross-compiled wheels, OR always using `requirements-crossenv.txt` when pure|abi3 wheels also have to be created;
- In order to create reproducible builds, all the required packages are pinned to a specific version (e.g. `mercurial==4.0.1`);
- spksrc will build a (cross-)compiled wheel using `pip`, including all the build flags and using the `crossenv` Python interpreter along with the `TC_ARCH` toolchain;
- The resulting wheels and `requirements-crossenv.txt` are stored in `$(WORK_DIR)/wheelhouse` for later processing and packaging.
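As a sketch, the SPK Makefile lines involved for (cross-)compiled wheels could look like this (the wheel list and pinned version are only examples):

```
# Illustrative fragment for an SPK with (cross-)compiled wheels
BUILD_DEPENDS += cross/python310
WHEELS = src/requirements-crossenv.txt

# src/requirements-crossenv.txt
mercurial==4.0.1
```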
abi3 (API/ABI limited) wheels are managed exactly the same way as packages with C-extensions above, with the exception of using a `requirements-abi3.txt` file. The API/ABI limitation can be set with the `PYTHON_LIMITED_API` variable, such as `PYTHON_LIMITED_API = cp35` (which falls back to the Python 3.5 API).
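A sketch of the corresponding Makefile lines, assuming an SPK that only ships abi3 wheels:

```
# Illustrative fragment: limit the API to Python 3.5 and the ABI to abi3
PYTHON_LIMITED_API = cp35
WHEELS = src/requirements-abi3.txt
```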
Usage of `cross/` Python wheels can often be circumvented by:

- Adding the needed `DEPENDS +=` to the SPK's Makefile (e.g. `DEPENDS += cross/c-ares`);
- Including the arguments needed by the package in the shell environment (e.g. `ENV += PYCARES_USE_SYSTEM_LIB=1`);
- Adding packages pinned to a specific version (e.g. `pycares==4.1.2`) into `requirements-crossenv.txt`.
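Taken together, the `pycares` example above could be sketched in the SPK Makefile and requirement file as follows:

```
# Illustrative fragment: link pycares against the spksrc-built c-ares
DEPENDS += cross/c-ares
ENV += PYCARES_USE_SYSTEM_LIB=1

# src/requirements-crossenv.txt
pycares==4.1.2
```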
When the above procedure is insufficient, we must then fall back to using a cross package:

- Create a new package in `cross/` with the correct details (see the sketch below). Add an include of `spksrc.python-wheel.mk` in its Makefile so spksrc knows how to build the package;
- Add `BUILD_DEPENDS = cross/python` to the SPK's Makefile. This ensures that Python is cross-compiled and that the `crossenv` requirements are set up correctly to create cross-compiled wheels;
- Add the new cross package to `DEPENDS` in the SPK's Makefile;
- In contrast to the other two types, this type of package should normally not be included in any `requirements-crossenv|pure|abi3.txt`.
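Here is a sketch of what such a `cross/<package>/Makefile` could look like; all values are placeholders, so compare with an existing cross package such as `cross/mercurial` for the exact conventions:

```
# cross/<package>/Makefile (illustrative sketch, placeholder values)
PKG_NAME = examplepkg
PKG_VERS = 1.0.0
PKG_EXT = tar.gz
PKG_DIST_NAME = $(PKG_NAME)-$(PKG_VERS).$(PKG_EXT)
PKG_DIST_SITE = https://files.pythonhosted.org/packages/source/e/$(PKG_NAME)
PKG_DIR = $(PKG_NAME)-$(PKG_VERS)

DEPENDS =

HOMEPAGE = https://example.org
COMMENT  = Example Python package with C-extensions
LICENSE  = MIT

include ../../mk/spksrc.python-wheel.mk
```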
The building steps are as follows:

- spksrc will (cross-)compile Python, then process `python-cc.mk`. Due to `include ../../mk/spksrc.python-wheel.mk`, spksrc creates a (cross-)compiled wheel. The wheel building process is invoked using a `python -c "import setuptools;..."` call instead of using `pip` as it does for the usual `crossenv` wheels.
- The resulting wheels and `requirements-cross.txt` are stored in `$(WORK_DIR)/wheelhouse` for later processing and packaging.
It is possible to pass arguments to the `pip` wheel building process by using the `WHEELS_BUILD_ARGS` variable, such as:

```
WHEELS_BUILD_ARGS = [Pillow]
WHEELS_BUILD_ARGS += build_ext
WHEELS_BUILD_ARGS += --disable-platform-guessing
WHEELS_BUILD_ARGS += --enable-freetype
WHEELS_BUILD_ARGS += --enable-jpeg
WHEELS_BUILD_ARGS += --enable-zlib
```

These will get converted into `--global-option=build_ext --global-option=--disable-platform-guessing ...` and will be captured by `setup.py` in the build process. The first argument of the list, in brackets `[...]`, must match the wheel name (and is case sensitive). Such extra arguments are only applied to the wheel named in that first argument; no other wheel builds are affected by them.
It is also possible to pass extra `CFLAGS`, `LDFLAGS`, `CPPFLAGS` and `CXXFLAGS` to the `pip` cross-compiling build environment. This is done similarly to the pip arguments above, such as:

```
WHEELS_CPPFLAGS = [numpy] -std=c++0x
```

The first argument of the list, in brackets `[...]`, must match the wheel name (and is case sensitive). The remaining arguments are appended to the default value of the corresponding FLAGS environment variable when calling `pip` for the cross-compiling build, only for that specific wheel.
It is possible to use non-PyPI URLs, such as GitHub, to download and install wheels. More information is available at https://pip.pypa.io/en/stable/topics/vcs-support/.

In order for wheels to be installed properly at destination, you must also specify the version that the resulting wheel will have as a suffix to the `egg=` portion of the URL, such as:

```
git+https://github.com/wiseman/py-webrtcvad@3bd761332a9404f5c9276105070ee814c4428342#egg=webrtcvad==2.0.10
```

When building the final package that includes the various wheels, the resulting `requirements.txt` file is post-processed to remove the URL prefix, leaving only `wheel_name==version`, such as `webrtcvad==2.0.10`. This allows `pip` to look for that wheel version on file when processing wheels at installation time on the NAS.
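For example, the entry as written in the build-time requirement file versus the post-processed entry in the consolidated `requirements.txt` shipped with the SPK:

```
# Build-time requirement file entry
git+https://github.com/wiseman/py-webrtcvad@3bd761332a9404f5c9276105070ee814c4428342#egg=webrtcvad==2.0.10

# Consolidated requirements.txt used at installation time
webrtcvad==2.0.10
```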
After the above, spksrc will resume its normal activities to build the SPK.
- To include `$(INSTALL_DIR)/$(INSTALL_PREFIX)/share/wheelhouse` in the SPK itself, add `rsc:share/wheelhouse` to the SPK's PLIST.
- In addition, the installer (usually `src/service-setup.sh`) should contain a line to install the wheels in a Python virtualenv on the target device at installation time. The generic format to create a virtualenv and install wheels for SynoCommunity packages uses pre-defined shell functions that are available at installation time:
```
# Define python310 binary path
PYTHON_DIR="/var/packages/python310/target/bin"
# Add local bin, virtualenv along with python310 to the default PATH
PATH="${SYNOPKG_PKGDEST}/env/bin:${SYNOPKG_PKGDEST}/bin:${PYTHON_DIR}:${PATH}"

service_postinst ()
{
    # Create a Python virtualenv
    install_python_virtualenv

    # Install the wheels
    install_python_wheels
}
```
The Mercurial SPK contains two Python packages: Mercurial itself, and Docutils, which is a dependency of Mercurial.
Mercurial needs cross-compiling because it contains C-extensions. In addition, it also has to be patched to ensure a working wheel is created. That means it's a package of the fourth type described previously. Docutils, on the other hand, is a pure-python package.
Starting off with Docutils:
The Mercurial SPK's Makefile sets `WHEELS = src/requirements.txt`. This requirements file contains `docutils==0.17.1` as its only entry.
As this package builds properly using the cross-compiling set by default, the requirement file keeps the default `requirements.txt` filename. This is all that needs to be done to create a Docutils wheel.
For Mercurial itself, a bit more is needed:
- `spksrc/cross/mercurial/Makefile` is created with the correct content. There's no need for dependencies in this case, as docutils is handled via the requirements file.
- The Makefile's include is set to create cross-compiled wheels: `include ../../mk/spksrc.python-wheel.mk`.
- The appropriate patches for Mercurial are added to the `patches` directory.
- A digests file should be created, to ensure the file download is not corrupted.
- The SPK's Makefile then needs `BUILD_DEPENDS = cross/python` to cross-compile Mercurial. `BUILD_DEPENDS` also contains `cross/mercurial` (although it could also be added to `DEPENDS`, as there's nothing in the PLIST).
- The last step is to add `rsc:share/wheelhouse` to the SPK's PLIST.
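Based on the steps above, a sketch of the wheel-related lines in the Mercurial SPK Makefile (all other mandatory SPK variables omitted):

```
# Illustrative fragment of spk/mercurial/Makefile
BUILD_DEPENDS  = cross/python
BUILD_DEPENDS += cross/mercurial
WHEELS = src/requirements.txt
```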
Building the SPK via `make arch-$(ARCH)` should now result in two wheels in `$(WORK_DIR)/wheelhouse`. The wheels are also stored in `$(INSTALL_DIR)/$(INSTALL_PREFIX)/share/wheelhouse`, but renamed to match the `uname -m` DSM architecture so the wheels are recognized as valid on the target device.

During the processing of the SPK's PLIST, the wheelhouse directory is copied to `$(STAGING_DIR)/share/wheelhouse`.
Once the Python packages are successfully created and included in the package, you'll need to make sure the wheels are installed.
- In the Mercurial installer, include the generic command to first create a Python virtual environment: `${VIRTUALENV} --system-site-packages ${INSTALL_DIR}/env > /dev/null`. Note that in some cases you might not want to use `--system-site-packages`.
- Install all available wheels into the virtual environment as follows: `${INSTALL_DIR}/env/bin/pip install --no-deps --no-index -U --force-reinstall -f ${INSTALL_DIR}/share/wheelhouse ${INSTALL_DIR}/share/wheelhouse/*.whl > /dev/null 2>&1`
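Putting those two installer lines together, a sketch of the corresponding `service_postinst` using exactly the commands above:

```
service_postinst ()
{
    # Create the Python virtual environment
    ${VIRTUALENV} --system-site-packages ${INSTALL_DIR}/env > /dev/null

    # Install all the wheels shipped in the package wheelhouse
    ${INSTALL_DIR}/env/bin/pip install --no-deps --no-index -U --force-reinstall -f ${INSTALL_DIR}/share/wheelhouse ${INSTALL_DIR}/share/wheelhouse/*.whl > /dev/null 2>&1
}
```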
- Generally speaking, you should start with the assumption that all the required Python packages are pure-python. When building a pure-python wheel fails, the build process will halt with an error, after which you can decide what to do. A good next step is to assume that one or more packages should be cross-compiled, which means adding `BUILD_DEPENDS = cross/python` and seeing if that works better.
- To identify whether a package is pure-python or not, in most cases the wheel's name can tell you. If the package is uploaded to PyPI you can search for it there. The following also applies to wheels created by spksrc:
- While debugging and finding the best configuration, it is possible to use a single `requirements.txt` file and prepend a `pure:`, `cross:` or `abi3:` prefix to the wheels that need it. From there, running `make spkclean` will clean up the wheelhouse so it is regenerated at make time. Note that a final package must not use any prefixes.
- It is required to pin packages to a specific version, e.g. `mercurial==4.0.1`. Other version specifiers are not allowed.
- It is required to add all requirements to the package. Upstream maintainers sometimes only list so-called top-level requirements for their packages and rely on `pip` to process dependencies during installation. This can cause issues during installation of SynoCommunity packages. To make sure all the requirements are included in your final `requirements.txt`, run `pip install -r requirements.txt` (without specifying `--no-deps`) in a separate virtualenv. After pip has processed all the requirements, run `pip freeze`, and use that output as the starting point for the final `requirements.txt` (see the sketch after this list).
- References to `setuptools`, `pip` or `wheel` should not be included in `requirements.txt`, or should be commented out.
- Python packages that are processed as `DEPENDS`, or cross packages, should not be included in `requirements.txt`, or should be commented out.
- Errors such as `command 'gcc' failed with exit status 1` mean that cross-compiling is required.
- In some cases, wheels appear to build successfully as a pure-python wheel, but fail to install or work correctly on the target. Make sure to test the package, and if you run into issues, try to cross-compile the wheel instead.
- Some native wheel code may not compile without `CFLAGS=-Wno-error=format-security`.
- Some wheel source archives, such as gevent, do not include the generated C code. Prefer downloading the wheel source archives from PyPI.org, e.g. https://pypi.org/project/gevent/1.4.0/#files.
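As referenced in the list above, a minimal sketch of the `pip freeze` workflow in a throwaway virtualenv on the build host:

```
# Create and activate a throwaway virtualenv
python3 -m venv /tmp/freeze-env
. /tmp/freeze-env/bin/activate

# Install the upstream top-level requirements, letting pip resolve all dependencies
pip install -r requirements.txt

# Capture the fully resolved, pinned set of packages
pip freeze > requirements-frozen.txt

# Use requirements-frozen.txt as the starting point for the final requirements.txt
deactivate
```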