gpgme only builds against two versions of python at once
Closed, Resolved (Public)

Description

In Debian unstable, we're currently aiming to build every Python module against three versions of Python:

  • 2.7
  • 3.5
  • 3.6

At some point we might add 3.7; I don't know the full plan.

However, configure.ac in gpgme only looks for (at most) two Python versions.

At the moment I'm working around this with a hacky patch that hardcodes 3.6, but it would be nice if the upstream autoconf machinery scanned for all versions of Python on the system instead of stopping at two.

dkg created this task. Aug 18 2017, 4:42 AM
justus closed this task as Wontfix. Aug 21 2017, 11:24 AM
justus triaged this task as Wishlist priority.
justus claimed this task.
justus added a subscriber: justus.

Unfortunately, even building for two Python versions is a bit of a hassle with the existing autoconf framework for Python. I did that when porting the Python bindings back to Python 2, after we decided to also support version 2 so that people could start using our bindings even if they still needed Python 2. I don't see us extending it for more versions.

IIRC the Gentoo people do build for multiple Python versions, and the maintainer added support for that. Surely that needs more documentation, but still, it should be possible.

dkg added a comment. Jan 12 2018, 8:14 PM

It's too bad that this is not considered worth fixing upstream. At the moment, Debian's python3-gpg will only work with one specific version of Python 3 because of this, which makes package transitions more complex than they should be.

werner raised the priority of this task from Wishlist to Normal. Jan 13 2018, 5:01 PM
werner added a subscriber: werner.

The actual problem is that justus quit his job to work for pEp, so we have no maintainer for the Python port. There is one candidate for this job, but don't expect any fast fixes, because one of the near-term goals will be to replace SWIG so that we can also provide the bindings for Windows. Maybe that will also solve the problem with different Python versions.

werner reopened this task as Open. Jan 13 2018, 5:01 PM
BenM added a subscriber: BenM. Mar 20 2018, 1:54 AM
werner removed justus as the assignee of this task. Apr 17 2018, 12:48 PM
werner moved this task from Backlog to Python on the gpgme board. Apr 19 2018, 6:08 PM
BenM claimed this task. Apr 30 2018, 12:02 AM

The last change to the Python installer was, IIRC, one I discussed with Justus off-list around the middle of, um, last year? Maybe the year before?

Well, whenever it was, that's when the version of Python 3 it would find switched from 3.4 to whatever the most recent installed version was. The M4 code which runs that part of the installation is old, ugly, and very hacky.

In theory, getting it to install with any compatible version of Python it can find should be achievable, but getting that to work in M4 may be a bit of a stretch. Maybe we can add some kind of "snake hunter" routine to the GPGME configure script to find all the valid Python installations and apply the installer to each of them? Though, depending on how it's done and the number of Python installations available, it may add a significant amount of time to the installation.
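Just to illustrate the idea, a "snake hunter" could be sketched in plain Python rather than M4. Everything here is hypothetical: the function name and the candidate interpreter list are assumptions about common installations, not anything the real configure script uses.

```python
import os
import shutil

def find_pythons():
    """Hypothetical sketch: look for common pythonX.Y interpreter
    names on PATH and return the absolute paths that exist."""
    candidates = ["python2.7"] + ["python3.%d" % minor for minor in range(4, 13)]
    found = []
    for name in candidates:
        path = shutil.which(name)  # None if the name is not on PATH
        if path and os.access(path, os.X_OK):
            found.append(path)
    return found
```

In the real build this logic would have to live in M4 or portable shell, which is exactly where it gets awkward.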

Not least because each Python installation ends up with its own compiled lib bundle for GPGME. My most recent installation on the OS X laptop has the libgpgme dynamic libraries (*.dylib) over in /usr/local, but I also have the Python bindings for Python 2.7 and 3.6. The bindings were built as eggs, which ought to be found installed in the relevant site-packages directory (by default that should be /usr/local/lib/pythonX.Y/site-packages/module-name/ or something similar). A copy of those egg directories, and all they contain, remains within the build directory.

These commands were called from within lang/python in my most recent build directory:

bash-4.4$ file /usr/local/lib/libgpgme.11.dylib
/usr/local/lib/libgpgme.11.dylib: Mach-O 64-bit x86_64 dynamically linked shared library, flags:<NOUNDEFS|DYLDLINK|TWOLEVEL|NO_REEXPORTED_DYLIBS|HAS_TLV_DESCRIPTORS>
bash-4.4$ file python2-gpg/lib.macosx-10.4-x86_64-2.7/gpg/_gpgme.so
python2-gpg/lib.macosx-10.4-x86_64-2.7/gpg/_gpgme.so: Mach-O 64-bit x86_64 bundle, flags:<NOUNDEFS|DYLDLINK|TWOLEVEL>
bash-4.4$ file python3-gpg/lib.macosx-10.9-x86_64-3.6/gpg/_gpgme.cpython-36m-darwin.so
python3-gpg/lib.macosx-10.9-x86_64-3.6/gpg/_gpgme.cpython-36m-darwin.so: Mach-O 64-bit x86_64 bundle, flags:<NOUNDEFS|DYLDLINK|TWOLEVEL>
bash-4.4$

The bundles built for each Python installation are about two and a half times the size of the .dylib file, so I'm assuming they contain the rest of the relevant libraries accessed by the version of gpgme.h interpreted at compile time (to draw a reference back to the "why should that make a difference" thread on -devel: *this* is why).

As for why it has a .so extension and not something OS-specific: it's a binary, and I don't think the name particularly matters, so it landed on the *nix default. Most people won't even see it, even if they do play with the bindings themselves regularly.

Now, here's the thing: those .so files are all compiled by whichever toolchain the configure scripts select, probably GCC and maybe sometimes Clang, depending on the system. Do we really need a separate one built for every Python version? Or can we build a Py2 version and a Py3 version and let all the minor releases share those? Or would that be too likely to encounter conflicts of some kind?

Personally I'd err on the side of caution and not breaking things just to get more of these snakes on these plain old systems.

Sorry, couldn't help it. 😉

Still, I have no problem at all with taking the cautious approach and building each one individually, even if it adds significantly to install time. It will probably still be a tiny amount of time compared to some other things floating around these days (I recently looked at what goes into a Rails stack; that's a lot of code).

BenM added a comment. Jul 23 2018, 10:31 AM

While performing some initial investigation into the discrepancies observed between compiling GPGME directly and the subsequent SWIG-generated object for T4086, I confirmed that multiple installations would be relatively easy to achieve if performed as a post-build process. This would have the added advantage of being more readily customisable by downstream package maintainers, and not just for Debian; it could be made to work with other distributions or other POSIX systems too.

Once the precise method of doing that is finalised, it should be a simple matter either to automate it from the standard configure/make dance or to run it manually. It should also make it easier for others to be more selective about which specific Python installations they want the bindings installed to.

There is, however, a price for everything, and in this case it means making sure the current process only ever builds the modules for a generic Python 2 (which would always be 2.7) and a generic Python 3, but no longer installs either of them. That is the point where the new scripts or commands would take over. This stage would locate (or be told) the specific Python installations to install to, build for each of them via the setup.py generated by the existing process, copy the resulting module into the corresponding site-packages location, and leave the relevant build output in the lang/python/ directory as is currently the case, except indicating which path or installation it is for. That could be done either by changing the slashes or backslashes to underscores, or by creating a matching directory structure of what would normally be the $PREFIX for that Python version (as distinct from the $PREFIX for GPGME itself).

The "quick and dirty" way to do this would be a shell script, but in all likelihood it will ultimately be implemented in Python 3 (with a little subprocess and a lot of os.path with os.walk). As with installing the rest of GPGME, this will need root or admin access (depending on the system; single-user environments like Termux or policy-controlled systems like SELinux are obviously different).
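The naming convention described above (slashes in the interpreter path becoming underscores) could be sketched like this; the function name, the dictionary layout, and the `build-` prefix are all illustrative assumptions, not part of any real installer:

```python
def plan_install(interpreter):
    """Hypothetical sketch: given the absolute path of a Python
    interpreter, return the commands a post-build installer would run
    for it, plus a renamed build directory that records which
    installation it was built for (path separators become underscores)."""
    tag = interpreter.lstrip("/").replace("/", "_")
    return {
        "build": [interpreter, "setup.py", "build"],
        "install": [interpreter, "setup.py", "install"],
        "renamed_build_dir": "build-" + tag,
    }
```

So a build against /usr/bin/python3.6 would leave its output in something like `build-usr_bin_python3.6`, keeping each installation's artefacts distinguishable under lang/python/.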

BenM added a comment. Jul 25 2018, 3:46 PM

This question and some of the answers to it on StackOverflow indicate some of the difficulties in getting SWIG-generated Python modules to install at all. Essentially, the easiest method currently available, without extensive customisation of the setup.py file (which would need to be done for both Python 2.7 and Python 3.x), is to run /path/to/specific/pythonX.Y setup.py build, follow that with /path/to/specific/pythonX.Y setup.py install, and then rename lang/python/build to a directory or path name that indicates which version of Python was used and where it was installed to.
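That two-step dance could be driven from a small wrapper like the following sketch; the function name and the `dry_run` flag are illustrative assumptions, not part of GPGME's build system:

```python
import subprocess

def build_and_install(interpreter, setup_dir, dry_run=True):
    """Hypothetical sketch: run `setup.py build` then `setup.py install`
    with one specific interpreter, as described above. With the default
    dry_run=True the commands are only returned, not executed."""
    commands = [
        [interpreter, "setup.py", "build"],
        [interpreter, "setup.py", "install"],
    ]
    if not dry_run:
        for cmd in commands:
            # Abort the whole run if any step fails for this interpreter.
            subprocess.run(cmd, cwd=setup_dir, check=True)
    return commands
```

Looping that over every interpreter found on the system would give the per-version installs discussed in this thread, at the cost of one full build per Python.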

dkg added a comment. Oct 17 2018, 12:53 AM

What's the status on this? I'd love to be able to build binaries for both Python 3.6 and 3.7 for Debian. As it stands right now, the Python 3.7 continuous integration test for Debian is failing.

"dkg (Daniel Kahn Gillmor)" <noreply@dev.gnupg.org> writes:

what's the status on this? i'd love to be able to build binaries for
both python3.6 and 3.7 for debian. as it stands right now, the
python3.7 continuous integration test for debian is failing
https://ci.debian.net/data/autopkgtest/unstable/amd64/g/gpgme1.0/1158040/log.gz.

This should already be possible, iirc the Arch Linux maintainer patched
it in. I believe there is a 'prepare' target that takes care of all the
preparations (duh), and then you can build for every Python version by
executing the Python build system with the Python version of your choice.

werner changed the task status from Open to Testing. Oct 18 2018, 11:48 AM
dkg added a comment. Oct 19 2018, 11:47 PM

@werner, thanks for rMff6ff616aea6 -- I've backported it to Debian's packaging and it lets us cleanly build against all installed versions of Python.

BenM added a comment. Oct 20 2018, 12:53 AM

This should already be possible, iirc the Arch Linux maintainer patched
it in. I believe there is a 'prepare' target that takes care of all the
preparations (duh), and then you can build for every Python version by
executing the Python build system with the Python version of your choice.

That's along the lines of a variation I was working on: using the basic build to generate the components needed prior to building the bindings, and then launching each Python version with two "setup.py build" calls, followed by a "setup.py install" call, worked pretty consistently every time. I guess I don't need to continue that if this iteration works.

BenM added a comment. Nov 3 2018, 12:43 PM

While this is now ideal for Debian, it may cause conflicts for other downstream vendors with slightly different needs when building their packages, in particular the FreeBSD ports and/or pkg system.

Being able to install to all versions (or all detected versions) of Python is great, even as a default, but we must also have a way of explicitly specifying a single Python installation.

BenM added a comment. Nov 3 2018, 12:54 PM

MacPorts doesn't currently ship the bindings at all, but I'll see what they need to make that a reality too.

It would most likely be included as a variation for when installing GPGME. I've usually modified the configure arguments in the portfile manually and never had any problems.

werner closed this task as Resolved. Nov 5 2018, 8:44 AM

I consider this bug to be solved.

If someone needs an option to build against just a single Python version, a new feature request should be opened.