Developing, Packaging and Distributing a Python Library

by Darshita Chaturvedi, May 24th, 2022

How to use new packaging standards with virtual environment tools — adapted from the official documentation of python.org and Pipenv.

Introduction

When my team and I started developing and shipping new Python libraries, we ran into two roadblocks:

  1. The packaging standards we knew had been deprecated. 
  2. We could not find any good resources to use these new packaging standards in a virtual environment.

Since we were developing our Python library in a virtual environment (specifically, Pipenv), we had to figure out a way to use these new packaging standards with the virtual environment tool. So, we decided to build our own approach by reading the official documentation of python.org and Pipenv. 

But first, let us briefly look at the last two packaging standards.

How packaging standards have evolved

Generate requirements using the pip freeze command

The earlier standard was to use the pip freeze command to generate the requirements.txt file. This file would then be read in the setup.py file while packaging and distributing the library.
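
To make that workflow concrete, here is a minimal sketch of the older pattern. It is illustrative only: the package name and the way requirements.txt is parsed are assumptions, not code from any particular project.

# setup.py, following the older (now-deprecated) pattern.
# Assumes requirements.txt was generated earlier with `pip freeze > requirements.txt`.
from pathlib import Path

import setuptools

# Read every non-empty, non-comment line as a pinned requirement.
requirements = [
    line.strip()
    for line in Path("requirements.txt").read_text().splitlines()
    if line.strip() and not line.startswith("#")
]

setuptools.setup(
    name="example-library",  # placeholder name
    version="0.1.0",
    packages=setuptools.find_packages(),
    install_requires=requirements,
)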

This created two key problems:

  1. The pip freeze command would copy all the libraries installed on your system into the requirements.txt file, while you only needed the ones used in this project. More exhaustive means better packaging, right? Sorry, but the answer is no: you want your library to have minimal requirements so that it does not create clashes during installation.
  2. You may say that you’ll manually edit this file and remove the libraries that you know you are not using. That will work, right? Sorry, but the answer is no again: you do not know what the dependencies of those libraries are, so by manually editing the requirements.txt file you may very well be deleting needed dependencies.

Generate requirements in a virtual environment

Virtual environments allowed us to move away from the pip freeze command. For example, in Pipenv, you would use the pipenv lock -r > requirements.txt command to generate the requirements.txt file.

How was this helpful? Didn’t we just swap out a command?

True, but it solved a key pain point of the earlier standard: now our requirements file only contained the libraries we were actually using in this project, not everything installed on the system.

But this standard still had one problem: there was always a chance of misconfiguration while setting up the CI/CD pipeline. Hence, errors like ending up with an outdated requirements.txt file were quite common.

If you are interested in learning more about what has been deprecated and what the new alternatives are, you can check this blog.

New packaging standard and virtual environment tools

The new packaging standard is not directly compatible with the commonly used virtual environment tools. For example, according to the new standard, we should run the build package via python -m build instead of running python setup.py sdist.

Running python -m build creates a virtual environment, installs dependencies, and so on. However, it fails to install dependencies from the Pipfile if you are using a virtual environment tool such as Pipenv.

Since we were developing our Python library in a virtual environment (specifically, Pipenv), we had to figure out our own way to use these new packaging standards with the virtual environment tool. The rest of this post walks through that approach.

Development 

Use a virtual environment like Pipenv while developing the Python library. A good guide on Pipenv can be found here.

Packaging a Python library in a virtual environment

A good place to start is to first read through Python.org’s tutorial on packaging a simple Python project. We will adapt these steps to use them with virtual environment tools.

Step 1: Creating the package files
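
As a rough reference, a minimal project created by following that tutorial looks something like the layout below. The names packaging_tutorial and example_package are placeholders; we also keep a setup.py because we read the Pipfile there in a later step.

packaging_tutorial/
├── LICENSE
├── README.md
├── pyproject.toml
├── setup.py
├── src/
│   └── example_package/
│       └── __init__.py
└── tests/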

Step 2: Creating pyproject.toml
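
A minimal pyproject.toml simply declares the build backend. The version pins below are illustrative; adjust them to your setup.

# pyproject.toml
[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"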

Step 3: Configuring metadata

We can now programmatically read the requirements from the Pipfile instead of generating a requirements.txt file from it.

Below is an example of how we can do it, an excerpt from the setup.py file in Streamlit’s GitHub repository.

import sys
import setuptools

try:
    from pipenv.project import Project

    try:
        # Pipenv 2022.4.8 and newer
        from pipenv.utils.dependencies import convert_deps_to_pip
    except ImportError:
        # Older Pipenv versions
        from pipenv.utils import convert_deps_to_pip
except ImportError:
    err_msg = "Please install pipenv and try again."
    sys.exit(err_msg)

# Read the [packages] section of the Pipfile and convert it into
# pip-style requirement strings.
pipfile = Project().parsed_pipfile
packages = pipfile["packages"].copy()
requirements = convert_deps_to_pip(packages, r=False)

We can use these requirements while calling the setuptools.setup function in the setup.py file.

setuptools.setup(
    ...
    install_requires=requirements,
    ...
)

Step 4: Generating distribution archives

Note that running PyPI’s recommended python -m build command creates a new virtual environment. However, we want the build package to use the virtual environment created by pipenv instead of creating its own.

What if we simply add a flag, as in python -m build -n, that tells it not to create its own virtual environment?

Unfortunately, this still fails. 

Note that the earlier setup.py flow had two steps: sdist and wheel.

If we add the -n flag to python -m build, the code we wrote in the previous step to generate requirements dynamically passes in the sdist step (which generates the .tar.gz file) but fails while creating the wheel. You will get a NonExistentKey error.

This might get fixed in a future version of build. Until then, you need to run the sdist and wheel steps separately (a combined sketch follows the list):

  1. For the sdist step, run python -m build -n -s
  2. For the wheel step, run python -m build -n -w
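
For reference, a typical sequence from inside the Pipenv environment might look like the following. This assumes build is installed in that environment; the exact artifact names depend on your package name and version.

pipenv run python -m build -n -s   # sdist step: writes dist/<name>-<version>.tar.gz
pipenv run python -m build -n -w   # wheel step: writes dist/<name>-<version>-py3-none-any.whl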

Step 5: Uploading the distribution archives

1. Create accounts on PyPI and TestPyPI.

2. Create a .pypirc file in your $HOME directory.
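
A typical token-based .pypirc looks roughly like the following; the token values are placeholders that you generate on each site:

[distutils]
index-servers =
    pypi
    testpypi

[pypi]
username = __token__
password = <your PyPI API token>

[testpypi]
repository = https://test.pypi.org/legacy/
username = __token__
password = <your TestPyPI API token>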

3. Install twine with pipenv install twine --dev.

4. Upload your package to TestPyPI. From inside the virtual environment, run:

python -m twine upload --repository testpypi dist/*

5. Install your package from TestPyPI:

pipenv install -i https://test.pypi.org/simple/ <your-package-name>

Closing thoughts

In this post, we learned the following things about packaging in Python:

  1. How packaging standards have evolved.
  2. How to adopt PyPI’s recommended guidelines to package a Python library in a virtual environment.
