#!/usr/bin/env python
# pip installs packages using pinned package versions. If CERTBOT_OLDEST is set
# to 1, tools/oldest_constraints.txt is used, otherwise, tools/requirements.txt
# is used.
from __future__ import absolute_import
from __future__ import print_function
import os
import subprocess
import sys
import tempfile
import readlink


def find_tools_path():
    """Find the tools directory based on this script's resolved path."""
    return os.path.dirname(readlink.main(__file__))


def call_with_print(command, env=None):
    """Print a shell command, then execute it, raising on a nonzero exit code."""
    if not env:
        env = os.environ
    print(command)
    subprocess.check_call(command, shell=True, env=env)


def pip_install_with_print(args_str, env=None):
    """Run "pip install" with the given arguments, printing the command first."""
    if not env:
        env = os.environ
    # sys.executable is quoted so that interpreter paths containing spaces
    # (common on Windows) survive shell=True.
    command = ['"', sys.executable, '" -m pip install --disable-pip-version-check ', args_str]
    call_with_print(''.join(command), env=env)


def main(args):
    tools_path = find_tools_path()
    with tempfile.TemporaryDirectory() as working_dir:
        if os.environ.get('CERTBOT_NO_PIN') == '1':
            # With unpinned dependencies, there is no constraint
            pip_install_with_print(' '.join(args))
        else:
            # Otherwise, we pick the constraints file based on the environment
            # variable CERTBOT_OLDEST.
            repo_path = os.path.dirname(tools_path)
            if os.environ.get('CERTBOT_OLDEST') == '1':
                constraints_path = os.path.normpath(os.path.join(
                    repo_path, 'tools', 'oldest_constraints.txt'))
            else:
                constraints_path = os.path.normpath(os.path.join(
                    repo_path, 'tools', 'requirements.txt'))

            # The PIP_CONSTRAINT environment variable is used instead of the
            # --constraint flag so that the pinning also reaches the pip
            # subprocess that pip spawns to install build dependencies under
            # PEP 517 build isolation.
            env = os.environ.copy()
            env["PIP_CONSTRAINT"] = constraints_path

            pip_install_with_print(' '.join(args), env=env)


if __name__ == '__main__':
    main(sys.argv[1:])