|
am: e398bc2bf7 am: a1ccc8dd9e am: f26ff44a64 am: 3e21f23d94
Original change: https://android-review.googlesource.com/c/platform/external/bazelbuild-rules_python/+/2673976
Change-Id: Ib33cbb9758fb3c0287c21cf502885fec2f072343
Signed-off-by: Automerger Merge Worker <android-build-automerger-merge-worker@system.gserviceaccount.com>
|
|
am: e398bc2bf7 am: a1ccc8dd9e am: f26ff44a64
Original change: https://android-review.googlesource.com/c/platform/external/bazelbuild-rules_python/+/2673976
Change-Id: I9b28216e47bf0a52569385355e1061dc4e8a0d57
Signed-off-by: Automerger Merge Worker <android-build-automerger-merge-worker@system.gserviceaccount.com>
|
|
am: e398bc2bf7 am: a1ccc8dd9e
Original change: https://android-review.googlesource.com/c/platform/external/bazelbuild-rules_python/+/2673976
Change-Id: Iaf7f65c33c11c5254bbf31de5fd1b458ea15b10e
Signed-off-by: Automerger Merge Worker <android-build-automerger-merge-worker@system.gserviceaccount.com>
|
|
am: e398bc2bf7
Original change: https://android-review.googlesource.com/c/platform/external/bazelbuild-rules_python/+/2673976
Change-Id: I3a623e11ef6f52094e567b9d7a933e43cdbdd48d
Signed-off-by: Automerger Merge Worker <android-build-automerger-merge-worker@system.gserviceaccount.com>
|
|
Original change: https://android-review.googlesource.com/c/platform/external/bazelbuild-rules_python/+/2673976
Change-Id: Iab88ee925f9b83f6e71af634abb439ffd9242290
Signed-off-by: Automerger Merge Worker <android-build-automerger-merge-worker@system.gserviceaccount.com>
|
|
Add three more files
- METADATA
- MODULE_LICENSE_APACHE2
- OWNERS
Test: N/A
Bug: 290219820
Change-Id: Ic4dccf130d29e6963332b212d8aa071b24629f58
|
|
|
|
Before this PR there were at least a few places where we would convert an `X.Y.Z` version string to a shortened `X_Y` or `XY` segment for use in repository rule labels. This PR adds a small utility function that helps keep things consistent.
Work towards #1262, split from #1294.
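A minimal sketch of what such a helper might look like (the name and signature here are illustrative assumptions, not the actual API):

```python
def version_label(version, sep="_"):
    """Convert an "X.Y.Z" version string to a short "X_Y"-style segment."""
    major, _, rest = version.partition(".")
    minor, _, _ = rest.partition(".")
    return "{}{}{}".format(major, sep, minor)

print(version_label("3.11.4"))          # 3_11
print(version_label("3.11.4", sep=""))  # 311
```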
|
|
Before this PR there were at least two places where such a helper function existed, which made it very easy to create yet another copy. This PR introduces a hardened version that follows the naming conventions from upstream PyPI, and tests have been added.
Split from #1294, work towards #1262.
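A sketch of PEP 503-style name normalization (using "_" as the separator here is an assumption, since Bazel repo names cannot contain "-"; upstream PyPI normalizes to "-"):

```python
import re

def normalize_pypi_name(name):
    # PEP 503 normalization: collapse runs of "-", "_", and "." into a
    # single separator and lowercase the result.
    return re.sub(r"[-_.]+", "_", name).lower()

print(normalize_pypi_name("Flask-SQLAlchemy"))  # flask_sqlalchemy
```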
|
|
This fixes a bug where the version-aware rules required `main` to always be explicitly specified. This was necessary because the main file is named after the outer target (e.g. "foo"), but usage of the main file is done by the inner target ("_foo"). The net effect is that the inner target looks for "_foo.py", while only "foo.py" is in srcs.
To fix, the wrappers set `main`, if it isn't already set, to their name + ".py".
Work towards #1262
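The fix described above can be sketched roughly as follows (names are illustrative, not the actual implementation):

```starlark
def py_binary(name, main = None, **kwargs):
    # Default `main` to the outer name, so the inner "_" + name target
    # still finds "<name>.py" in srcs instead of looking for "_<name>.py".
    _py_binary(
        name = "_" + name,
        main = main if main != None else name + ".py",
        **kwargs
    )
```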
|
|
(#1319)
This just adds a no-op comment to defs.bzl to make patching its load statements easier. Rather than looking for the last load (or the conveniently located "# Export ..." comment), just have an explicit comment for it.
|
|
Upcoming Bazel versions enforce testonly-ness through toolchain lookup,
so when `:current_py_cc_headers` depends on (via toolchain lookup) a
`py_cc_toolchain(testonly=True)` target, an error occurs.
To fix, just remove testonly=True from the toolchain implementation.
Fixes #1324
|
|
The project name was misspelled as "rule_python"; corrected to
"rules_python"
|
|
Also add a note that bzlmod support is still beta.
|
|
* Use public APIs for DictSubject and StrSubject.
* Use rules_testing's StructSubject instead of our own.
Work towards #1297
|
|
This uses the 0.4.0 release of rules_python, which has several features
we can make use of
* Various internal APIs have been made public
* target_compatible_with can be set to skip tests by platform
* Unit tests are easier to write
Also adds rules_license 0.0.7, which is a dependency of rules_testing.
Work towards #1297
|
|
Minor change.
I am creating a Python virtualenv with Bazel, and having
`all_data_requirements` for data dependencies just like
`all_requirements` would be very helpful.
|
|
toolchain and interpreter (#1303)
This commit defaults the `pip.parse` `python_version` attribute to the default version of Python, as configured by the `python.toolchain` extension. This allows a user to use the Python version set by rules_python or the root module. This also makes setting the attribute optional (as it has a default), and we automatically select the interpreter.
Closes #1267
|
|
(#1304)
I think this name is more informative for a public API. The functionality it exposes is rules/macros that use a specific Python version; these aren't really aliases.
This commit renames "python_aliases" to "python_versions". This isn't technically a breaking change because bzlmod support is still beta, but we'll flag it as such just in case.
BREAKING CHANGE:
* The `python_aliases` repo is renamed to `python_versions`. You will need to either update references from `@python_aliases` to `@python_versions`, or use repo-remapping to alias the old name (`use_repo(python, python_aliases="python_versions")`).
Closes #1273
|
|
Various parts of the codebase detect whether bzlmod is enabled or not. Most of them copy/paste the same `"@" in Label(...)` trick and use a comment to explain what they're doing. Rather than copy/paste that everywhere, this commit defines a constant that does this once and reuses its value to determine if bzlmod is enabled.
Closes: https://github.com/bazelbuild/rules_python/issues/1295
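A sketch of the kind of constant described (the exact label used and the detection details are assumptions, not the actual code):

```starlark
# Under bzlmod, labels stringify with canonical repo names (e.g.
# "@@rules_python~..."), so the apparent repo-name prefix differs from
# the non-bzlmod form. Compute this once and reuse it everywhere.
BZLMOD_ENABLED = str(Label("//:BUILD.bazel")).startswith("@@")
```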
|
|
Add the new parameter `generate_hashes` (default True) to
`compile_pip_requirements()`, letting the user control whether to put
`--hash` entries in the requirements lock file generated. In particular
if the generated file is supposed to be used as a constraints file the
hashes don't make much sense.
Fixes bazelbuild/rules_python#894.
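Usage might look like the following sketch (the `requirements_in`/`requirements_txt` attribute names are assumptions from typical rules_python usage; only `generate_hashes` is from this commit):

```starlark
load("@rules_python//python:pip.bzl", "compile_pip_requirements")

compile_pip_requirements(
    name = "requirements",
    requirements_in = "requirements.in",
    requirements_txt = "requirements_lock.txt",
    # Omit --hash entries, e.g. when the output is used as a constraints file.
    generate_hashes = False,
)
```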
|
|
(#1301)
For certain workflows it is useful to calculate the integrity hash of the manifest file based on a number of requirements files. Requirements locking is usually done by executing a script on each platform, so it is generally useful for the gazelle manifest generator to be aware that more than one requirements file may affect the outcome (e.g. the wheels that get passed to the modules map may come from a multi_pip_parse rule).
This change modifies the generation macro to concatenate the requirements files into one before passing it to the manifest generator.
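The concatenate-then-hash idea can be sketched as follows (illustrative Python, not the actual generator code):

```python
import hashlib

def manifest_integrity(requirements_paths):
    # Hash the concatenation of all requirements files, so a change to any
    # platform-specific lock file invalidates the manifest. Sorting makes
    # the digest independent of argument order.
    digest = hashlib.sha256()
    for path in sorted(requirements_paths):
        with open(path, "rb") as f:
            digest.update(f.read())
    return digest.hexdigest()
```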
|
|
This commit implements a bzlmod extension that allows users to create "annotations" for wheel builds. The wheel_builder.py accepts a JSON file via a parameter called annotations; this extension creates those JSON files. The pip extension accepts a Label -> String dict argument of the JSON files.
The feature is renamed to `whl_mods` because the JSON files are handled differently and the name "annotations" is uninformative: it modifies the creation of the BUILD files and their content, and is much more than just adding some notes about a whl.
The whl_mod extension wheel names and the wheel names in pip must match.
Closes: https://github.com/bazelbuild/rules_python/issues/1213
|
|
This allows getting a build's `cc_library` of Python headers through
toolchain resolution instead of having to use the underlying toolchain's
repository `:python_headers` target directly.
Without this feature, it's not possible to reliably and correctly get
the C information about the runtime a build is going to use. Existing
solutions require carefully setting up repo names, external state,
and/or using specific build rules. In comparison, with this feature,
consumers are able to simply ask for the current headers via a helper
target or manually lookup the toolchain and pull the relevant
information; toolchain resolution handles finding the correct headers.
The basic way this works is by registering a second toolchain to carry
C/C++ related information; as such, it is named `py_cc_toolchain`. The
py cc toolchain has the same constraint settings as the regular py
toolchain; an expected invariant is that there is a 1:1 correspondence
between the two. This base functionality allows a consuming rule
implementation to use toolchain resolution to find the Python C
toolchain information.
Usually what downstream consumers need are the headers to feed into
another `cc_library` (or equivalent) target, so, rather than have every
project re-implement the same "lookup and forward cc_library info"
logic,
this is provided by the `//python/cc:current_py_cc_headers` target.
Targets that need the headers can then depend on that target as if it
was a `cc_library` target.
Work towards https://github.com/bazelbuild/rules_python/issues/824
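For example, a consumer might depend on the helper target like this (target names from the commit text; the consuming library is hypothetical):

```starlark
cc_library(
    name = "my_ext",
    srcs = ["my_ext.c"],
    # Resolved via the py_cc_toolchain: no direct reference to a specific
    # interpreter repo's :python_headers target is needed.
    deps = ["@rules_python//python/cc:current_py_cc_headers"],
)
```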
|
|
Currently, the Python toolchain `:files` and `:py3_runtime` include some unnecessary files. This is because these globs result in the literal string `lib/libpython{python_version}.so` etc., which does not match anything; the formatting needs to be applied here since it will not be applied later.
I believe this was introduced by a47c6cd681b34b1ad990ed40dcc01ab5f024406a.
|
|
Before this PR the `coverage_tool` automatically registered by `rules_python` was visible outside the toolchain repository. This fixes it to be consistent with non-bzlmod setups and ensures that the default `coverage_tool` is not visible outside the toolchain repos.
This means that the `MODULE.bazel` file can be cleaned up, at the expense of relaxing the `coverage_tool` attribute of `python_repository` to a simple string, since the label would be evaluated within the context of `rules_python`, which may not necessarily resolve correctly without the `use_repo` statement in our `MODULE.bazel`.
|
|
Gazelle v0.30.0 introduced a lifecycle manager. We can use it to start and shut down the parser and stdmodule processes, so we no longer need an `init` function or to create a `context.WithTimeout`.
BREAKING CHANGES:
This requires users of this Gazelle extension to upgrade to Gazelle v0.30.0 or above.
|
|
--incompatible_enable_cc_toolchain_resolution (#1281)
The analysis tests transition to different platforms to test some platform-specific logic. When cc toolchain resolution is enabled, this also requires that a more complete toolchain be defined and available.
|
|
Upgrading to the latest version of gazelle and rules_go. This should
address `--incompatible_config_setting_private_default_visibility` flag.
|
|
* Corrects some typos in docs
* Expands the user-facing documentation
* Fixes errors having newlines in the middle of them
* Renames some internal functions to be more self-describing.
|
|
plugin_output was wrong in cases where multiple repositories and/or _virtual_imports are involved.
The code is taken from `cc_proto_library` and has been verified in practice.
|
|
This adds some basic analysis tests of py_wheel to cover its recently introduced functionality.
|
|
`Project-URL` is a field available in core metadata since version 1.2, which allows specifying additional URLs, displayed as project links on the PyPI package web page.
https://packaging.python.org/en/latest/specifications/core-metadata/#project-url-multiple-use
This change adds support for specifying them.
|
|
Summary in METADATA (#1274)
`py_wheel` allows supplying a description file, but it does not allow specifying the content type of that file. By default, the type is "text/x-rst", but many packages use a markdown description.
https://packaging.python.org/en/latest/specifications/core-metadata/#description-content-type
This change adds that support.
---------
Co-authored-by: Richard Levasseur <richardlev@gmail.com>
|
|
Also fix missing runfiles on the py_wheel.dist target.
Fixes #1130
Fixes #1270
---------
Co-authored-by: Richard Levasseur <rlevasseur@google.com>
|
|
versions (#1254)
Implements the capability to call the pip bzlmod extension multiple times: the pip extension can now be called with the same hub name multiple times. This allows a user to have multiple different requirements files based on the Python version.
With workspace, we need incompatible aliases because you get @pip//tabulate or @pip_tabulate//:pkg. The requirement() macro hides this, but changing the flag becomes a breaking change if you've manually referenced things. With bzlmod, though, the @pip_tabulate style isn't a realistic option (you'd have to use_repo everything, which is a huge pain). So we have chosen `@pip//tabulate`.
This commit also implements the capability for pip.parse to determine the Python version programmatically from the provided interpreter. See
https://github.com/bazelbuild/rules_python/blob/76aab0f91bd614ee1b2ade310baf28c38695c522/python/extensions/pip.bzl#L88
for more in-depth implementation details.
INTERFACE CHANGE:
- The pip.parse argument python_version is introduced but not required. When possible, it is inferred.
BREAKING CHANGE:
* `pip.parse()` extension: the `name` attribute is renamed `hub_name`. This reflects that the user-provided name isn't unique to each `pip.parse()` call: we now have a hub that is created, and each pip.parse call creates a spoke.
* `pip.parse()` extension: the `incompatible_generate_aliases` arg is removed; the behavior is now always as if it were True.
* `pip.parse()` calls are collected under the same `hub_name` to support multiple Python versions under the same resulting repo name (the hub name).
Closes #1229
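Based on the description above, a MODULE.bazel sketch might look like this (the lock-file labels are placeholders; `requirements_lock` is assumed from existing pip.parse usage):

```starlark
pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.parse(
    hub_name = "pip",
    python_version = "3.9",
    requirements_lock = "//:requirements_lock_3_9.txt",
)
pip.parse(
    hub_name = "pip",
    python_version = "3.10",
    requirements_lock = "//:requirements_lock_3_10.txt",
)
use_repo(pip, "pip")
```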
|
|
When using Windows you cannot run the symlink directly; because of how symlinks work on Windows, we need to resolve the link with realpath. Unlike on Linux and OSX, we cannot execute the Python symlink directly: Windows throws error "-1073741515" when we execute the symlink directly. This error means that the Python binary cannot find its dlls, because the symlink that we create is not in the same folder as the dlls.
---------
Co-authored-by: Richard Levasseur <richardlev@gmail.com>
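A minimal Python sketch of the resolve-then-execute approach (the function name is illustrative):

```python
import os
import subprocess
import sys

def run_python(link_path, *args):
    # Resolve the symlink first: on Windows, executing the link directly
    # fails with error -1073741515 because the real python.exe's DLLs are
    # not next to the link. Resolving is harmless on Linux/macOS.
    real = os.path.realpath(link_path)
    return subprocess.call([real, *args])

run_python(sys.executable, "-c", "print('ok')")
```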
|
|
We ignore pyc files most everywhere (because they aren't deterministic),
but part of the pyc creation process involves creating temporary files
named `*.pyc.NNN`. Though these are supposed to be temporary files
nobody sees, they seem to get picked up by a glob somewhere, somehow.
I'm unable to figure out how that is happening, but ignoring them in the
glob expressions should also suffice.
Fixes #1261
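A sketch of the kind of glob exclusion described (the filegroup itself is hypothetical):

```starlark
filegroup(
    name = "python_files",
    srcs = glob(
        ["**"],
        # *.pyc files are non-deterministic; *.pyc.NNN files are the
        # temporary files created while pycs are being written.
        exclude = [
            "**/*.pyc",
            "**/*.pyc.*",
        ],
    ),
)
```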
|
|
The bzlmod-compatible build_file_generation example was moved to the
bzlmod_build_file_generation example.
This should fix the automatic build/release of the gazelle BCR module.
|
|
This makes rules_python always provide a default toolchain when using
bzlmod.
Note that, unlike workspace builds, the default is not the local
system Python (`@bazel_tools//tools/python:autodetecting_toolchain`).
Instead, the default is a hermetic runtime, but no guarantees are made
about the particular version used. In practice, it will be the latest
available Python version.
Work towards #1233
|
|
This allows the `distribution` attribute to expand workspace status keys, just as the `version` attribute can.
This allows, for example, the VCS branch name (e.g. test, release, etc.) to be part of the distribution name without having to modify the BUILD file. Having distinct distribution names is necessary because tools like pip-compile determine version compatibility and replacements, and having the same distribution name would mean the different builds are seen as equivalent.
Closes #1250
|
|
[`wheel_installer` assumes that if the pip command succeeded, there must
be at least one `*.whl`
file](https://github.com/lpulley/rules_python/blob/fdec44120aa45d748ab804f1d019002c6949b449/python/pip_install/tools/wheel_installer/wheel_installer.py#L439),
but this is not currently true when `download_only` is enabled and the
package provides no wheel; a `.tar.gz` will happily be downloaded, pip
will succeed, and the `next(iter(glob.glob("*.whl")))` call will fail,
producing a mysterious `StopIteration`:
```
Saved ./artifactory-0.1.17.tar.gz
Successfully downloaded artifactory
Traceback (most recent call last):
File "[redacted]/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "[redacted]/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "[redacted]/external/rules_python/python/pip_install/tools/wheel_installer/wheel_installer.py", line 450, in <module>
main()
File "[redacted]/external/rules_python/python/pip_install/tools/wheel_installer/wheel_installer.py", line 438, in main
whl = next(iter(glob.glob("*.whl")))
StopIteration
```
By using `--only-binary=:all:` when using `pip download`, the pip
command will fail if there is no suitable wheel to be downloaded. This
should make the error much more obvious, since with
`--only-binary=:all:` and no suitable wheel, pip fails and reports an
error like this:
```
ERROR: Could not find a version that satisfies the requirement artifactory (from versions: none)
ERROR: No matching distribution found for artifactory
```
|
|
(#1123)
Currently dependency_resolver.py ignores that you pass requirement lock files for different OSes, except when checking whether the golden file needs updating. This causes dependency_resolver.py to update the wrong lock file, i.e. the non-platform-specific one, if run in "update mode".
|
|
The main reason this is removed is that if modules choose different
names for the same toolchain, only one of the two toolchains (which are,
hopefully, identical) will be used. Which toolchain is used depends on
the module graph dependency ordering.
Furthermore, as of #1238, only one repo per version is created; others
are ignored. This means a module doesn't know if the name it chooses
will result in a repo being created with that name.
Instead, the toolchain repos are named by rules_python:
`python_{major}_{minor}`. These repo names are currently part of the
public API, since they end up referenced in MODULE config (to wire the
toolchain interpreter to pip).
BREAKING CHANGES
This removes the `name` arg from `python.toolchain()`. Users will need
to remove such usages from their `MODULE.bazel` and update their
`use_repo()` statements. If keeping the custom repo name is necessary,
then repo mappings can be used. See #1232 for additional migration
steps, commands, and information.
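Under the new naming scheme, MODULE.bazel usage might look like this sketch (the extension path is assumed from other commits in this log):

```starlark
python = use_extension("@rules_python//python/extensions:python.bzl", "python")
python.toolchain(python_version = "3.11")  # no `name` arg anymore

# The repo name is chosen by rules_python: python_{major}_{minor}.
use_repo(python, "python_3_11")
```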
|
|
python-build-standalone (#1234)
This is being added in order to once again be able to build envoyproxy
on the `ppc64le` architecture.
Little Endian Power support was added to release
https://github.com/indygreg/python-build-standalone/releases/tag/20230507.
Signed-off-by: Christy Norman <christy@linux.vnet.ibm.com>
|
|
Fixes #1196.
Currently the `coverage.py` module does not work if updated to the latest version, failing with the following error:
```
ImportError: cannot import name 'MappingProxyType' from partially initialized module 'types' (most likely due to a circular import)
...~pypi__coverage_cp39_x86_64-unknown-linux-gnu/coverage/types.py)
```
But `MappingProxyType` actually exists in Python's standard library. To fix, modify sys.path before the first import of coverage.
Summary:
- chore(coverage): bump coverage.py to 7.2.7
- fix(coverage): patch sys.path before importing anything from coverage
- test(coverage): add extra assertions about the module names
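A sketch of the sys.path patching idea, assuming the problem is that the coverage install location shadows stdlib modules (like `types`) when it appears early on sys.path:

```python
import sys

def demote_coverage_dir(coverage_dir):
    # Move the coverage install location to the end of sys.path before the
    # first `import coverage`, so stdlib modules such as `types` are not
    # shadowed by coverage's own modules during import.
    while coverage_dir in sys.path:
        sys.path.remove(coverage_dir)
    sys.path.append(coverage_dir)
```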
|
|
|
|
Currently users need to run the script manually; this PR adds a pre-commit hook, which should facilitate the maintenance of the deleted packages within the rules_python `.bazelrc`.
|
|
This fixes an issue where the last submodule, instead of the first, was
given precedence when creating versioned toolchains. To fix, when
creating the map of versioned toolchains, if a version is already
defined, then a subsequent usage is ignored. A warning is emitted when
this occurs.
This also fixes a similar problem that can occur to the root module. If
the root module uses a particular version marked as the default, and is
using the versioned rules, and a submodule also uses that same version,
then the submodule's toolchain would be used. This happened because the
root module's toolchain would be moved last, so its versioned rules
would match the submodule's versioned toolchain.
This also does some cleanup and refactoring to:
* Compute the toolchains in one loop iteration
* Give more informative error messages
* Reject duplicate names within a module, even for the non-root module.
* Reject duplicate versions within a module.
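The first-registration-wins behavior described above can be sketched as follows (plain Python with illustrative names, not the actual extension code):

```python
def build_version_map(registrations):
    # First registration of a version wins; later duplicates are ignored
    # with a warning, so earlier modules (the root module comes first)
    # take precedence over submodules.
    toolchains = {}
    for module_name, version in registrations:
        if version in toolchains:
            print("WARNING: ignoring duplicate toolchain for Python %s from %s"
                  % (version, module_name))
            continue
        toolchains[version] = module_name
    return toolchains

print(build_version_map([("root", "3.11"), ("dep", "3.11"), ("dep", "3.10")]))
```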
|
|
(#1246)
The generated toolchain BUILD file is confusing to read because it relies on a ternary expression in the BUILD file to set the `target_settings` attribute. This makes debugging harder because, upon first look, all the toolchains appear to have the version constraint set. It's only upon closer inspection that you can see the difference of "False" vs "True" embedded in the middle of a line amongst other similar-looking lines.
Also:
* Adds a bit of validation logic for the `set_python_version_constraint` argument, because it's conceptually a boolean but is passed as a string, so it is prone to having an incorrect value passed.
* Documents the `set_python_version_constraint` arg, since it accepts only a particular range of values.
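The validation described in the first bullet might look like this sketch (the function name is illustrative):

```python
def parse_version_constraint_flag(value):
    # set_python_version_constraint is conceptually a boolean but is passed
    # through the repository rule as a string, so validate it strictly and
    # reject anything other than the exact strings "True" and "False".
    if value not in ("True", "False"):
        raise ValueError(
            "set_python_version_constraint must be 'True' or 'False', got %r"
            % value)
    return value == "True"
```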
|