# Authentication

## Git authentication

uv allows packages to be installed from Git and supports the following schemes for authenticating with private repositories.

Using SSH:

- `git+ssh://git@<hostname>/...` (e.g., `git+ssh://git@github.com/astral-sh/uv`)
- `git+ssh://git@<host>/...` (e.g., `git+ssh://git@github.com-key-2/astral-sh/uv`)

See the [GitHub SSH documentation](https://docs.github.com/en/authentication/connecting-to-github-with-ssh/about-ssh) for more details on how to configure SSH.

Using a password or token:

- `git+https://<user>:<token>@<hostname>/...` (e.g., `git+https://git:github_pat_asdf@github.com/astral-sh/uv`)
- `git+https://<token>@<hostname>/...` (e.g., `git+https://github_pat_asdf@github.com/astral-sh/uv`)
- `git+https://<user>@<hostname>/...` (e.g., `git+https://git@github.com/astral-sh/uv`)

When using a GitHub personal access token, the username is arbitrary. GitHub doesn't allow you to use your account name and password in URLs like this, although other hosts may.

If there are no credentials present in the URL and authentication is needed, the [Git credential helper](#git-credential-helpers) will be queried.

!!! important

    When using `uv add`, uv _will not_ persist Git credentials to the `pyproject.toml` or `uv.lock`. These files are often included in source control and distributions, so it is generally unsafe to include credentials in them.

    If you have a Git credential helper configured, your credentials may be automatically persisted, resulting in successful subsequent fetches of the dependency. However, if you do not have a Git credential helper or the project is used on a machine without credentials seeded, uv will fail to fetch the dependency.

    You _may_ force uv to persist Git credentials by passing the `--raw` option to `uv add`. However, we strongly recommend setting up a [credential helper](#git-credential-helpers) instead.

### Git credential helpers

Git credential helpers are used to store and retrieve Git credentials.
See the [Git documentation](https://git-scm.com/doc/credential-helpers) to learn more.

If you're using GitHub, the simplest way to set up a credential helper is to [install the `gh` CLI](https://github.com/cli/cli#installation) and use:

```console
$ gh auth login
```

See the [`gh auth login`](https://cli.github.com/manual/gh_auth_login) documentation for more details.

!!! note

    When using `gh auth login` interactively, the credential helper will be configured automatically. But when using `gh auth login --with-token`, as in the uv [GitHub Actions guide](../guides/integration/github.md#private-repos), the [`gh auth setup-git`](https://cli.github.com/manual/gh_auth_setup-git) command will need to be run afterwards to configure the credential helper.

## HTTP authentication

uv supports credentials over HTTP when querying package registries.

Authentication can come from the following sources, in order of precedence:

- The URL, e.g., `https://<user>:<password>@<hostname>/...`
- A [`.netrc`](https://everything.curl.dev/usingcurl/netrc) configuration file
- A [keyring](https://github.com/jaraco/keyring) provider (requires opt-in)

If authentication is found for a single index URL or net location (scheme, host, and port), it will be cached for the duration of the command and used for other queries to that index or net location. Authentication is not cached across invocations of uv.

`.netrc` authentication is enabled by default, and will respect the `NETRC` environment variable if defined, falling back to `~/.netrc` if not.

To enable keyring-based authentication, pass the `--keyring-provider subprocess` command-line argument to uv, or set `UV_KEYRING_PROVIDER=subprocess`.

Authentication may be used for hosts specified in the following contexts:

- `[index]`
- `index-url`
- `extra-index-url`
- `find-links`
- `package @ https://...`

See the [index authentication documentation](./indexes.md#authentication) for details on authenticating index URLs.
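As an illustration of the `.netrc` lookup semantics, the following sketch uses Python's standard `netrc` module; the hostname and credentials are made-up examples, and uv implements this resolution internally rather than via Python:

```python
import netrc
import tempfile

# A made-up `.netrc` entry (hostname and credentials are examples):
content = "machine pypi.example.com\nlogin my-user\npassword my-token\n"
with tempfile.NamedTemporaryFile("w", suffix=".netrc", delete=False) as f:
    f.write(content)
    path = f.name

# Resolve credentials for the host, as a `.netrc`-aware client would:
auth = netrc.netrc(path)
login, _account, password = auth.authenticators("pypi.example.com")
print(login, password)  # my-user my-token
```

uv applies the same machine-based lookup when it encounters an index host with no credentials in the URL.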
See the [`pip` compatibility guide](../pip/compatibility.md#registry-authentication) for details on differences from `pip`.

!!! important

    When using `uv add`, uv _will not_ persist index credentials to the `pyproject.toml` or `uv.lock`. These files are often included in source control and distributions, so it is generally unsafe to include credentials in them.

    However, uv _will_ persist credentials for direct URLs, i.e., `package @ https://username:password@example.com/foo.whl`, as there is not currently a way to otherwise provide those credentials.

    If credentials were attached to an index URL during `uv add`, uv may fail to fetch dependencies from indexes which require authentication on subsequent operations. See the [index authentication documentation](./indexes.md#authentication) for details on persistent authentication for indexes.

## Authentication with alternative package indexes

See the [alternative indexes integration guide](../guides/integration/alternative-indexes.md) for details on authentication with popular alternative Python package indexes.

## Custom CA certificates

By default, uv loads certificates from the bundled `webpki-roots` crate. The `webpki-roots` are a reliable set of trust roots from Mozilla, and including them in uv improves portability and performance (especially on macOS, where reading the system trust store incurs a significant delay).

However, in some cases, you may want to use the platform's native certificate store, especially if you're relying on a corporate trust root (e.g., for a mandatory proxy) that's included in your system's certificate store. To instruct uv to use the system's trust store, run uv with the `--native-tls` command-line flag, or set the `UV_NATIVE_TLS` environment variable to `true`.

If a direct path to the certificate is required (e.g., in CI), set the `SSL_CERT_FILE` environment variable to the path of the certificate bundle, to instruct uv to use that file instead of the system's trust store.
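For instance, either option can be exported for a single shell session or CI job; the certificate path below is an illustrative example, not a uv default:

```shell
# Opt in to the platform's native trust store:
export UV_NATIVE_TLS=true

# Or, where a direct path is needed (e.g., in CI), point at a CA bundle:
export SSL_CERT_FILE=/etc/ssl/certs/corp-ca.pem

echo "$UV_NATIVE_TLS"  # prints "true"
```

Subsequent uv invocations in the same session will pick up these variables.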
If client certificate authentication (mTLS) is desired, set the `SSL_CLIENT_CERT` environment variable to the path of the PEM formatted file containing the certificate followed by the private key.

Finally, if you're using a setup in which you want to trust a self-signed certificate or otherwise disable certificate verification, you can instruct uv to allow insecure connections to dedicated hosts via the `allow-insecure-host` configuration option. For example, adding the following to `pyproject.toml` will allow insecure connections to `example.com`:

```toml
[tool.uv]
allow-insecure-host = ["example.com"]
```

`allow-insecure-host` expects to receive a hostname (e.g., `localhost`) or hostname-port pair (e.g., `localhost:8080`), and is only applicable to HTTPS connections, as HTTP connections are inherently insecure.

Use `allow-insecure-host` with caution and only in trusted environments, as it can expose you to security risks due to the lack of certificate verification.

## Hugging Face support

uv supports automatic authentication for the Hugging Face Hub. Specifically, if the `HF_TOKEN` environment variable is set, uv will propagate it to requests to `huggingface.co`. This is particularly useful for accessing private scripts in Hugging Face Datasets.

For example, you can run the following command to execute the `main.py` script from a private dataset:

```console
$ HF_TOKEN=hf_... uv run https://huggingface.co/datasets/<user>/<dataset>/resolve/<revision>/main.py
```

You can disable automatic Hugging Face authentication by setting the `UV_NO_HF_TOKEN=1` environment variable.

# The uv build backend

A build backend transforms a source tree (i.e., a directory) into a source distribution or a wheel.

uv supports all build backends (as specified by [PEP 517](https://peps.python.org/pep-0517/)), but also provides a native build backend (`uv_build`) that integrates tightly with uv to improve performance and user experience.
## Choosing a build backend

The uv build backend is a great choice for most Python projects. It has reasonable defaults, with the goal of requiring zero configuration for most users, but provides flexible configuration to accommodate most Python project structures. It integrates tightly with uv, to improve messaging and user experience. It validates project metadata and structures, preventing common mistakes. And, finally, it's very fast.

The uv build backend currently **only supports pure Python code**. An alternative backend is required to build a [library with extension modules](../concepts/projects/init.md#projects-with-extension-modules).

!!! tip

    While the backend supports a number of options for configuring your project structure, when build scripts or a more flexible project layout are required, consider using the [hatchling](https://hatch.pypa.io/latest/config/build/#build-system) build backend instead.

## Using the uv build backend

To use uv as a build backend in an existing project, add `uv_build` to the [`[build-system]`](../concepts/projects/config.md#build-systems) section in your `pyproject.toml`:

```toml title="pyproject.toml"
[build-system]
requires = ["uv_build>=0.8.4,<0.9.0"]
build-backend = "uv_build"
```

!!! note

    The uv build backend follows the same [versioning policy](../reference/policies/versioning.md) as uv. Including an upper bound on the `uv_build` version ensures that your package continues to build correctly as new versions are released.

To create a new project that uses the uv build backend, use `uv init`:

```console
$ uv init
```

When the project is built, e.g., with [`uv build`](../guides/package.md), the uv build backend will be used to create the source distribution and wheel.

## Bundled build backend

The build backend is published as a separate package (`uv_build`) that is optimized for portability and small binary size.
However, the `uv` executable also includes a copy of the build backend, which will be used during builds performed by uv, e.g., during `uv build`, if its version is compatible with the `uv_build` requirement. If it's not compatible, a compatible version of the `uv_build` package will be used. Other build frontends, such as `python -m build`, will always use the `uv_build` package, typically choosing the latest compatible version.

## Modules

Python packages are expected to contain one or more Python modules, which are directories containing an `__init__.py`. By default, a single root module is expected at `src/<package_name>/__init__.py`.

For example, the structure for a project named `foo` would be:

```text
pyproject.toml
src
└── foo
    └── __init__.py
```

uv normalizes the package name to determine the default module name: the package name is lowercased and dots and dashes are replaced with underscores, e.g., `Foo-Bar` would be converted to `foo_bar`.

The `src/` directory is the default directory for module discovery.

These defaults can be changed with the `module-name` and `module-root` settings. For example, to use a `FOO` module in the root directory, as in the project structure:

```text
pyproject.toml
FOO
└── __init__.py
```

The correct build configuration would be:

```toml title="pyproject.toml"
[tool.uv.build-backend]
module-name = "FOO"
module-root = ""
```

## Namespace packages

Namespace packages are intended for use-cases where multiple packages write modules into a shared namespace.

Namespace package modules are identified by a `.` in the `module-name`. For example, to package the module `bar` in the shared namespace `foo`, the project structure would be:

```text
pyproject.toml
src
└── foo
    └── bar
        └── __init__.py
```

And the `module-name` configuration would be:

```toml title="pyproject.toml"
[tool.uv.build-backend]
module-name = "foo.bar"
```

!!! important

    The `__init__.py` file is not included in `foo`, since it's the shared namespace module.
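The package-name-to-module-name normalization described in the Modules section can be sketched as follows; this is an illustrative re-implementation of the documented rule, not uv's actual code:

```python
def default_module_name(package_name: str) -> str:
    """Lowercase the package name and replace `.` and `-` with `_`,
    per the normalization rule documented above (simplified sketch)."""
    return package_name.lower().replace("-", "_").replace(".", "_")

print(default_module_name("Foo-Bar"))      # foo_bar
print(default_module_name("my.package"))   # my_package
```

The resulting name is where the backend looks for the module by default, e.g., `src/foo_bar/__init__.py` for a package named `Foo-Bar`.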
It's also possible to have a complex namespace package with more than one root module, e.g., with the project structure:

```text
pyproject.toml
src
├── foo
│   └── __init__.py
└── bar
    └── __init__.py
```

While we do not recommend this structure (i.e., you should use a workspace with multiple packages instead), it is supported by setting `module-name` to a list of names:

```toml title="pyproject.toml"
[tool.uv.build-backend]
module-name = ["foo", "bar"]
```

For packages with many modules or complex namespaces, the `namespace = true` option can be used to avoid explicitly declaring each module name, e.g.:

```toml title="pyproject.toml"
[tool.uv.build-backend]
namespace = true
```

!!! warning

    Using `namespace = true` disables safety checks. Using an explicit list of module names is strongly recommended outside of legacy projects.

The `namespace` option can also be used with `module-name` to explicitly declare the root, e.g., for the project structure:

```text
pyproject.toml
src
└── foo
    ├── bar
    │   └── __init__.py
    └── baz
        └── __init__.py
```

The recommended configuration would be:

```toml title="pyproject.toml"
[tool.uv.build-backend]
module-name = "foo"
namespace = true
```

## Stub packages

The build backend also supports building type stub packages, which are identified by the `-stubs` suffix on the package or module name, e.g., `foo-stubs`. The module name for type stub packages must end in `-stubs`, so uv will not normalize the `-` to an underscore. Additionally, uv will search for a `__init__.pyi` file. For example, the project structure would be:

```text
pyproject.toml
src
└── foo-stubs
    └── __init__.pyi
```

Type stub modules are also supported for [namespace packages](#namespace-packages).

## File inclusion and exclusion

The build backend is responsible for determining which files in a source tree should be packaged into the distributions.
To determine which files to include in a source distribution, uv first adds the included files and directories, then removes the excluded files and directories. This means that exclusions always take precedence over inclusions.

By default, uv excludes `__pycache__`, `*.pyc`, and `*.pyo`.

When building a source distribution, the following files and directories are included:

- The `pyproject.toml`.
- The [module](#modules) under [`tool.uv.build-backend.module-root`](../reference/settings.md#build-backend_module-root).
- The files referenced by `project.license-files` and `project.readme`.
- All directories under [`tool.uv.build-backend.data`](../reference/settings.md#build-backend_data).
- All files matching patterns from [`tool.uv.build-backend.source-include`](../reference/settings.md#build-backend_source-include).

From these, items matching [`tool.uv.build-backend.source-exclude`](../reference/settings.md#build-backend_source-exclude) and the [default excludes](../reference/settings.md#build-backend_default-excludes) are removed.

When building a wheel, the following files and directories are included:

- The [module](#modules) under [`tool.uv.build-backend.module-root`](../reference/settings.md#build-backend_module-root).
- The files referenced by `project.license-files`, which are copied into the `.dist-info` directory.
- The `project.readme`, which is copied into the project metadata.
- All directories under [`tool.uv.build-backend.data`](../reference/settings.md#build-backend_data), which are copied into the `.data` directory.

From these, [`tool.uv.build-backend.source-exclude`](../reference/settings.md#build-backend_source-exclude), [`tool.uv.build-backend.wheel-exclude`](../reference/settings.md#build-backend_wheel-exclude), and the default excludes are removed. The source distribution excludes are also applied when building a wheel directly from the source tree, so that such a build does not include more files than a build that goes through a source distribution first. There are no wheel-specific includes.
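As a sketch, a project that ships its test suite in the source distribution, drops its fixtures, and keeps images out of the wheel might combine these settings as follows (the patterns are illustrative examples, not defaults):

```toml title="pyproject.toml"
[tool.uv.build-backend]
# Extra files for the source distribution:
source-include = ["tests/**"]
# Removed from source distributions (and from direct wheel builds):
source-exclude = ["tests/fixtures/**"]
# Removed from wheels only:
wheel-exclude = ["*.png"]
```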
There must only be one top level module, and all data files must either be under the module root or in the appropriate [data directory](../reference/settings.md#build-backend_data). Most packages store small data in the module root alongside the source code.

### Include and exclude syntax

Includes are anchored, which means that `pyproject.toml` includes only `<project root>/pyproject.toml` and not `<project root>/bar/pyproject.toml`. To recursively include all files under a directory, use a `/**` suffix, e.g., `src/**`. Recursive inclusions are also anchored, e.g., `assets/**/sample.csv` includes all `sample.csv` files in `<project root>/assets` or any of its children.

!!! note

    For performance and reproducibility, avoid patterns without an anchor such as `**/sample.csv`.

Excludes are not anchored, which means that `__pycache__` excludes all directories named `__pycache__` regardless of their parent directory. All children of an exclusion are excluded as well. To anchor a directory, use a `/` prefix, e.g., `/dist` will exclude only `<project root>/dist`.

All fields accepting patterns use the reduced portable glob syntax from [PEP 639](https://peps.python.org/pep-0639/#add-license-FILES-key), with the addition that characters can be escaped with a backslash.

# Caching

## Dependency caching

uv uses aggressive caching to avoid re-downloading (and re-building) dependencies that have already been accessed in prior runs.

The specifics of uv's caching semantics vary based on the nature of the dependency:

- **For registry dependencies** (like those downloaded from PyPI), uv respects HTTP caching headers.
- **For direct URL dependencies**, uv respects HTTP caching headers, and also caches based on the URL itself.
- **For Git dependencies**, uv caches based on the fully-resolved Git commit hash. As such, `uv pip compile` will pin Git dependencies to a specific commit hash when writing the resolved dependency set.
- **For local dependencies**, uv caches based on the last-modified time of the source archive (i.e., the local `.whl` or `.tar.gz` file). For directories, uv caches based on the last-modified time of the `pyproject.toml`, `setup.py`, or `setup.cfg` file.

If you're running into caching issues, uv includes a few escape hatches:

- To clear the cache entirely, run `uv cache clean`. To clear the cache for a specific package, run `uv cache clean <package>`. For example, `uv cache clean ruff` will clear the cache for the `ruff` package.
- To force uv to revalidate cached data for all dependencies, pass `--refresh` to any command (e.g., `uv sync --refresh` or `uv pip install --refresh ...`).
- To force uv to revalidate cached data for a specific dependency, pass `--refresh-package` to any command (e.g., `uv sync --refresh-package ruff` or `uv pip install --refresh-package ruff ...`).
- To force uv to ignore existing installed versions, pass `--reinstall` to any installation command (e.g., `uv sync --reinstall` or `uv pip install --reinstall ...`). (Consider running `uv cache clean <package>` first, to ensure that the cache is cleared prior to reinstallation.)

As a special case, uv will always rebuild and reinstall any local directory dependencies passed explicitly on the command-line (e.g., `uv pip install .`).

## Dynamic metadata

By default, uv will _only_ rebuild and reinstall local directory dependencies (e.g., editables) if the `pyproject.toml`, `setup.py`, or `setup.cfg` file in the directory root has changed, or if a `src` directory is added or removed. This is a heuristic and, in some cases, may lead to fewer re-installs than desired.

To incorporate additional information into the cache key for a given package, you can add cache key entries under [`tool.uv.cache-keys`](https://docs.astral.sh/uv/reference/settings/#cache-keys), which covers both file paths and Git commit hashes.
Setting [`tool.uv.cache-keys`](https://docs.astral.sh/uv/reference/settings/#cache-keys) will replace defaults, so any necessary files (like `pyproject.toml`) should still be included in the user-defined cache keys.

For example, if a project specifies dependencies in `pyproject.toml` but uses [`setuptools-scm`](https://pypi.org/project/setuptools-scm/) to manage its version, and should thus be rebuilt whenever the commit hash or dependencies change, you can add the following to the project's `pyproject.toml`:

```toml title="pyproject.toml"
[tool.uv]
cache-keys = [{ file = "pyproject.toml" }, { git = { commit = true } }]
```

If your dynamic metadata incorporates information from the set of Git tags, you can expand the cache key to include the tags:

```toml title="pyproject.toml"
[tool.uv]
cache-keys = [{ file = "pyproject.toml" }, { git = { commit = true, tags = true } }]
```

Similarly, if a project reads from a `requirements.txt` to populate its dependencies, you can add the following to the project's `pyproject.toml`:

```toml title="pyproject.toml"
[tool.uv]
cache-keys = [{ file = "pyproject.toml" }, { file = "requirements.txt" }]
```

Globs are supported for `file` keys, following the syntax of the [`glob`](https://docs.rs/glob/0.3.1/glob/struct.Pattern.html) crate. For example, to invalidate the cache whenever a `.toml` file in the project directory or any of its subdirectories is modified, use the following:

```toml title="pyproject.toml"
[tool.uv]
cache-keys = [{ file = "**/*.toml" }]
```

!!! note

    The use of globs can be expensive, as uv may need to walk the filesystem to determine whether any files have changed. This may, in turn, require traversal of large or deeply nested directories.
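To illustrate what a pattern like `**/*.toml` matches, the following sketch uses Python's `pathlib` as a stand-in for the Rust `glob` crate that uv uses; for this simple pattern the semantics agree (an assumption worth noting for more exotic patterns):

```python
import pathlib
import tempfile

# Build a throwaway project tree:
root = pathlib.Path(tempfile.mkdtemp())
(root / "pyproject.toml").touch()
(root / "config").mkdir()
(root / "config" / "extra.toml").touch()
(root / "src").mkdir()
(root / "src" / "main.py").touch()

# `**/*.toml` matches `.toml` files at any depth, including the root:
matches = sorted(p.relative_to(root).as_posix() for p in root.glob("**/*.toml"))
print(matches)  # ['config/extra.toml', 'pyproject.toml']
```

Every matched file's modification time feeds into the cache key, which is why broad patterns can be expensive.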
Similarly, if a project relies on an environment variable, you can add the following to the project's `pyproject.toml` to invalidate the cache whenever the environment variable changes:

```toml title="pyproject.toml"
[tool.uv]
cache-keys = [{ file = "pyproject.toml" }, { env = "MY_ENV_VAR" }]
```

Finally, to invalidate a project whenever a specific directory (like `src`) is created or removed, add the following to the project's `pyproject.toml`:

```toml title="pyproject.toml"
[tool.uv]
cache-keys = [{ file = "pyproject.toml" }, { dir = "src" }]
```

Note that the `dir` key will only track changes to the directory itself, and not arbitrary changes within the directory.

As an escape hatch, if a project uses `dynamic` metadata that isn't covered by `tool.uv.cache-keys`, you can instruct uv to _always_ rebuild and reinstall it by adding the project to the `tool.uv.reinstall-package` list:

```toml title="pyproject.toml"
[tool.uv]
reinstall-package = ["my-package"]
```

This will force uv to rebuild and reinstall `my-package` on every run, regardless of whether the package's `pyproject.toml`, `setup.py`, or `setup.cfg` file has changed.

## Cache safety

It's safe to run multiple uv commands concurrently, even against the same virtual environment. uv's cache is designed to be thread-safe and append-only, and thus robust to multiple concurrent readers and writers. uv applies a file-based lock to the target virtual environment when installing, to avoid concurrent modifications across processes.

Note that it's _not_ safe to modify the uv cache (e.g., `uv cache clean`) while other uv commands are running, and _never_ safe to modify the cache directly (e.g., by removing a file or directory).

## Clearing the cache

uv provides a few different mechanisms for removing entries from the cache:

- `uv cache clean` removes _all_ cache entries from the cache directory, clearing it out entirely.
- `uv cache clean ruff` removes all cache entries for the `ruff` package, useful for invalidating the cache for a single or finite set of packages.
- `uv cache prune` removes all _unused_ cache entries. For example, the cache directory may contain entries created in previous uv versions that are no longer necessary and can be safely removed. `uv cache prune` is safe to run periodically, to keep the cache directory clean.

## Caching in continuous integration

It's common to cache package installation artifacts in continuous integration environments (like GitHub Actions or GitLab CI) to speed up subsequent runs.

By default, uv caches both the wheels that it builds from source and the pre-built wheels that it downloads directly, to enable high-performance package installation.

However, in continuous integration environments, persisting pre-built wheels may be undesirable. With uv, it turns out that it's often faster to _omit_ pre-built wheels from the cache (and instead re-download them from the registry on each run). On the other hand, caching wheels that are built from source tends to be worthwhile, since the wheel building process can be expensive, especially for extension modules.

To support this caching strategy, uv provides a `uv cache prune --ci` command, which removes all pre-built wheels and unzipped source distributions from the cache, but retains any wheels that were built from source. We recommend running `uv cache prune --ci` at the end of your continuous integration job to ensure maximum cache efficiency. For an example, see the [GitHub integration guide](../guides/integration/github.md#caching).

## Cache directory

uv determines the cache directory according to, in order:

1. A temporary cache directory, if `--no-cache` was requested.
2. The specific cache directory specified via `--cache-dir`, `UV_CACHE_DIR`, or [`tool.uv.cache-dir`](../reference/settings.md#cache-dir).
3.
   A system-appropriate cache directory, e.g., `$XDG_CACHE_HOME/uv` or `$HOME/.cache/uv` on Unix and `%LOCALAPPDATA%\uv\cache` on Windows.

!!! note

    uv _always_ requires a cache directory. When `--no-cache` is requested, uv will still use a temporary cache for sharing data within that single invocation.

    In most cases, `--refresh` should be used instead of `--no-cache`, as it will update the cache for subsequent operations but not read from the cache.

It is important for performance for the cache directory to be located on the same file system as the Python environment uv is operating on. Otherwise, uv will not be able to link files from the cache into the environment and will instead need to fall back to slow copy operations.

## Cache versioning

The uv cache is composed of a number of buckets (e.g., a bucket for wheels, a bucket for source distributions, a bucket for Git repositories, and so on). Each bucket is versioned, such that if a release contains a breaking change to the cache format, uv will not attempt to read from or write to an incompatible cache bucket.

For example, uv 0.4.13 included a breaking change to the core metadata bucket. As such, the bucket version was increased from v12 to v13. Within a cache version, changes are guaranteed to be both forwards- and backwards-compatible.

Since changes in the cache format are accompanied by changes in the cache version, multiple versions of uv can safely read and write to the same cache directory. However, if the cache version changed between a given pair of uv releases, then those releases may not be able to share the same underlying cache entries. For example, it's safe to use a single shared cache for uv 0.4.12 and uv 0.4.13, though the cache itself may contain duplicate entries in the core metadata bucket due to the change in cache version.

# Configuration files

uv supports persistent configuration files at both the project- and user-level.
Specifically, uv will search for a `pyproject.toml` or `uv.toml` file in the current directory, or in the nearest parent directory.

!!! note

    For `tool` commands, which operate at the user level, local configuration files will be ignored. Instead, uv will exclusively read from user-level configuration (e.g., `~/.config/uv/uv.toml`) and system-level configuration (e.g., `/etc/uv/uv.toml`).

In workspaces, uv will begin its search at the workspace root, ignoring any configuration defined in workspace members. Since the workspace is locked as a single unit, configuration is shared across all members.

If a `pyproject.toml` file is found, uv will read configuration from the `[tool.uv]` table. For example, to set a persistent index URL, add the following to a `pyproject.toml`:

```toml title="pyproject.toml"
[[tool.uv.index]]
url = "https://test.pypi.org/simple"
default = true
```

(If there is no such table, the `pyproject.toml` file will be ignored, and uv will continue searching in the directory hierarchy.)

uv will also search for `uv.toml` files, which follow an identical structure, but omit the `[tool.uv]` prefix. For example:

```toml title="uv.toml"
[[index]]
url = "https://test.pypi.org/simple"
default = true
```

!!! note

    `uv.toml` files take precedence over `pyproject.toml` files, so if both `uv.toml` and `pyproject.toml` files are present in a directory, configuration will be read from `uv.toml`, and the `[tool.uv]` section in the accompanying `pyproject.toml` will be ignored.

uv will also discover user-level configuration at `~/.config/uv/uv.toml` (or `$XDG_CONFIG_HOME/uv/uv.toml`) on macOS and Linux, or `%APPDATA%\uv\uv.toml` on Windows; and system-level configuration at `/etc/uv/uv.toml` (or `$XDG_CONFIG_DIRS/uv/uv.toml`) on macOS and Linux, or `%SYSTEMDRIVE%\ProgramData\uv\uv.toml` on Windows.

User- and system-level configuration must use the `uv.toml` format, rather than the `pyproject.toml` format, as a `pyproject.toml` is intended to define a Python _project_.
If project-, user-, and system-level configuration files are found, the settings will be merged, with project-level configuration taking precedence over the user-level configuration, and user-level configuration taking precedence over the system-level configuration. (If multiple system-level configuration files are found, e.g., at both `/etc/uv/uv.toml` and `$XDG_CONFIG_DIRS/uv/uv.toml`, only the first-discovered file will be used, with XDG taking priority.)

For example, if a string, number, or boolean is present in both the project- and user-level configuration tables, the project-level value will be used, and the user-level value will be ignored. If an array is present in both tables, the arrays will be concatenated, with the project-level settings appearing earlier in the merged array.

Settings provided via environment variables take precedence over persistent configuration, and settings provided via the command line take precedence over both.

uv accepts a `--no-config` command-line argument which, when provided, disables the discovery of any persistent configuration.

uv also accepts a `--config-file` command-line argument, which accepts a path to a `uv.toml` to use as the configuration file. When provided, this file will be used in place of _any_ discovered configuration files (e.g., user-level configuration will be ignored).

## Settings

See the [settings reference](../reference/settings.md) for an enumeration of the available settings.

## `.env`

`uv run` can load environment variables from dotenv files (e.g., `.env`, `.env.local`, `.env.development`), powered by the [`dotenvy`](https://github.com/allan2/dotenvy) crate.

To load a `.env` file from a dedicated location, set the `UV_ENV_FILE` environment variable, or pass the `--env-file` flag to `uv run`.
For example, to load environment variables from a `.env` file in the current working directory:

```console
$ echo "MY_VAR='Hello, world!'" > .env
$ uv run --env-file .env -- python -c 'import os; print(os.getenv("MY_VAR"))'
Hello, world!
```

The `--env-file` flag can be provided multiple times, with subsequent files overriding values defined in previous files. To provide multiple files via the `UV_ENV_FILE` environment variable, separate the paths with a space (e.g., `UV_ENV_FILE="/path/to/file1 /path/to/file2"`).

To disable dotenv loading (e.g., to override `UV_ENV_FILE` or the `--env-file` command-line argument), set the `UV_NO_ENV_FILE` environment variable to `1`, or pass the `--no-env-file` flag to `uv run`.

If the same variable is defined in the environment and in a `.env` file, the value from the environment will take precedence.

## Configuring the pip interface

A dedicated [`[tool.uv.pip]`](../reference/settings.md#pip) section is provided for configuring _just_ the `uv pip` command line interface. Settings in this section will not apply to `uv` commands outside the `uv pip` namespace. However, many of the settings in this section have corollaries in the top-level namespace which _do_ apply to the `uv pip` interface unless they are overridden by a value in the `uv.pip` section.

The `uv.pip` settings are designed to adhere closely to pip's interface and are declared separately to retain compatibility while allowing the global settings to use alternate designs (e.g., `--no-build`).
As an example, setting the `index-url` under `[tool.uv.pip]`, as in the following `pyproject.toml`, would only affect the `uv pip` subcommands (e.g., `uv pip install`, but not `uv sync`, `uv lock`, or `uv run`): ```toml title="pyproject.toml" [tool.uv.pip] index-url = "https://test.pypi.org/simple" ``` # Concepts overview Read the concept documents to learn more about uv's features: - [Projects](./projects/index.md) - [Tools](./tools.md) - [Python versions](./python-versions.md) - [Configuration files](./configuration-files.md) - [Package indexes](./indexes.md) - [Resolution](./resolution.md) - [The uv build backend](./build-backend.md) - [Authentication](./authentication.md) - [Caching](./cache.md) - [The pip interface](../pip/index.md) Looking for a quick introduction to features? See the [guides](../guides/index.md) instead. # Package indexes By default, uv uses the [Python Package Index (PyPI)](https://pypi.org) for dependency resolution and package installation. However, uv can be configured to use other package indexes, including private indexes, via the `[[tool.uv.index]]` configuration option (and `--index`, the analogous command-line option). ## Defining an index To include an additional index when resolving dependencies, add a `[[tool.uv.index]]` entry to your `pyproject.toml`: ```toml [[tool.uv.index]] # Optional name for the index. name = "pytorch" # Required URL for the index. url = "https://download.pytorch.org/whl/cpu" ``` Indexes are prioritized in the order in which they’re defined, such that the first index listed in the configuration file is the first index consulted when resolving dependencies, with indexes provided via the command line taking precedence over those in the configuration file. By default, uv includes the Python Package Index (PyPI) as the "default" index, i.e., the index used when a package is not found on any other index. 
To exclude PyPI from the list of indexes, set `default = true` on another index entry (or use the `--default-index` command-line option): ```toml [[tool.uv.index]] name = "pytorch" url = "https://download.pytorch.org/whl/cpu" default = true ``` The default index is always treated as lowest priority, regardless of its position in the list of indexes. Index names may only contain alphanumeric characters, dashes, underscores, and periods, and must be valid ASCII. When providing an index on the command line (with `--index` or `--default-index`) or through an environment variable (`UV_INDEX` or `UV_DEFAULT_INDEX`), names are optional but can be included using the `=` syntax, as in: ```shell # On the command line. $ uv lock --index pytorch=https://download.pytorch.org/whl/cpu # Via an environment variable. $ UV_INDEX=pytorch=https://download.pytorch.org/whl/cpu uv lock ``` ## Pinning a package to an index A package can be pinned to a specific index by specifying the index in its `tool.uv.sources` entry. For example, to ensure that `torch` is _always_ installed from the `pytorch` index, add the following to your `pyproject.toml`: ```toml [tool.uv.sources] torch = { index = "pytorch" } [[tool.uv.index]] name = "pytorch" url = "https://download.pytorch.org/whl/cpu" ``` Similarly, to pull from a different index based on the platform, you can provide a list of sources disambiguated by environment markers: ```toml title="pyproject.toml" [project] dependencies = ["torch"] [tool.uv.sources] torch = [ { index = "pytorch-cu118", marker = "sys_platform == 'darwin'"}, { index = "pytorch-cu124", marker = "sys_platform != 'darwin'"}, ] [[tool.uv.index]] name = "pytorch-cu118" url = "https://download.pytorch.org/whl/cu118" [[tool.uv.index]] name = "pytorch-cu124" url = "https://download.pytorch.org/whl/cu124" ``` An index can be marked as `explicit = true` to prevent packages from being installed from that index unless explicitly pinned to it. 
For example, to ensure that `torch` is installed from the `pytorch` index, but all other packages are installed from PyPI, add the following to your `pyproject.toml`: ```toml [tool.uv.sources] torch = { index = "pytorch" } [[tool.uv.index]] name = "pytorch" url = "https://download.pytorch.org/whl/cpu" explicit = true ``` Named indexes referenced via `tool.uv.sources` must be defined within the project's `pyproject.toml` file; indexes provided via the command-line, environment variables, or user-level configuration will not be recognized. If an index is marked as both `default = true` and `explicit = true`, it will be treated as an explicit index (i.e., only usable via `tool.uv.sources`) while also removing PyPI as the default index. ## Searching across multiple indexes By default, uv will stop at the first index on which a given package is available, and limit resolutions to those present on that first index (`first-index`). For example, if an internal index is specified via `[[tool.uv.index]]`, uv's behavior is such that if a package exists on that internal index, it will _always_ be installed from that internal index, and never from PyPI. The intent is to prevent "dependency confusion" attacks, in which an attacker publishes a malicious package on PyPI with the same name as an internal package, thus causing the malicious package to be installed instead of the internal package. See, for example, [the `torchtriton` attack](https://pytorch.org/blog/compromised-nightly-dependency/) from December 2022. To opt in to alternate index behaviors, use the `--index-strategy` command-line option, or the `UV_INDEX_STRATEGY` environment variable, which supports the following values: - `first-index` (default): Search for each package across all indexes, limiting the candidate versions to those present in the first index that contains the package.
- `unsafe-first-match`: Search for each package across all indexes, but prefer the first index with a compatible version, even if newer versions are available on other indexes. - `unsafe-best-match`: Search for each package across all indexes, and select the best version from the combined set of candidate versions. While `unsafe-best-match` is the closest to pip's behavior, it exposes users to the risk of "dependency confusion" attacks. ## Authentication Most private package indexes require authentication to access packages, typically via a username and password (or access token). !!! tip See the [alternative index guide](../guides/integration/alternative-indexes.md) for details on authenticating with specific private index providers, e.g., from AWS, Azure, or GCP. ### Providing credentials directly Credentials can be provided directly via environment variables or by embedding them in the URL. For example, given an index named `internal-proxy` that requires a username (`public`) and password (`koala`), define the index (without credentials) in your `pyproject.toml`: ```toml [[tool.uv.index]] name = "internal-proxy" url = "https://example.com/simple" ``` From there, you can set the `UV_INDEX_INTERNAL_PROXY_USERNAME` and `UV_INDEX_INTERNAL_PROXY_PASSWORD` environment variables, where `INTERNAL_PROXY` is the uppercase version of the index name, with non-alphanumeric characters replaced by underscores: ```sh export UV_INDEX_INTERNAL_PROXY_USERNAME=public export UV_INDEX_INTERNAL_PROXY_PASSWORD=koala ``` By providing credentials via environment variables, you can avoid storing sensitive information in the plaintext `pyproject.toml` file. Alternatively, credentials can be embedded directly in the index definition: ```toml [[tool.uv.index]] name = "internal" url = "https://public:koala@pypi-proxy.corp.dev/simple" ``` For security purposes, credentials are _never_ stored in the `uv.lock` file; as such, uv _must_ have access to the authenticated URL at installation time. 
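The environment variable names are derived mechanically from the index name. The derivation rule described above can be sketched as:

```python
import re

def credential_env_vars(index_name: str) -> tuple[str, str]:
    # Uppercase the index name, then replace non-alphanumeric
    # characters with underscores.
    slug = re.sub(r"[^A-Z0-9]", "_", index_name.upper())
    return (f"UV_INDEX_{slug}_USERNAME", f"UV_INDEX_{slug}_PASSWORD")

print(credential_env_vars("internal-proxy"))
# ('UV_INDEX_INTERNAL_PROXY_USERNAME', 'UV_INDEX_INTERNAL_PROXY_PASSWORD')
```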
### Using credential providers In addition to providing credentials directly, uv supports discovery of credentials from netrc and keyring. See the [HTTP authentication](./authentication.md#http-authentication) documentation for details on setting up specific credential providers. By default, uv will attempt an unauthenticated request before querying providers. If the request fails, uv will search for credentials. If credentials are found, an authenticated request will be attempted. !!! note If a username is set, uv will search for credentials before making an unauthenticated request. Some indexes (e.g., GitLab) will forward unauthenticated requests to a public index, like PyPI — which means that uv will not search for credentials. This behavior can be changed per-index, using the `authenticate` setting. For example, to always search for credentials: ```toml hl_lines="4" [[tool.uv.index]] name = "example" url = "https://example.com/simple" authenticate = "always" ``` When `authenticate` is set to `always`, uv will eagerly search for credentials and error if credentials cannot be found. ### Ignoring error codes when searching across indexes When using the [first-index strategy](#searching-across-multiple-indexes), uv will stop searching across indexes if an HTTP 401 Unauthorized or HTTP 403 Forbidden status code is encountered. The one exception is that uv will ignore 403s when searching the `pytorch` index (since this index returns a 403 when a package is not present). To configure which error codes are ignored for an index, use the `ignore-error-codes` setting. For example, to ignore 403s (but not 401s) for a private index: ```toml [[tool.uv.index]] name = "private-index" url = "https://private-index.com/simple" authenticate = "always" ignore-error-codes = [403] ``` uv will always continue searching across indexes when it encounters a `404 Not Found`. This cannot be overridden.
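As an illustrative model (not uv's implementation), the error-code handling described above amounts to:

```python
def keep_searching(status: int, ignore_error_codes: tuple[int, ...] = ()) -> bool:
    # 404 always continues to the next index; 401 and 403 stop the
    # search unless listed in `ignore-error-codes`.
    if status == 404:
        return True
    if status in (401, 403):
        return status in ignore_error_codes
    return False

print(keep_searching(404))          # True: package not found, try the next index
print(keep_searching(401))          # False: authentication failure stops the search
print(keep_searching(403, (403,)))  # True: 403 explicitly ignored for this index
```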
### Disabling authentication To prevent leaking credentials, authentication can be disabled for an index: ```toml hl_lines="4" [[tool.uv.index]] name = "example" url = "https://example.com/simple" authenticate = "never" ``` When `authenticate` is set to `never`, uv will never search for credentials for the given index and will error if credentials are provided directly. ### Customizing cache control headers By default, uv will respect the cache control headers provided by the index. For example, PyPI serves package metadata with a `max-age=600` header, thereby allowing uv to cache package metadata for 10 minutes; and wheels and source distributions with a `max-age=365000000, immutable` header, thereby allowing uv to cache artifacts indefinitely. To override the cache control headers for an index, use the `cache-control` setting: ```toml [[tool.uv.index]] name = "example" url = "https://example.com/simple" cache-control = { api = "max-age=600", files = "max-age=365000000, immutable" } ``` The `cache-control` setting accepts an object with two optional keys: - `api`: Controls caching for Simple API requests (package metadata). - `files`: Controls caching for artifact downloads (wheels and source distributions). The values for these keys are strings that follow the [HTTP Cache-Control](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control) syntax. For example, to force uv to always revalidate package metadata, set `api = "no-cache"`: ```toml [[tool.uv.index]] name = "example" url = "https://example.com/simple" cache-control = { api = "no-cache" } ``` This setting is most commonly used to override the default cache control headers for private indexes that otherwise disable caching, often unintentionally. We typically recommend following PyPI's approach to caching headers, i.e., setting `api = "max-age=600"` and `files = "max-age=365000000, immutable"`. 
## "Flat" indexes By default, `[[tool.uv.index]]` entries are assumed to be PyPI-style registries that implement the [PEP 503](https://peps.python.org/pep-0503/) Simple Repository API. However, uv also supports "flat" indexes, which are local directories or HTML pages that contain flat lists of wheels and source distributions. In pip, such indexes are specified using the `--find-links` option. To define a flat index in your `pyproject.toml`, use the `format = "flat"` option: ```toml [[tool.uv.index]] name = "example" url = "/path/to/directory" format = "flat" ``` Flat indexes support the same feature set as Simple Repository API indexes (e.g., `explicit = true`); you can also pin a package to a flat index using `tool.uv.sources`. ## `--index-url` and `--extra-index-url` In addition to the `[[tool.uv.index]]` configuration option, uv supports pip-style `--index-url` and `--extra-index-url` command-line options for compatibility, where `--index-url` defines the default index and `--extra-index-url` defines additional indexes. These options can be used in conjunction with the `[[tool.uv.index]]` configuration option, and follow the same prioritization rules: - The default index is always treated as lowest priority, whether defined via the legacy `--index-url` argument, the recommended `--default-index` argument, or a `[[tool.uv.index]]` entry with `default = true`. - Indexes are consulted in the order in which they’re defined, either via the legacy `--extra-index-url` argument, the recommended `--index` argument, or `[[tool.uv.index]]` entries. In effect, `--index-url` and `--extra-index-url` can be thought of as unnamed `[[tool.uv.index]]` entries, with `default = true` enabled for the former. In that context, `--index-url` maps to `--default-index`, and `--extra-index-url` maps to `--index`. 
# Preview features uv includes opt-in preview features to provide an opportunity for community feedback and increase confidence that changes are a net benefit before enabling them for everyone. ## Enabling preview features To enable all preview features, use the `--preview` flag: ```console $ uv run --preview ... ``` Or, set the `UV_PREVIEW` environment variable: ```console $ UV_PREVIEW=1 uv run ... ``` To enable specific preview features, use the `--preview-features` flag: ```console $ uv run --preview-features foo ... ``` The `--preview-features` flag can be repeated to enable multiple features: ```console $ uv run --preview-features foo --preview-features bar ... ``` Or, features can be provided in a comma-separated list: ```console $ uv run --preview-features foo,bar ... ``` The `UV_PREVIEW_FEATURES` environment variable can be used similarly, e.g.: ```console $ UV_PREVIEW_FEATURES=foo,bar uv run ... ``` For backwards compatibility, enabling preview features that do not exist will warn, but not error. ## Using preview features Often, preview features can be used without changing any preview settings if the behavior change is gated by some sort of user interaction. For example, while `pylock.toml` support is in preview, you can use `uv pip install` with a `pylock.toml` file without additional configuration because specifying the `pylock.toml` file indicates you want to use the feature. However, a warning will be displayed that the feature is in preview. The preview feature can be enabled to silence the warning. Other preview features change behavior without changes to your use of uv. For example, when the `python-upgrade` feature is enabled, the default behavior of `uv python install` changes to allow uv to upgrade Python versions transparently. This feature requires enabling the preview flag for proper usage.
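As a sketch of how such a comma-separated feature list can be interpreted (this is an illustrative model, not uv's parser, and the `"*"` sentinel used for `UV_PREVIEW=1` is an assumption for exposition):

```python
def preview_features(environ: dict) -> set[str]:
    # UV_PREVIEW=1 enables everything (modeled here with a "*"
    # sentinel); otherwise, parse the comma-separated feature list.
    if environ.get("UV_PREVIEW") == "1":
        return {"*"}
    raw = environ.get("UV_PREVIEW_FEATURES", "")
    return {name.strip() for name in raw.split(",") if name.strip()}

print(sorted(preview_features({"UV_PREVIEW_FEATURES": "foo,bar"})))  # ['bar', 'foo']
print(preview_features({"UV_PREVIEW": "1"}))                         # {'*'}
```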
## Available preview features The following preview features are available: - `add-bounds`: Allows configuring the [default bounds for `uv add`](../reference/settings.md#add-bounds) invocations. - `json-output`: Allows `--output-format json` for various uv commands. - `pylock`: Allows installing from `pylock.toml` files. - `python-install-default`: Allows [installing `python` and `python3` executables](./python-versions.md#installing-python-executables). - `python-upgrade`: Allows [transparent Python version upgrades](./python-versions.md#upgrading-python-versions). ## Disabling preview features The `--no-preview` option can be used to disable preview features. # Building distributions To distribute your project to others (e.g., to upload it to an index like PyPI), you'll need to build it into a distributable format. Python projects are typically distributed as both source distributions (sdists) and binary distributions (wheels). The former is typically a `.tar.gz` or `.zip` file containing the project's source code along with some additional metadata, while the latter is a `.whl` file containing pre-built artifacts that can be installed directly. !!! important When using `uv build`, uv acts as a [build frontend](https://peps.python.org/pep-0517/#terminology-and-goals) and only determines the Python version to use and invokes the build backend. The details of the builds, such as the included files and the distribution filenames, are determined by the build backend, as defined in [`[build-system]`](./config.md#build-systems). Information about build configuration can be found in the respective tool's documentation. ## Using `uv build` `uv build` can be used to build both source distributions and binary distributions for your project. 
By default, `uv build` will build the project in the current directory, and place the built artifacts in a `dist/` subdirectory: ```console $ uv build $ ls dist/ example-0.1.0-py3-none-any.whl example-0.1.0.tar.gz ``` You can build the project in a different directory by providing a path to `uv build`, e.g., `uv build path/to/project`. `uv build` will first build a source distribution, and then build a binary distribution (wheel) from that source distribution. You can limit `uv build` to building a source distribution with `uv build --sdist`, a binary distribution with `uv build --wheel`, or build both distributions from source with `uv build --sdist --wheel`. ## Build constraints `uv build` accepts `--build-constraint`, which can be used to constrain the versions of any build requirements during the build process. When coupled with `--require-hashes`, uv will enforce that the requirements used to build the project match specific, known hashes, for reproducibility. For example, given the following `constraints.txt`: ```text setuptools==68.2.2 --hash=sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a ``` Running the following would build the project with the specified version of `setuptools`, and verify that the downloaded `setuptools` distribution matches the specified hash: ```console $ uv build --build-constraint constraints.txt --require-hashes ``` # Configuring projects ## Python version requirement Projects may declare the Python versions supported by the project in the `project.requires-python` field of the `pyproject.toml`. It is recommended to set a `requires-python` value: ```toml title="pyproject.toml" hl_lines="4" [project] name = "example" version = "0.1.0" requires-python = ">=3.12" ``` The Python version requirement determines the Python syntax that is allowed in the project and affects selection of dependency versions (they must support the same Python version range).
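The effect of a lower bound like `>=3.12` can be sketched with a simplified version check; real specifiers support more operators and forms than this sketch handles:

```python
def satisfies_requires_python(python_version: str, requires_python: str) -> bool:
    # Handle only the `>=X.Y` form; tuple comparison makes
    # "3.12.1" compare correctly against the bound (3, 12).
    assert requires_python.startswith(">=")
    bound = tuple(int(part) for part in requires_python[2:].split("."))
    version = tuple(int(part) for part in python_version.split("."))
    return version >= bound

print(satisfies_requires_python("3.12.1", ">=3.12"))  # True
print(satisfies_requires_python("3.11.9", ">=3.12"))  # False
```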
## Entry points [Entry points](https://packaging.python.org/en/latest/specifications/entry-points/#entry-points) are the official mechanism by which an installed package advertises interfaces. These include: - [Command line interfaces](#command-line-interfaces) - [Graphical user interfaces](#graphical-user-interfaces) - [Plugin entry points](#plugin-entry-points) !!! important Using the entry point tables requires a [build system](#build-systems) to be defined. ### Command-line interfaces Projects may define command line interfaces (CLIs) for the project in the `[project.scripts]` table of the `pyproject.toml`. For example, to declare a command called `hello` that invokes the `hello` function in the `example` module: ```toml title="pyproject.toml" [project.scripts] hello = "example:hello" ``` Then, the command can be run from a console: ```console $ uv run hello ``` ### Graphical user interfaces Projects may define graphical user interfaces (GUIs) for the project in the `[project.gui-scripts]` table of the `pyproject.toml`. !!! important These are only different from [command-line interfaces](#command-line-interfaces) on Windows, where they are wrapped by a GUI executable so they can be started without a console. On other platforms, they behave the same. For example, to declare a command called `hello` that invokes the `app` function in the `example` module: ```toml title="pyproject.toml" [project.gui-scripts] hello = "example:app" ``` ### Plugin entry points Projects may define entry points for plugin discovery in the [`[project.entry-points]`](https://packaging.python.org/en/latest/guides/creating-and-discovering-plugins/#using-package-metadata) table of the `pyproject.toml`.
For example, to register the `example-plugin-a` package as a plugin for `example`: ```toml title="pyproject.toml" [project.entry-points.'example.plugins'] a = "example_plugin_a" ``` Then, in `example`, plugins would be loaded with: ```python title="example/__init__.py" from importlib.metadata import entry_points for plugin in entry_points(group='example.plugins'): plugin.load() ``` !!! note The `group` key can be an arbitrary value; it does not need to include the package name or "plugins". However, it is recommended to namespace the key by the package name to avoid collisions with other packages. ## Build systems A build system determines how the project should be packaged and installed. Projects may declare and configure a build system in the `[build-system]` table of the `pyproject.toml`. uv uses the presence of a build system to determine if a project contains a package that should be installed in the project virtual environment. If a build system is not defined, uv will not attempt to build or install the project itself, just its dependencies. If a build system is defined, uv will build and install the project into the project environment. The `--build-backend` option can be provided to `uv init` to create a packaged project with an appropriate layout. The `--package` option can be provided to `uv init` to create a packaged project with the default build system. !!! note While uv will not build and install the current project without a build system definition, the presence of a `[build-system]` table is not required in other packages. For legacy reasons, if a build system is not defined, then `setuptools.build_meta:__legacy__` is used to build the package. Packages you depend on may not explicitly declare their build system but are still installable. Similarly, if you [add a dependency on a local project](./dependencies.md#path) or install it with `uv pip`, uv will attempt to build and install it regardless of the presence of a `[build-system]` table.
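The legacy fallback rule can be sketched as follows; this is an illustrative model, not uv's implementation:

```python
def build_backend(pyproject: dict) -> str:
    # Without a [build-system] table, fall back to the setuptools
    # legacy backend for building the package.
    build_system = pyproject.get("build-system")
    if build_system is None:
        return "setuptools.build_meta:__legacy__"
    return build_system["build-backend"]

print(build_backend({"build-system": {"build-backend": "hatchling.build"}}))
print(build_backend({"project": {"name": "legacy-package"}}))
```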
### Build system options Build systems are used to power the following features: - Including or excluding files from distributions - Editable installation behavior - Dynamic project metadata - Compilation of native code - Vendoring shared libraries To configure these features, refer to the documentation of your chosen build system. ## Project packaging As discussed in [build systems](#build-systems), a Python project must be built to be installed. This process is generally referred to as "packaging". You probably need a package if you want to: - Add commands to the project - Distribute the project to others - Use a `src` and `test` layout - Write a library You probably _do not_ need a package if you are: - Writing scripts - Building a simple application - Using a flat layout While uv usually uses the declaration of a [build system](#build-systems) to determine if a project should be packaged, uv also allows overriding this behavior with the [`tool.uv.package`](../../reference/settings.md#package) setting. Setting `tool.uv.package = true` will force a project to be built and installed into the project environment. If no build system is defined, uv will use the setuptools legacy backend. Setting `tool.uv.package = false` will force a project package _not_ to be built and installed into the project environment. uv will ignore a declared build system when interacting with the project; however, uv will still respect explicit attempts to build the project such as invoking `uv build`. ## Project environment path The `UV_PROJECT_ENVIRONMENT` environment variable can be used to configure the project virtual environment path (`.venv` by default). If a relative path is provided, it will be resolved relative to the workspace root. If an absolute path is provided, it will be used as-is, i.e., a child directory will not be created for the environment. If an environment is not present at the provided path, uv will create it. 
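The path resolution rules above can be sketched as follows (an illustrative model, not uv's implementation; POSIX-style paths are assumed for the example):

```python
from pathlib import PurePosixPath

def project_environment(value: str, workspace_root: str) -> PurePosixPath:
    # Absolute paths are used as-is; relative paths are resolved
    # against the workspace root.
    path = PurePosixPath(value)
    return path if path.is_absolute() else PurePosixPath(workspace_root) / path

print(project_environment(".venv", "/home/user/project"))       # /home/user/project/.venv
print(project_environment("/usr/local", "/home/user/project"))  # /usr/local
```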
This option can be used to write to the system Python environment, though it is not recommended. `uv sync` will remove extraneous packages from the environment by default and, as such, may leave the system in a broken state. To target the system environment, set `UV_PROJECT_ENVIRONMENT` to the prefix of the Python installation. For example, on Debian-based systems, this is usually `/usr/local`: ```console $ python -c "import sysconfig; print(sysconfig.get_config_var('prefix'))" /usr/local ``` To target this environment, you'd export `UV_PROJECT_ENVIRONMENT=/usr/local`. !!! important If an absolute path is provided and the setting is used across multiple projects, the environment will be overwritten by invocations in each project. This setting is only recommended for use for a single project in CI or Docker images. !!! note By default, uv does not read the `VIRTUAL_ENV` environment variable during project operations. A warning will be displayed if `VIRTUAL_ENV` is set to a different path than the project's environment. The `--active` flag can be used to opt-in to respecting `VIRTUAL_ENV`. The `--no-active` flag can be used to silence the warning. ## Build isolation By default, uv builds all packages in isolated virtual environments, as per [PEP 517](https://peps.python.org/pep-0517/). Some packages are incompatible with build isolation, whether intentionally (e.g., due to the use of heavy build dependencies, most commonly PyTorch) or unintentionally (e.g., due to the use of legacy packaging setups). To disable build isolation for a specific dependency, add it to the `no-build-isolation-package` list in your `pyproject.toml`: ```toml title="pyproject.toml" [project] name = "project" version = "0.1.0" description = "..."
readme = "README.md" requires-python = ">=3.12" dependencies = ["cchardet"] [tool.uv] no-build-isolation-package = ["cchardet"] ``` Installing packages without build isolation requires that the package's build dependencies are installed in the project environment _prior_ to installing the package itself. This can be achieved by separating out the build dependencies and the packages that require them into distinct extras. For example: ```toml title="pyproject.toml" [project] name = "project" version = "0.1.0" description = "..." readme = "README.md" requires-python = ">=3.12" dependencies = [] [project.optional-dependencies] build = ["setuptools", "cython"] compile = ["cchardet"] [tool.uv] no-build-isolation-package = ["cchardet"] ``` Given the above, a user would first sync the `build` dependencies: ```console $ uv sync --extra build + cython==3.0.11 + foo==0.1.0 (from file:///Users/crmarsh/workspace/uv/foo) + setuptools==73.0.1 ``` Followed by the `compile` dependencies: ```console $ uv sync --extra compile + cchardet==2.1.7 - cython==3.0.11 - setuptools==73.0.1 ``` Note that `uv sync --extra compile` would, by default, uninstall the `cython` and `setuptools` packages. To instead retain the build dependencies, include both extras in the second `uv sync` invocation: ```console $ uv sync --extra build $ uv sync --extra build --extra compile ``` Some packages, like `cchardet` above, only require build dependencies for the _installation_ phase of `uv sync`. Others, like `flash-attn`, require their build dependencies to be present even just to resolve the project's lockfile during the _resolution_ phase. In such cases, the build dependencies must be installed prior to running any `uv lock` or `uv sync` commands, using the lower-level `uv pip` API. For example, given: ```toml title="pyproject.toml" [project] name = "project" version = "0.1.0" description = "..."
readme = "README.md" requires-python = ">=3.12" dependencies = ["flash-attn"] [tool.uv] no-build-isolation-package = ["flash-attn"] ``` You could run the following sequence of commands to sync `flash-attn`: ```console $ uv venv $ uv pip install torch setuptools $ uv sync ``` Alternatively, you can provide the `flash-attn` metadata upfront via the [`dependency-metadata`](../../reference/settings.md#dependency-metadata) setting, thereby forgoing the need to build the package during the dependency resolution phase. For example, to provide the `flash-attn` metadata upfront, include the following in your `pyproject.toml`: ```toml title="pyproject.toml" [[tool.uv.dependency-metadata]] name = "flash-attn" version = "2.6.3" requires-dist = ["torch", "einops"] ``` !!! tip To determine the package metadata for a package like `flash-attn`, navigate to the appropriate Git repository, or look it up on [PyPI](https://pypi.org/project/flash-attn) and download the package's source distribution. The package requirements can typically be found in the `setup.py` or `setup.cfg` file. (If the package includes a built distribution, you can unzip it to find the `METADATA` file; however, the presence of a built distribution would negate the need to provide the metadata upfront, since it would already be available to uv.) Once included, you can again use the two-step `uv sync` process to install the build dependencies. Given the following `pyproject.toml`: ```toml title="pyproject.toml" [project] name = "project" version = "0.1.0" description = "..." 
readme = "README.md" requires-python = ">=3.12" dependencies = [] [project.optional-dependencies] build = ["torch", "setuptools", "packaging"] compile = ["flash-attn"] [tool.uv] no-build-isolation-package = ["flash-attn"] [[tool.uv.dependency-metadata]] name = "flash-attn" version = "2.6.3" requires-dist = ["torch", "einops"] ``` You could run the following sequence of commands to sync `flash-attn`: ```console $ uv sync --extra build $ uv sync --extra build --extra compile ``` !!! note The `version` field in `tool.uv.dependency-metadata` is optional for registry-based dependencies (when omitted, uv will assume the metadata applies to all versions of the package), but _required_ for direct URL dependencies (like Git dependencies). ## Editable mode By default, the project will be installed in editable mode, such that changes to the source code are immediately reflected in the environment. `uv sync` and `uv run` both accept a `--no-editable` flag, which instructs uv to install the project in non-editable mode. `--no-editable` is intended for deployment use-cases, such as building a Docker container, in which the project should be included in the deployed environment without a dependency on the originating source code. ## Conflicting dependencies uv resolves all project dependencies together, including optional dependencies ("extras") and dependency groups. If dependencies declared in one section are not compatible with those in another section, uv will fail to resolve the requirements of the project with an error. uv supports explicit declaration of conflicting dependency groups. 
For example, to declare that the `optional-dependency` groups `extra1` and `extra2` are incompatible: ```toml title="pyproject.toml" [tool.uv] conflicts = [ [ { extra = "extra1" }, { extra = "extra2" }, ], ] ``` Or, to declare the development dependency groups `group1` and `group2` incompatible: ```toml title="pyproject.toml" [tool.uv] conflicts = [ [ { group = "group1" }, { group = "group2" }, ], ] ``` See the [resolution documentation](../resolution.md#conflicting-dependencies) for more. ## Limited resolution environments If your project supports a more limited set of platforms or Python versions, you can constrain the set of solved platforms via the `environments` setting, which accepts a list of PEP 508 environment markers. For example, to constrain the lockfile to macOS and Linux, and exclude Windows: ```toml title="pyproject.toml" [tool.uv] environments = [ "sys_platform == 'darwin'", "sys_platform == 'linux'", ] ``` See the [resolution documentation](../resolution.md#limited-resolution-environments) for more. ## Required environments If your project _must_ support a specific platform or Python version, you can mark that platform as required via the `required-environments` setting. For example, to require that the project supports Intel macOS: ```toml title="pyproject.toml" [tool.uv] required-environments = [ "sys_platform == 'darwin' and platform_machine == 'x86_64'", ] ``` The `required-environments` setting is only relevant for packages that do not publish a source distribution (like PyTorch), as such packages can _only_ be installed on environments covered by the set of pre-built binary distributions (wheels) published by that package. See the [resolution documentation](../resolution.md#required-environments) for more. # Managing dependencies ## Dependency fields Dependencies of the project are defined in several fields: - [`project.dependencies`](#project-dependencies): Published dependencies. 
- [`project.optional-dependencies`](#optional-dependencies): Published optional dependencies, or "extras".
- [`dependency-groups`](#dependency-groups): Local dependencies for development.
- [`tool.uv.sources`](#dependency-sources): Alternative sources for dependencies during development.

!!! note

    The `project.dependencies` and `project.optional-dependencies` fields can be used even if the project isn't going to be published. `dependency-groups` are a recently standardized feature and may not be supported by all tools yet.

uv supports modifying the project's dependencies with `uv add` and `uv remove`, but dependency metadata can also be updated by editing the `pyproject.toml` directly.

## Adding dependencies

To add a dependency:

```console
$ uv add httpx
```

An entry will be added in the `project.dependencies` field:

```toml title="pyproject.toml" hl_lines="4"
[project]
name = "example"
version = "0.1.0"
dependencies = ["httpx>=0.27.2"]
```

The [`--dev`](#development-dependencies), [`--group`](#dependency-groups), or [`--optional`](#optional-dependencies) flags can be used to add dependencies to an alternative field.

The dependency will include a constraint, e.g., `>=0.27.2`, for the most recent, compatible version of the package. The kind of bound can be adjusted with [`--bounds`](../../reference/settings.md#add-bounds), or the constraint can be provided directly:

```console
$ uv add "httpx>=0.20"
```

When adding a dependency from a source other than a package registry, uv will add an entry in the sources field.
For example, when adding `httpx` from GitHub:

```console
$ uv add "httpx @ git+https://github.com/encode/httpx"
```

The `pyproject.toml` will include a [Git source entry](#git):

```toml title="pyproject.toml" hl_lines="8-9"
[project]
name = "example"
version = "0.1.0"
dependencies = [
    "httpx",
]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx" }
```

If a dependency cannot be used, uv will display an error:

```console
$ uv add "httpx>9999"
  × No solution found when resolving dependencies:
  ╰─▶ Because only httpx<=1.0.0b0 is available and your project depends on httpx>9999,
      we can conclude that your project's requirements are unsatisfiable.
```

### Importing dependencies

Dependencies declared in a `requirements.txt` file can be added to the project with the `-r` option:

```console
$ uv add -r requirements.txt
```

## Removing dependencies

To remove a dependency:

```console
$ uv remove httpx
```

The `--dev`, `--group`, or `--optional` flags can be used to remove a dependency from a specific table.

If a [source](#dependency-sources) is defined for the removed dependency, and there are no other references to the dependency, it will also be removed.

## Changing dependencies

To change an existing dependency, e.g., to use a different constraint for `httpx`:

```console
$ uv add "httpx>0.1.0"
```

!!! note

    In this example, we are changing the constraints for the dependency in the `pyproject.toml`. The locked version of the dependency will only change if necessary to satisfy the new constraints. To force the package version to update to the latest within the constraints, use `--upgrade-package <package>`, e.g.:

    ```console
    $ uv add "httpx>0.1.0" --upgrade-package httpx
    ```

    See the [lockfile](./sync.md#upgrading-locked-package-versions) documentation for more details on upgrading packages.
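The "only change if necessary" rule above can be illustrated with a toy sketch. This is not uv's resolver; it handles only simple numeric versions and a single comparison operator (real tools implement the full PEP 440 rules), but it shows why a locked version that already satisfies a new constraint is left alone:

```python
# Toy sketch: does a locked version still satisfy a new constraint?
# Simple numeric versions and one operator only; not uv's implementation.

def parse(version: str) -> tuple[int, ...]:
    return tuple(int(p) for p in version.split("."))

def satisfies(version: str, constraint: str) -> bool:
    # Check two-character operators before one-character ones.
    for op in (">=", "<=", "==", ">", "<"):
        if constraint.startswith(op):
            bound = parse(constraint[len(op):].strip())
            v = parse(version)
            return {
                ">=": v >= bound, "<=": v <= bound, "==": v == bound,
                ">": v > bound, "<": v < bound,
            }[op]
    raise ValueError(f"unsupported constraint: {constraint}")

# A lockfile pinning httpx 0.27.2 already satisfies ">0.1.0", so the locked
# version would not change without --upgrade-package:
print(satisfies("0.27.2", ">0.1.0"))  # True
```

Because the check passes, re-locking with the new constraint has no reason to move the pinned version; `--upgrade-package` forces the move anyway.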
Requesting a different dependency source will update the `tool.uv.sources` table, e.g., to use `httpx` from a local path during development:

```console
$ uv add "httpx @ ../httpx"
```

## Platform-specific dependencies

To ensure that a dependency is only installed on a specific platform or on specific Python versions, use [environment markers](https://peps.python.org/pep-0508/#environment-markers).

For example, to install `jax` on Linux, but not on Windows or macOS:

```console
$ uv add "jax; sys_platform == 'linux'"
```

The resulting `pyproject.toml` will then include the environment marker in the dependency definition:

```toml title="pyproject.toml" hl_lines="6"
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = ["jax; sys_platform == 'linux'"]
```

Similarly, to include `numpy` on Python 3.11 and later:

```console
$ uv add "numpy; python_version >= '3.11'"
```

See Python's [environment marker](https://peps.python.org/pep-0508/#environment-markers) documentation for a complete enumeration of the available markers and operators.

!!! tip

    Dependency sources can also be [changed per-platform](#platform-specific-sources).

## Project dependencies

The `project.dependencies` table represents the dependencies that are used when uploading to PyPI or building a wheel. Individual dependencies are specified using [dependency specifiers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/) syntax, and the table follows the [PEP 621](https://packaging.python.org/en/latest/specifications/pyproject-toml/) standard.

`project.dependencies` defines the list of packages that are required for the project, along with the version constraints that should be used when installing them. Each entry includes a dependency name and version. An entry may include extras or environment markers for platform-specific packages.
For example:

```toml title="pyproject.toml"
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
    # Any version in this range
    "tqdm >=4.66.2,<5",
    # Exactly this version of torch
    "torch ==2.2.2",
    # Install transformers with the torch extra
    "transformers[torch] >=4.39.3,<5",
    # Only install this package on older python versions
    # See "Environment Markers" for more information
    "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
    "mollymawk ==0.1.0"
]
```

## Dependency sources

The `tool.uv.sources` table extends the standard dependency tables with alternative dependency sources, which are used during development.

Dependency sources add support for common patterns that are not supported by the `project.dependencies` standard, like editable installations and relative paths. For example, to install `foo` from a directory relative to the project root:

```toml title="pyproject.toml" hl_lines="7"
[project]
name = "example"
version = "0.1.0"
dependencies = ["foo"]

[tool.uv.sources]
foo = { path = "./packages/foo" }
```

The following dependency sources are supported by uv:

- [Index](#index): A package resolved from a specific package index.
- [Git](#git): A Git repository.
- [URL](#url): A remote wheel or source distribution.
- [Path](#path): A local wheel, source distribution, or project directory.
- [Workspace](#workspace-member): A member of the current workspace.

!!! important

    Sources are only respected by uv. If another tool is used, only the definitions in the standard project tables will be used. If another tool is being used for development, any metadata provided in the source table will need to be re-specified in the other tool's format.
### Index

To add a Python package from a specific index, use the `--index` option:

```console
$ uv add torch --index pytorch=https://download.pytorch.org/whl/cpu
```

uv will store the index in `[[tool.uv.index]]` and add a `[tool.uv.sources]` entry:

```toml title="pyproject.toml"
[project]
dependencies = ["torch"]

[tool.uv.sources]
torch = { index = "pytorch" }

[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
```

!!! tip

    The above example will only work on x86-64 Linux, due to the specifics of the PyTorch index. See the [PyTorch guide](../../guides/integration/pytorch.md) for more information about setting up PyTorch.

Using an `index` source _pins_ a package to the given index — it will not be downloaded from other indexes.

When defining an index, an `explicit` flag can be included to indicate that the index should _only_ be used for packages that explicitly specify it in `tool.uv.sources`. If `explicit` is not set, other packages may be resolved from the index, if not found elsewhere.

```toml title="pyproject.toml" hl_lines="4"
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
explicit = true
```

### Git

To add a Git dependency source, prefix a Git-compatible URL with `git+`. For example:

```console
$ # Install over HTTP(S).
$ uv add git+https://github.com/encode/httpx

$ # Install over SSH.
$ uv add git+ssh://git@github.com/encode/httpx
```

```toml title="pyproject.toml" hl_lines="5"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx" }
```

Specific Git references can be requested, e.g., a tag:

```console
$ uv add git+https://github.com/encode/httpx --tag 0.27.0
```

```toml title="pyproject.toml" hl_lines="7"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", tag = "0.27.0" }
```

Or, a branch:

```console
$ uv add git+https://github.com/encode/httpx --branch main
```

```toml title="pyproject.toml" hl_lines="7"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", branch = "main" }
```

Or, a revision (commit):

```console
$ uv add git+https://github.com/encode/httpx --rev 326b9431c761e1ef1e00b9f760d1f654c8db48c6
```

```toml title="pyproject.toml" hl_lines="7"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", rev = "326b9431c761e1ef1e00b9f760d1f654c8db48c6" }
```

A `subdirectory` may be specified if the package isn't in the repository root:

```console
$ uv add git+https://github.com/langchain-ai/langchain#subdirectory=libs/langchain
```

```toml title="pyproject.toml"
[project]
dependencies = ["langchain"]

[tool.uv.sources]
langchain = { git = "https://github.com/langchain-ai/langchain", subdirectory = "libs/langchain" }
```

### URL

To add a URL source, provide a `https://` URL to either a wheel (ending in `.whl`) or a source distribution (typically ending in `.tar.gz` or `.zip`; see [here](../../concepts/resolution.md#source-distribution) for all supported formats).
For example:

```console
$ uv add "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz"
```

Will result in a `pyproject.toml` with:

```toml title="pyproject.toml" hl_lines="5"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { url = "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz" }
```

URL dependencies can also be manually added or edited in the `pyproject.toml` with the `{ url = }` syntax. A `subdirectory` may be specified if the source distribution isn't in the archive root.

### Path

To add a path source, provide the path of a wheel (ending in `.whl`), a source distribution (typically ending in `.tar.gz` or `.zip`; see [here](../../concepts/resolution.md#source-distribution) for all supported formats), or a directory containing a `pyproject.toml`.

For example:

```console
$ uv add /example/foo-0.1.0-py3-none-any.whl
```

Will result in a `pyproject.toml` with:

```toml title="pyproject.toml"
[project]
dependencies = ["foo"]

[tool.uv.sources]
foo = { path = "/example/foo-0.1.0-py3-none-any.whl" }
```

The path may also be a relative path:

```console
$ uv add ./foo-0.1.0-py3-none-any.whl
```

Or, a path to a project directory:

```console
$ uv add ~/projects/bar/
```

!!! important

    When using a directory as a path dependency, uv will attempt to build and install the target as a package by default. See the [virtual dependency](#virtual-dependencies) documentation for details.

An [editable installation](#editable-dependencies) is not used for path dependencies by default. An editable installation may be requested for project directories:

```console
$ uv add --editable ../projects/bar/
```

Which will result in a `pyproject.toml` with:

```toml title="pyproject.toml"
[project]
dependencies = ["bar"]

[tool.uv.sources]
bar = { path = "../projects/bar", editable = true }
```

!!! tip

    For multiple packages in the same repository, [_workspaces_](./workspaces.md) may be a better fit.

### Workspace member

To declare a dependency on a workspace member, add the member name with `{ workspace = true }`. All workspace members must be explicitly stated. Workspace members are always [editable](#editable-dependencies). See the [workspace](./workspaces.md) documentation for more details on workspaces.

```toml title="pyproject.toml"
[project]
dependencies = ["foo==0.1.0"]

[tool.uv.sources]
foo = { workspace = true }

[tool.uv.workspace]
members = [
    "packages/foo"
]
```

### Platform-specific sources

You can limit a source to a given platform or Python version by providing [dependency specifiers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/)-compatible environment markers for the source.

For example, to pull `httpx` from GitHub, but only on macOS, use the following:

```toml title="pyproject.toml" hl_lines="8"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", tag = "0.27.2", marker = "sys_platform == 'darwin'" }
```

By specifying the marker on the source, uv will still include `httpx` on all platforms, but will download the source from GitHub on macOS, and fall back to PyPI on all other platforms.

### Multiple sources

You can specify multiple sources for a single dependency by providing a list of sources, disambiguated by [PEP 508](https://peps.python.org/pep-0508/#environment-markers)-compatible environment markers. For example, to pull in different `httpx` tags on macOS vs. Linux:

```toml title="pyproject.toml" hl_lines="6-7"
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = [
    { git = "https://github.com/encode/httpx", tag = "0.27.2", marker = "sys_platform == 'darwin'" },
    { git = "https://github.com/encode/httpx", tag = "0.24.1", marker = "sys_platform == 'linux'" },
]
```

This strategy extends to using different indexes based on environment markers.
For example, to install `torch` from different PyTorch indexes based on the platform:

```toml title="pyproject.toml" hl_lines="6-7"
[project]
dependencies = ["torch"]

[tool.uv.sources]
torch = [
    { index = "torch-cpu", marker = "platform_system == 'Darwin'" },
    { index = "torch-gpu", marker = "platform_system == 'Linux'" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```

### Disabling sources

To instruct uv to ignore the `tool.uv.sources` table (e.g., to simulate resolving with the package's published metadata), use the `--no-sources` flag:

```console
$ uv lock --no-sources
```

The use of `--no-sources` will also prevent uv from discovering any [workspace members](#workspace-member) that could satisfy a given dependency.

## Optional dependencies

It is common for projects that are published as libraries to make some features optional to reduce the default dependency tree. For example, Pandas has an [`excel` extra](https://pandas.pydata.org/docs/getting_started/install.html#excel-files) and a [`plot` extra](https://pandas.pydata.org/docs/getting_started/install.html#visualization) to avoid installation of Excel parsers and `matplotlib` unless someone explicitly requires them. Extras are requested with the `package[<extras>]` syntax, e.g., `pandas[plot, excel]`.

Optional dependencies are specified in `[project.optional-dependencies]`, a TOML table that maps from extra name to its dependencies, following [dependency specifiers](#dependency-specifiers-pep-508) syntax.

Optional dependencies can have entries in `tool.uv.sources` the same as normal dependencies.
```toml title="pyproject.toml"
[project]
name = "pandas"
version = "1.0.0"

[project.optional-dependencies]
plot = [
    "matplotlib>=3.6.3"
]
excel = [
    "odfpy>=1.4.1",
    "openpyxl>=3.1.0",
    "python-calamine>=0.1.7",
    "pyxlsb>=1.0.10",
    "xlrd>=2.0.1",
    "xlsxwriter>=3.0.5"
]
```

To add an optional dependency, use the `--optional <extra>` option:

```console
$ uv add httpx --optional network
```

!!! note

    If you have optional dependencies that conflict with one another, resolution will fail unless you explicitly [declare them as conflicting](./config.md#conflicting-dependencies).

Sources can also be declared as applying only to a specific optional dependency. For example, to pull `torch` from different PyTorch indexes based on an optional `cpu` or `gpu` extra:

```toml title="pyproject.toml"
[project]
dependencies = []

[project.optional-dependencies]
cpu = [
    "torch",
]
gpu = [
    "torch",
]

[tool.uv.sources]
torch = [
    { index = "torch-cpu", extra = "cpu" },
    { index = "torch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
```

## Development dependencies

Unlike optional dependencies, development dependencies are local-only and will _not_ be included in the project requirements when published to PyPI or other indexes. As such, development dependencies are not included in the `[project]` table.

Development dependencies can have entries in `tool.uv.sources` the same as normal dependencies.

To add a development dependency, use the `--dev` flag:

```console
$ uv add --dev pytest
```

uv uses the `[dependency-groups]` table (as defined in [PEP 735](https://peps.python.org/pep-0735/)) for declaration of development dependencies.
The above command will create a `dev` group:

```toml title="pyproject.toml"
[dependency-groups]
dev = [
    "pytest >=8.1.1,<9"
]
```

The `dev` group is special-cased; there are `--dev`, `--only-dev`, and `--no-dev` flags to toggle inclusion or exclusion of its dependencies. See `--no-default-groups` to disable all default groups instead. Additionally, the `dev` group is [synced by default](#default-groups).

### Dependency groups

Development dependencies can be divided into multiple groups, using the `--group` flag.

For example, to add a development dependency in the `lint` group:

```console
$ uv add --group lint ruff
```

Which results in the following `[dependency-groups]` definition:

```toml title="pyproject.toml"
[dependency-groups]
dev = [
    "pytest"
]
lint = [
    "ruff"
]
```

Once groups are defined, the `--all-groups`, `--no-default-groups`, `--group`, `--only-group`, and `--no-group` options can be used to include or exclude their dependencies.

!!! tip

    The `--dev`, `--only-dev`, and `--no-dev` flags are equivalent to `--group dev`, `--only-group dev`, and `--no-group dev` respectively.

uv requires that all dependency groups are compatible with each other and resolves all groups together when creating the lockfile.

If dependencies declared in one group are not compatible with those in another group, uv will fail to resolve the requirements of the project with an error.

!!! note

    If you have dependency groups that conflict with one another, resolution will fail unless you explicitly [declare them as conflicting](./config.md#conflicting-dependencies).

### Nesting groups

A dependency group can include other dependency groups, e.g.:

```toml title="pyproject.toml"
[dependency-groups]
dev = [
    { include-group = "lint" },
    { include-group = "test" }
]
lint = [
    "ruff"
]
test = [
    "pytest"
]
```

An included group's dependencies cannot conflict with the other dependencies declared in a group.
### Default groups

By default, uv includes the `dev` dependency group in the environment (e.g., during `uv run` or `uv sync`). The default groups to include can be changed using the `tool.uv.default-groups` setting.

```toml title="pyproject.toml"
[tool.uv]
default-groups = ["dev", "foo"]
```

To enable all dependency groups by default, use `"all"` instead of listing group names:

```toml title="pyproject.toml"
[tool.uv]
default-groups = "all"
```

!!! tip

    To disable this behaviour during `uv run` or `uv sync`, use `--no-default-groups`. To exclude a specific default group, use `--no-group <name>`.

### Legacy `dev-dependencies`

Before `[dependency-groups]` was standardized, uv used the `tool.uv.dev-dependencies` field to specify development dependencies, e.g.:

```toml title="pyproject.toml"
[tool.uv]
dev-dependencies = [
    "pytest"
]
```

Dependencies declared in this field will be combined with the contents of `dependency-groups.dev`. Eventually, the `dev-dependencies` field will be deprecated and removed.

!!! note

    If a `tool.uv.dev-dependencies` field exists, `uv add --dev` will use the existing field instead of adding a new `dependency-groups.dev` section.

## Build dependencies

If a project is structured as a [Python package](./config.md#build-systems), it may declare dependencies that are required to build the project, but not required to run it. These dependencies are specified in the `[build-system]` table under `build-system.requires`, following [PEP 518](https://peps.python.org/pep-0518/).

For example, if a project uses `setuptools` as its build backend, it should declare `setuptools` as a build dependency:

```toml title="pyproject.toml"
[project]
name = "pandas"
version = "0.1.0"

[build-system]
requires = ["setuptools>=42"]
build-backend = "setuptools.build_meta"
```

By default, uv will respect `tool.uv.sources` when resolving build dependencies.
For example, to use a local version of `setuptools` for building, add the source to `tool.uv.sources`:

```toml title="pyproject.toml"
[project]
name = "pandas"
version = "0.1.0"

[build-system]
requires = ["setuptools>=42"]
build-backend = "setuptools.build_meta"

[tool.uv.sources]
setuptools = { path = "./packages/setuptools" }
```

When publishing a package, we recommend running `uv build --no-sources` to ensure that the package builds correctly when `tool.uv.sources` is disabled, as is the case when using other build tools, like [`pypa/build`](https://github.com/pypa/build).

## Editable dependencies

A regular installation of a directory with a Python package first builds a wheel and then installs that wheel into your virtual environment, copying all source files. When the package source files are edited, the virtual environment will contain outdated versions.

Editable installations solve this problem by adding a link to the project within the virtual environment (a `.pth` file), which instructs the interpreter to include the source files directly.

There are some limitations to editables (mainly: the build backend needs to support them, and native modules aren't recompiled before import), but they are useful for development, as the virtual environment will always use the latest changes to the package.

uv uses editable installation for workspace packages by default.

To add an editable dependency, use the `--editable` flag:

```console
$ uv add --editable ./path/foo
```

Or, to opt out of using an editable dependency in a workspace:

```console
$ uv add --no-editable ./path/foo
```

## Virtual dependencies

uv allows dependencies to be "virtual", in which the dependency itself is not installed as a [package](./config.md#project-packaging), but its dependencies are.

By default, dependencies are never virtual. A dependency with a [`path` source](#path) can be virtual if it explicitly sets [`tool.uv.package = false`](../../reference/settings.md#package).
Unlike when working _in_ the dependent project with uv, the package will be built even if a [build system](./config.md#build-systems) is not declared.

To treat a dependency as virtual, set `package = false` on the source:

```toml title="pyproject.toml"
[project]
dependencies = ["bar"]

[tool.uv.sources]
bar = { path = "../projects/bar", package = false }
```

If a dependency sets `tool.uv.package = false`, it can be overridden by declaring `package = true` on the source:

```toml title="pyproject.toml"
[project]
dependencies = ["bar"]

[tool.uv.sources]
bar = { path = "../projects/bar", package = true }
```

Similarly, a dependency with a [`workspace` source](#workspace-member) can be virtual if it explicitly sets [`tool.uv.package = false`](../../reference/settings.md#package). The workspace member will be built even if a [build system](./config.md#build-systems) is not declared.

Workspace members that are _not_ dependencies can be virtual by default, e.g., if the parent `pyproject.toml` is:

```toml title="pyproject.toml"
[project]
name = "parent"
version = "1.0.0"
dependencies = []

[tool.uv.workspace]
members = ["child"]
```

And the child `pyproject.toml` excluded a build system:

```toml title="pyproject.toml"
[project]
name = "child"
version = "1.0.0"
dependencies = ["anyio"]
```

Then the `child` workspace member would not be installed, but the transitive dependency `anyio` would be.

In contrast, if the parent declared a dependency on `child`:

```toml title="pyproject.toml"
[project]
name = "parent"
version = "1.0.0"
dependencies = ["child"]

[tool.uv.sources]
child = { workspace = true }

[tool.uv.workspace]
members = ["child"]
```

Then `child` would be built and installed.

## Dependency specifiers

uv uses standard [dependency specifiers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/), originally defined in [PEP 508](https://peps.python.org/pep-0508/).
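One rule worth seeing concretely before the syntax breakdown below is the "compatible release" operator `~=`. The following sketch (simple numeric versions only; real tools implement the full PEP 440 rules) rewrites `~=` into its equivalent `>=,<` pair:

```python
# Minimal sketch: rewrite "~=<version>" as the equivalent ">=,<" bounds.
# Handles plain numeric versions only; not a full PEP 440 implementation.
def compatible_release_bounds(version: str) -> tuple[str, str]:
    parts = [int(p) for p in version.split(".")]
    if len(parts) < 2:
        raise ValueError("~= requires at least two version components")
    upper = parts[:-1]  # drop the last component...
    upper[-1] += 1      # ...and bump the new last one
    return f">={version}", "<" + ".".join(str(p) for p in upper)

print(compatible_release_bounds("1.2"))    # ('>=1.2', '<2')
print(compatible_release_bounds("1.2.3"))  # ('>=1.2.3', '<1.3')
```

These two outputs match the equivalences given below: `foo ~=1.2` is `foo >=1.2,<2`, and `foo ~=1.2.3` is `foo >=1.2.3,<1.3`.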
A dependency specifier is composed of, in order:

- The dependency name
- The extras you want (optional)
- The version specifier
- An environment marker (optional)

The version specifiers are comma separated and added together, e.g., `foo >=1.2.3,<2,!=1.4.0` is interpreted as "a version of `foo` that's at least 1.2.3, but less than 2, and not 1.4.0".

Specifiers are padded with trailing zeros if required, so `foo ==2` matches `foo` 2.0.0, too.

A star can be used for the last digit with equals, e.g., `foo ==2.1.*` will accept any release from the 2.1 series. Similarly, `~=` matches where the last digit is equal or higher, e.g., `foo ~=1.2` is equal to `foo >=1.2,<2`, and `foo ~=1.2.3` is equal to `foo >=1.2.3,<1.3`.

Extras are comma-separated in square brackets between the name and the version, e.g., `pandas[excel,plot] ==2.2`. Whitespace between extra names is ignored.

Some dependencies are only required in specific environments, e.g., a specific Python version or operating system. For example, to install the `importlib-metadata` backport for the `importlib.metadata` module, use `importlib-metadata >=7.1.0,<8; python_version < '3.10'`. To install `colorama` on Windows (but omit it on other platforms), use `colorama >=0.4.6,<5; platform_system == "Windows"`.

Markers are combined with `and`, `or`, and parentheses, e.g., `aiohttp >=3.7.4,<4; (sys_platform != 'win32' or implementation_name != 'pypy') and python_version >= '3.10'`. Note that versions within markers must be quoted, while versions _outside_ of markers must _not_ be quoted.

# Projects

Projects help manage Python code spanning multiple files.

!!! tip

    Looking for an introduction to creating a project with uv? See the [projects guide](../../guides/projects.md) first.

Working on projects is a core part of the uv experience.
Learn more about using projects:

- [Understanding project structure and files](./layout.md)
- [Creating new projects](./init.md)
- [Managing project dependencies](./dependencies.md)
- [Running commands and scripts in a project](./run.md)
- [Using lockfiles and syncing the environment](./sync.md)
- [Configuring the project for advanced use cases](./config.md)
- [Building distributions to publish a project](./build.md)
- [Using workspaces to work on multiple projects at once](./workspaces.md)

# Creating projects

uv supports creating a project with `uv init`.

When creating projects, uv supports two basic templates: [**applications**](#applications) and [**libraries**](#libraries). By default, uv will create a project for an application. The `--lib` flag can be used to create a project for a library instead.

## Target directory

uv will create a project in the working directory, or, in a target directory by providing a name, e.g., `uv init foo`. If there's already a project in the target directory, i.e., if there's a `pyproject.toml`, uv will exit with an error.

## Applications

Application projects are suitable for web servers, scripts, and command-line interfaces.

Applications are the default target for `uv init`, but can also be specified with the `--app` flag.

```console
$ uv init example-app
```

The project includes a `pyproject.toml`, a sample file (`main.py`), a readme, and a Python version pin file (`.python-version`).

```console
$ tree example-app
example-app
├── .python-version
├── README.md
├── main.py
└── pyproject.toml
```

!!! note

    Prior to v0.6.0, uv created a file named `hello.py` instead of `main.py`.

The `pyproject.toml` includes basic metadata.
It does not include a build system, so it is not a [package](./config.md#project-packaging) and will not be installed into the environment:

```toml title="pyproject.toml"
[project]
name = "example-app"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []
```

The sample file defines a `main` function with some standard boilerplate:

```python title="main.py"
def main():
    print("Hello from example-app!")


if __name__ == "__main__":
    main()
```

Python files can be executed with `uv run`:

```console
$ cd example-app
$ uv run main.py
Hello from example-app!
```

## Packaged applications

Many use-cases require a [package](./config.md#project-packaging), for example, if you are creating a command-line interface that will be published to PyPI or if you want to define tests in a dedicated directory.

The `--package` flag can be used to create a packaged application:

```console
$ uv init --package example-pkg
```

The source code is moved into a `src` directory with a module directory and an `__init__.py` file:

```console
$ tree example-pkg
example-pkg
├── .python-version
├── README.md
├── pyproject.toml
└── src
    └── example_pkg
        └── __init__.py
```

A [build system](./config.md#build-systems) is defined, so the project will be installed into the environment:

```toml title="pyproject.toml" hl_lines="12-14"
[project]
name = "example-pkg"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

[project.scripts]
example-pkg = "example_pkg:main"

[build-system]
requires = ["uv_build>=0.8.4,<0.9.0"]
build-backend = "uv_build"
```

!!! tip

    The `--build-backend` option can be used to request an alternative build system.
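The `[project.scripts]` entry shown above, `example-pkg = "example_pkg:main"`, follows the standard `module:attribute` entry-point convention. A sketch of the resolution step an installed console script performs (using `json:dumps` here purely because it is importable everywhere, not because uv involves `json`):

```python
import importlib

# Resolve a "module:attribute" entry-point string: import the module,
# then look up the named attribute, which is expected to be callable.
def load_entry_point(spec: str):
    module_name, _, attr = spec.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)

func = load_entry_point("json:dumps")
print(func({"hello": "world"}))  # {"hello": "world"}
```

When the environment is synced, the installer generates a small `example-pkg` executable that performs exactly this lookup and then calls the resulting function.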
A [command](./config.md#entry-points) definition is included:

```toml title="pyproject.toml" hl_lines="9 10"
[project]
name = "example-pkg"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

[project.scripts]
example-pkg = "example_pkg:main"

[build-system]
requires = ["uv_build>=0.8.4,<0.9.0"]
build-backend = "uv_build"
```

The command can be executed with `uv run`:

```console
$ cd example-pkg
$ uv run example-pkg
Hello from example-pkg!
```

## Libraries

A library provides functions and objects for other projects to consume. Libraries are intended to be built and distributed, e.g., by uploading them to PyPI.

Libraries can be created by using the `--lib` flag:

```console
$ uv init --lib example-lib
```

!!! note

    Using `--lib` implies `--package`. Libraries always require a packaged project.

As with a [packaged application](#packaged-applications), a `src` layout is used. A `py.typed` marker is included to indicate to consumers that types can be read from the library:

```console
$ tree example-lib
example-lib
├── .python-version
├── README.md
├── pyproject.toml
└── src
    └── example_lib
        ├── py.typed
        └── __init__.py
```

!!! note

    A `src` layout is particularly valuable when developing libraries. It ensures that the library is isolated from any `python` invocations in the project root and that distributed library code is well separated from the rest of the project source.

A [build system](./config.md#build-systems) is defined, so the project will be installed into the environment:

```toml title="pyproject.toml" hl_lines="12-14"
[project]
name = "example-lib"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

[build-system]
requires = ["uv_build>=0.8.4,<0.9.0"]
build-backend = "uv_build"
```

!!! tip

    You can select a different build backend template by using `--build-backend` with `hatchling`, `uv_build`, `flit-core`, `pdm-backend`, `setuptools`, `maturin`, or `scikit-build-core`. An alternative backend is required if you want to create a [library with extension modules](#projects-with-extension-modules).

The created module defines a simple API function:

```python title="__init__.py"
def hello() -> str:
    return "Hello from example-lib!"
```

And you can import and execute it using `uv run`:

```console
$ cd example-lib
$ uv run python -c "import example_lib; print(example_lib.hello())"
Hello from example-lib!
```

## Projects with extension modules

Most Python projects are "pure Python", meaning they do not define modules in other languages like C, C++, FORTRAN, or Rust. However, projects with extension modules are often used for performance-sensitive code.

Creating a project with an extension module requires choosing an alternative build system. uv supports creating projects with the following build systems that support building extension modules:

- [`maturin`](https://www.maturin.rs) for projects with Rust
- [`scikit-build-core`](https://github.com/scikit-build/scikit-build-core) for projects with C, C++, FORTRAN, or Cython

Specify the build system with the `--build-backend` flag:

```console
$ uv init --build-backend maturin example-ext
```

!!! note

    Using `--build-backend` implies `--package`.

The project contains a `Cargo.toml` and a `lib.rs` file in addition to the typical Python project files:

```console
$ tree example-ext
example-ext
├── .python-version
├── Cargo.toml
├── README.md
├── pyproject.toml
└── src
    ├── lib.rs
    └── example_ext
        ├── __init__.py
        └── _core.pyi
```

!!! note

    If using `scikit-build-core`, you'll see CMake configuration and a `main.cpp` file instead.
The Rust library defines a simple function:

```rust title="src/lib.rs"
use pyo3::prelude::*;

#[pyfunction]
fn hello_from_bin() -> String {
    "Hello from example-ext!".to_string()
}

#[pymodule]
fn _core(m: &Bound<'_, PyModule>) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(hello_from_bin, m)?)?;
    Ok(())
}
```

And the Python module imports it:

```python title="src/example_ext/__init__.py"
from example_ext._core import hello_from_bin


def main() -> None:
    print(hello_from_bin())
```

The command can be executed with `uv run`:

```console
$ cd example-ext
$ uv run example-ext
Hello from example-ext!
```

!!! important

    Changes to the extension code in `lib.rs` or `main.cpp` will require running with `--reinstall` to rebuild them.

## Creating a minimal project

If you only want to create a `pyproject.toml`, use the `--bare` option:

```console
$ uv init example --bare
```

uv will skip creating a Python version pin file, a README, and any source directories or files. Additionally, uv will not initialize a version control system (i.e., `git`).

```console
$ tree example
example
└── pyproject.toml
```

uv will also not add extra metadata to the `pyproject.toml`, such as the `description` or `authors`.

```toml
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = []
```

The `--bare` option can be used with other options like `--lib` or `--build-backend`; in these cases, uv will still configure a build system but will not create the expected file structure.

When `--bare` is used, additional features can still be used opt-in:

```console
$ uv init example --bare --description "Hello world" --author-from git --vcs git --python-pin
```

# Project structure and files

## The `pyproject.toml`

Python project metadata is defined in a [`pyproject.toml`](https://packaging.python.org/en/latest/guides/writing-pyproject-toml/) file. uv requires this file to identify the root directory of a project.

!!! tip

    `uv init` can be used to create a new project.
See [Creating projects](./init.md) for details. A minimal project definition includes a name and version: ```toml title="pyproject.toml" [project] name = "example" version = "0.1.0" ``` Additional project metadata and configuration includes: - [Python version requirement](./config.md#python-version-requirement) - [Dependencies](./dependencies.md) - [Build system](./config.md#build-systems) - [Entry points (commands)](./config.md#entry-points) ## The project environment When working on a project with uv, uv will create a virtual environment as needed. While some uv commands will create a temporary environment (e.g., `uv run --isolated`), uv also manages a persistent environment with the project and its dependencies in a `.venv` directory next to the `pyproject.toml`. It is stored inside the project to make it easy for editors to find — they need the environment to give code completions and type hints. It is not recommended to include the `.venv` directory in version control; it is automatically excluded from `git` with an internal `.gitignore` file. To run a command in the project environment, use `uv run`. Alternatively the project environment can be activated as normal for a virtual environment. When `uv run` is invoked, it will create the project environment if it does not exist yet or ensure it is up-to-date if it exists. The project environment can also be explicitly created with `uv sync`. See the [locking and syncing](./sync.md) documentation for details. It is _not_ recommended to modify the project environment manually, e.g., with `uv pip install`. For project dependencies, use `uv add` to add a package to the environment. For one-off requirements, use [`uvx`](../../guides/tools.md) or [`uv run --with`](./run.md#requesting-additional-dependencies). !!! tip If you don't want uv to manage the project environment, set [`managed = false`](../../reference/settings.md#managed) to disable automatic locking and syncing of the project. 
For example: ```toml title="pyproject.toml" [tool.uv] managed = false ``` ## The lockfile uv creates a `uv.lock` file next to the `pyproject.toml`. `uv.lock` is a _universal_ or _cross-platform_ lockfile that captures the packages that would be installed across all possible Python markers such as operating system, architecture, and Python version. Unlike the `pyproject.toml`, which is used to specify the broad requirements of your project, the lockfile contains the exact resolved versions that are installed in the project environment. This file should be checked into version control, allowing for consistent and reproducible installations across machines. A lockfile ensures that developers working on the project are using a consistent set of package versions. Additionally, it ensures when deploying the project as an application that the exact set of used package versions is known. The lockfile is [automatically created and updated](./sync.md#automatic-lock-and-sync) during uv invocations that use the project environment, i.e., `uv sync` and `uv run`. The lockfile may also be explicitly updated using `uv lock`. `uv.lock` is a human-readable TOML file but is managed by uv and should not be edited manually. The `uv.lock` format is specific to uv and not usable by other tools. ### `pylock.toml` In [PEP 751](https://peps.python.org/pep-0751/), Python standardized a new resolution file format, `pylock.toml`. `pylock.toml` is a resolution output format intended to replace `requirements.txt` (e.g., in the context of `uv pip compile`, whereby a "locked" `requirements.txt` file is generated from a set of input requirements). `pylock.toml` is standardized and tool-agnostic, such that in the future, `pylock.toml` files generated by uv could be installed by other tools, and vice versa. Some of uv's functionality cannot be expressed in the `pylock.toml` format; as such, uv will continue to use the `uv.lock` format within the project interface. 
However, uv supports `pylock.toml` as an export target and in the `uv pip` CLI. For example:

- To export a `uv.lock` to the `pylock.toml` format, run: `uv export -o pylock.toml`
- To generate a `pylock.toml` file from a set of requirements, run: `uv pip compile -o pylock.toml -r requirements.in`
- To install from a `pylock.toml` file, run: `uv pip sync pylock.toml` or `uv pip install -r pylock.toml`

# Running commands in projects

When working on a project, the project is installed into the virtual environment at `.venv`. This environment is isolated from the current shell by default, so invocations that require the project, e.g., `python -c "import example"`, will fail. Instead, use `uv run` to run commands in the project environment:

```console
$ uv run python -c "import example"
```

When using `run`, uv will ensure that the project environment is up-to-date before running the given command.

The given command can be provided by the project environment or exist outside of it, e.g.:

```console
$ # Presuming the project provides `example-cli`
$ uv run example-cli foo

$ # Running a `bash` script that requires the project to be available
$ uv run bash scripts/foo.sh
```

## Requesting additional dependencies

Additional dependencies or different versions of dependencies can be requested per invocation.

The `--with` option is used to include a dependency for the invocation, e.g., to request a different version of `httpx`:

```console
$ uv run --with httpx==0.26.0 python -c "import httpx; print(httpx.__version__)"
0.26.0
$ uv run --with httpx==0.25.0 python -c "import httpx; print(httpx.__version__)"
0.25.0
```

The requested version will be respected regardless of the project's requirements. For example, even if the project requires `httpx==0.24.0`, the output above would be the same.

## Running scripts

Scripts that declare inline metadata are automatically executed in environments isolated from the project.
See the [scripts guide](../../guides/scripts.md#declaring-script-dependencies) for more details.

For example, given a script:

```python title="example.py"
# /// script
# dependencies = [
#   "httpx",
# ]
# ///

import httpx

resp = httpx.get("https://peps.python.org/api/peps.json")
data = resp.json()
print([(k, v["title"]) for k, v in data.items()][:10])
```

The invocation `uv run example.py` would run _isolated_ from the project, with only the listed dependencies available.

## Legacy Windows Scripts

Support is provided for [legacy setuptools scripts](https://packaging.python.org/en/latest/guides/distributing-packages-using-setuptools/#scripts). These types of scripts are additional files installed by setuptools in `.venv\Scripts`.

Currently only legacy scripts with the `.ps1`, `.cmd`, and `.bat` extensions are supported. For example, the following runs a Command Prompt script:

```console
$ uv run --with nuitka==2.6.7 -- nuitka.cmd --version
```

In addition, you don't need to specify the extension. `uv` will automatically look for files ending in `.ps1`, `.cmd`, and `.bat`, in that order, on your behalf:

```console
$ uv run --with nuitka==2.6.7 -- nuitka --version
```

## Signal handling

uv does not cede control of the process to the spawned command in order to provide better error messages on failure. Consequently, uv is responsible for forwarding some signals to the child process the requested command runs in.

On Unix systems, uv will forward SIGINT and SIGTERM to the child process. Since terminals send SIGINT to the foreground process group on Ctrl-C, uv will only forward a SIGINT to the child process if it is sent more than once or the child process group differs from uv's.

On Windows, these concepts do not apply and uv ignores Ctrl-C events, deferring handling to the child process so it can exit cleanly.

# Locking and syncing

Locking is the process of resolving your project's dependencies into a [lockfile](./layout.md#the-lockfile).
Syncing is the process of installing a subset of packages from the lockfile into the [project environment](./layout.md#the-project-environment). ## Automatic lock and sync Locking and syncing are _automatic_ in uv. For example, when `uv run` is used, the project is locked and synced before invoking the requested command. This ensures the project environment is always up-to-date. Similarly, commands which read the lockfile, such as `uv tree`, will automatically update it before running. To disable automatic locking, use the `--locked` option: ```console $ uv run --locked ... ``` If the lockfile is not up-to-date, uv will raise an error instead of updating the lockfile. To use the lockfile without checking if it is up-to-date, use the `--frozen` option: ```console $ uv run --frozen ... ``` Similarly, to run a command without checking if the environment is up-to-date, use the `--no-sync` option: ```console $ uv run --no-sync ... ``` ## Checking if the lockfile is up-to-date When considering if the lockfile is up-to-date, uv will check if it matches the project metadata. For example, if you add a dependency to your `pyproject.toml`, the lockfile will be considered outdated. Similarly, if you change the version constraints for a dependency such that the locked version is excluded, the lockfile will be considered outdated. However, if you change the version constraints such that the existing locked version is still included, the lockfile will still be considered up-to-date. You can check if the lockfile is up-to-date by passing the `--check` flag to `uv lock`: ```console $ uv lock --check ``` This is equivalent to the `--locked` flag for other commands. !!! important uv will not consider lockfiles outdated when new versions of packages are released — the lockfile needs to be explicitly updated if you want to upgrade dependencies. See the documentation on [upgrading locked package versions](#upgrading-locked-package-versions) for details. 
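The up-to-date rule described above can be sketched with simple version tuples. This is a deliberately simplified model (uv's real check compares the full project metadata, not a single constraint):

```python
# Simplified model: a lockfile stays up-to-date as long as the locked version
# still satisfies the (possibly edited) constraint in pyproject.toml.
def lockfile_outdated(locked_version: tuple, lower: tuple, upper: tuple) -> bool:
    """True if the locked version falls outside the half-open range [lower, upper)."""
    return not (lower <= locked_version < upper)


# Constraint unchanged or widened: locked 0.27.0 is still allowed, so up-to-date.
print(lockfile_outdated((0, 27, 0), lower=(0, 27, 0), upper=(1, 0, 0)))  # False
# Lower bound raised past the locked version: the lockfile is now outdated.
print(lockfile_outdated((0, 27, 0), lower=(0, 28, 0), upper=(1, 0, 0)))  # True
```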
## Creating the lockfile While the lockfile is created [automatically](#automatic-lock-and-sync), the lockfile may also be explicitly created or updated using `uv lock`: ```console $ uv lock ``` ## Syncing the environment While the environment is synced [automatically](#automatic-lock-and-sync), it may also be explicitly synced using `uv sync`: ```console $ uv sync ``` Syncing the environment manually is especially useful for ensuring your editor has the correct versions of dependencies. ### Editable installation When the environment is synced, uv will install the project (and other workspace members) as _editable_ packages, such that re-syncing is not necessary for changes to be reflected in the environment. To opt-out of this behavior, use the `--no-editable` option. !!! note If the project does not define a build system, it will not be installed. See the [build systems](./config.md#build-systems) documentation for details. ### Retaining extraneous packages Syncing is "exact" by default, which means it will remove any packages that are not present in the lockfile. To retain extraneous packages, use the `--inexact` option: ```console $ uv sync --inexact ``` ### Syncing optional dependencies uv reads optional dependencies from the `[project.optional-dependencies]` table. These are frequently referred to as "extras". uv does not sync extras by default. Use the `--extra` option to include an extra. ```console $ uv sync --extra foo ``` To quickly enable all extras, use the `--all-extras` option. See the [optional dependencies](./dependencies.md#optional-dependencies) documentation for details on how to manage optional dependencies. ### Syncing development dependencies uv reads development dependencies from the `[dependency-groups]` table (as defined in [PEP 735](https://peps.python.org/pep-0735/)). The `dev` group is special-cased and synced by default. See the [default groups](./dependencies.md#default-groups) documentation for details on changing the defaults. 
The `--no-dev` flag can be used to exclude the `dev` group. The `--only-dev` flag can be used to install the `dev` group _without_ the project and its dependencies.

Additional groups can be included or excluded with the `--all-groups`, `--no-default-groups`, `--group <name>`, `--only-group <name>`, and `--no-group <name>` options. The semantics of `--only-group` are the same as `--only-dev`: the project and its dependencies will not be included. However, `--only-group` will also exclude default groups.

Group exclusions always take precedence over inclusions, so given the command:

```console
$ uv sync --no-group foo --group foo
```

The `foo` group would not be installed.

See the [development dependencies](./dependencies.md#development-dependencies) documentation for details on how to manage development dependencies.

## Upgrading locked package versions

With an existing `uv.lock` file, uv will prefer the previously locked versions of packages when running `uv sync` and `uv lock`. Package versions will only change if the project's dependency constraints exclude the previous, locked version.

To upgrade all packages:

```console
$ uv lock --upgrade
```

To upgrade a single package to the latest version, while retaining the locked versions of all other packages:

```console
$ uv lock --upgrade-package <package>
```

To upgrade a single package to a specific version:

```console
$ uv lock --upgrade-package <package>==<version>
```

In all cases, upgrades are limited to the project's dependency constraints. For example, if the project defines an upper bound for a package then an upgrade will not go beyond that version.

!!! note

    uv applies similar logic to Git dependencies. For example, if a Git dependency references the `main` branch, uv will prefer the locked commit SHA in an existing `uv.lock` file over the latest commit on the `main` branch, unless the `--upgrade` or `--upgrade-package` flags are used.

These flags can also be provided to `uv sync` or `uv run` to update the lockfile _and_ the environment.
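The preference described above (keep the locked version unless an upgrade is requested or the constraints exclude it) can be modeled with a small sketch. The version triples and bounds here are illustrative, not uv's actual data structures:

```python
# Illustrative model of locked-version preference during resolution.
def select_version(available, lower, upper, locked=None, upgrade=False):
    """Pick a version from `available` within [lower, upper)."""
    valid = [version for version in available if lower <= version < upper]
    if locked in valid and not upgrade:
        return locked  # prefer the previously locked version
    return max(valid)  # upgrades are still limited by the constraints


available = [(0, 25, 0), (0, 26, 0), (0, 27, 0)]
# The locked version is kept even though a newer one exists:
print(select_version(available, (0, 25, 0), (1, 0, 0), locked=(0, 26, 0)))
# With --upgrade, the newest version within the constraints is chosen:
print(select_version(available, (0, 25, 0), (1, 0, 0), locked=(0, 26, 0), upgrade=True))
```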
## Exporting the lockfile

If you need to integrate uv with other tools or workflows, you can export `uv.lock` to the `requirements.txt` format with `uv export --format requirements-txt`. The generated `requirements.txt` file can then be installed via `uv pip install`, or with other tools like `pip`.

In general, we recommend against using both a `uv.lock` and a `requirements.txt` file. If you find yourself exporting a `uv.lock` file, consider opening an issue to discuss your use case.

## Partial installations

Sometimes it's helpful to perform installations in multiple steps, e.g., for optimal layer caching while building a Docker image. `uv sync` has several flags for this purpose:

- `--no-install-project`: Do not install the current project
- `--no-install-workspace`: Do not install any workspace members, including the root project
- `--no-install-package <package>`: Do not install the given package(s)

When these options are used, all the dependencies of the target are still installed. For example, `--no-install-project` will omit the _project_ but not any of its dependencies.

If used improperly, these flags can result in a broken environment since a package can be missing its dependencies.

# Using workspaces

Inspired by the [Cargo](https://doc.rust-lang.org/cargo/reference/workspaces.html) concept of the same name, a workspace is "a collection of one or more packages, called _workspace members_, that are managed together."

Workspaces organize large codebases by splitting them into multiple packages with common dependencies. Think: a FastAPI-based web application, alongside a series of libraries that are versioned and maintained as separate Python packages, all in the same Git repository.

In a workspace, each package defines its own `pyproject.toml`, but the workspace shares a single lockfile, ensuring that the workspace operates with a consistent set of dependencies.
As such, `uv lock` operates on the entire workspace at once, while `uv run` and `uv sync` operate on the workspace root by default, though both accept a `--package` argument, allowing you to run a command in a particular workspace member from any workspace directory.

## Getting started

To create a workspace, add a `tool.uv.workspace` table to a `pyproject.toml`, which will implicitly create a workspace rooted at that package.

!!! tip

    By default, running `uv init` inside an existing package will add the newly created member to the workspace, creating a `tool.uv.workspace` table in the workspace root if it doesn't already exist.

In defining a workspace, you must specify the `members` (required) and `exclude` (optional) keys, which direct the workspace to include or exclude specific directories as members respectively, and accept lists of globs:

```toml title="pyproject.toml"
[project]
name = "albatross"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["bird-feeder", "tqdm>=4,<5"]

[tool.uv.sources]
bird-feeder = { workspace = true }

[tool.uv.workspace]
members = ["packages/*"]
exclude = ["packages/seeds"]
```

Every directory included by the `members` globs (and not excluded by the `exclude` globs) must contain a `pyproject.toml` file. However, workspace members can be _either_ [applications](./init.md#applications) or [libraries](./init.md#libraries); both are supported in the workspace context.

Every workspace needs a root, which is _also_ a workspace member. In the above example, `albatross` is the workspace root, and the workspace members include all projects under the `packages` directory, except `seeds`.

By default, `uv run` and `uv sync` operate on the workspace root. For example, in the above example, `uv run` and `uv run --package albatross` would be equivalent, while `uv run --package bird-feeder` would run the command in the `bird-feeder` package.
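The `members`/`exclude` resolution described above can be sketched as a small glob walk. This is an illustrative model, not uv's implementation, and it assumes (as the docs state) that every member directory contains a `pyproject.toml`:

```python
import fnmatch
import tempfile
from pathlib import Path


def resolve_members(root: Path, members: list[str], exclude: list[str]) -> list[str]:
    """Expand `members` globs under `root`, dropping `exclude` matches."""
    found = []
    for pattern in members:
        for path in sorted(root.glob(pattern)):
            rel = path.relative_to(root).as_posix()
            if any(fnmatch.fnmatch(rel, pat) for pat in exclude):
                continue  # excluded directories are never members
            if (path / "pyproject.toml").is_file():
                found.append(rel)
    return found


# Recreate the example layout in a throwaway directory:
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    for name in ("bird-feeder", "seeds"):
        (root / "packages" / name).mkdir(parents=True)
        (root / "packages" / name / "pyproject.toml").touch()
    print(resolve_members(root, ["packages/*"], ["packages/seeds"]))
    # ['packages/bird-feeder']
```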
## Workspace sources Within a workspace, dependencies on workspace members are facilitated via [`tool.uv.sources`](./dependencies.md), as in: ```toml title="pyproject.toml" [project] name = "albatross" version = "0.1.0" requires-python = ">=3.12" dependencies = ["bird-feeder", "tqdm>=4,<5"] [tool.uv.sources] bird-feeder = { workspace = true } [tool.uv.workspace] members = ["packages/*"] [build-system] requires = ["uv_build>=0.8.4,<0.9.0"] build-backend = "uv_build" ``` In this example, the `albatross` project depends on the `bird-feeder` project, which is a member of the workspace. The `workspace = true` key-value pair in the `tool.uv.sources` table indicates the `bird-feeder` dependency should be provided by the workspace, rather than fetched from PyPI or another registry. !!! note Dependencies between workspace members are editable. Any `tool.uv.sources` definitions in the workspace root apply to all members, unless overridden in the `tool.uv.sources` of a specific member. For example, given the following `pyproject.toml`: ```toml title="pyproject.toml" [project] name = "albatross" version = "0.1.0" requires-python = ">=3.12" dependencies = ["bird-feeder", "tqdm>=4,<5"] [tool.uv.sources] bird-feeder = { workspace = true } tqdm = { git = "https://github.com/tqdm/tqdm" } [tool.uv.workspace] members = ["packages/*"] [build-system] requires = ["uv_build>=0.8.4,<0.9.0"] build-backend = "uv_build" ``` Every workspace member would, by default, install `tqdm` from GitHub, unless a specific member overrides the `tqdm` entry in its own `tool.uv.sources` table. !!! note If a workspace member provides `tool.uv.sources` for some dependency, it will ignore any `tool.uv.sources` for the same dependency in the workspace root, even if the member's source is limited by a [marker](dependencies.md#platform-specific-sources) that doesn't match the current platform. 
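The precedence rule above (a member's own `tool.uv.sources` entry wins over the workspace root's, per dependency) behaves like a simple per-key merge. A minimal sketch, with hypothetical source tables:

```python
def effective_sources(root_sources: dict, member_sources: dict) -> dict:
    """Merge source tables; the member's entry for a dependency wins."""
    merged = dict(root_sources)
    merged.update(member_sources)  # per-dependency override, as described above
    return merged


# Hypothetical tables: the root pins tqdm to Git, a member overrides it.
root = {"tqdm": {"git": "https://github.com/tqdm/tqdm"}}
member = {"tqdm": {"path": "../vendored/tqdm"}}
print(effective_sources(root, member))  # the member's tqdm entry wins
```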
## Workspace layouts The most common workspace layout can be thought of as a root project with a series of accompanying libraries. For example, continuing with the above example, this workspace has an explicit root at `albatross`, with two libraries (`bird-feeder` and `seeds`) in the `packages` directory: ```text albatross ├── packages │ ├── bird-feeder │ │ ├── pyproject.toml │ │ └── src │ │ └── bird_feeder │ │ ├── __init__.py │ │ └── foo.py │ └── seeds │ ├── pyproject.toml │ └── src │ └── seeds │ ├── __init__.py │ └── bar.py ├── pyproject.toml ├── README.md ├── uv.lock └── src └── albatross └── main.py ``` Since `seeds` was excluded in the `pyproject.toml`, the workspace has two members total: `albatross` (the root) and `bird-feeder`. ## When (not) to use workspaces Workspaces are intended to facilitate the development of multiple interconnected packages within a single repository. As a codebase grows in complexity, it can be helpful to split it into smaller, composable packages, each with their own dependencies and version constraints. Workspaces help enforce isolation and separation of concerns. For example, in uv, we have separate packages for the core library and the command-line interface, enabling us to test the core library independently of the CLI, and vice versa. Other common use cases for workspaces include: - A library with a performance-critical subroutine implemented in an extension module (Rust, C++, etc.). - A library with a plugin system, where each plugin is a separate workspace package with a dependency on the root. Workspaces are _not_ suited for cases in which members have conflicting requirements, or desire a separate virtual environment for each member. In this case, path dependencies are often preferable. 
For example, rather than grouping `albatross` and its members in a workspace, you can always define each package as its own independent project, with inter-package dependencies defined as path dependencies in `tool.uv.sources`: ```toml title="pyproject.toml" [project] name = "albatross" version = "0.1.0" requires-python = ">=3.12" dependencies = ["bird-feeder", "tqdm>=4,<5"] [tool.uv.sources] bird-feeder = { path = "packages/bird-feeder" } [build-system] requires = ["uv_build>=0.8.4,<0.9.0"] build-backend = "uv_build" ``` This approach conveys many of the same benefits, but allows for more fine-grained control over dependency resolution and virtual environment management (with the downside that `uv run --package` is no longer available; instead, commands must be run from the relevant package directory). Finally, uv's workspaces enforce a single `requires-python` for the entire workspace, taking the intersection of all members' `requires-python` values. If you need to support testing a given member on a Python version that isn't supported by the rest of the workspace, you may need to use `uv pip` to install that member in a separate virtual environment. !!! note As Python does not provide dependency isolation, uv can't ensure that a package uses its declared dependencies and nothing else. For workspaces specifically, uv can't ensure that packages don't import dependencies declared by another workspace member. # Python versions A Python version is composed of a Python interpreter (i.e. the `python` executable), the standard library, and other supporting files. ## Managed and system Python installations Since it is common for a system to have an existing Python installation, uv supports [discovering](#discovery-of-python-versions) Python versions. However, uv also supports [installing Python versions](#installing-a-python-version) itself. 
To distinguish between these two types of Python installations, uv refers to Python versions it installs as _managed_ Python installations and all other Python installations as _system_ Python installations.

!!! note

    uv does not distinguish between Python versions installed by the operating system vs those installed and managed by other tools. For example, if a Python installation is managed with `pyenv`, it would still be considered a _system_ Python version in uv.

## Requesting a version

A specific Python version can be requested with the `--python` flag in most uv commands. For example, when creating a virtual environment:

```console
$ uv venv --python 3.11.6
```

uv will ensure that Python 3.11.6 is available — downloading and installing it if necessary — then create the virtual environment with it.

The following Python version request formats are supported:

- `<version>` (e.g., `3`, `3.12`, `3.12.3`)
- `<version-specifier>` (e.g., `>=3.12,<3.13`)
- `<implementation>` (e.g., `cpython` or `cp`)
- `<implementation>@<version>` (e.g., `cpython@3.12`)
- `<implementation><version>` (e.g., `cpython3.12` or `cp312`)
- `<implementation><version-specifier>` (e.g., `cpython>=3.12,<3.13`)
- `<implementation>-<version>-<os>-<arch>-<libc>` (e.g., `cpython-3.12.3-macos-aarch64-none`)

Additionally, a specific system Python interpreter can be requested with:

- `<executable-path>` (e.g., `/opt/homebrew/bin/python3`)
- `<executable-name>` (e.g., `mypython3`)
- `<directory>` (e.g., `/some/environment/`)

By default, uv will automatically download Python versions if they cannot be found on the system. This behavior can be [disabled with the `python-downloads` option](#disabling-automatic-python-downloads).

### Python version files

The `.python-version` file can be used to create a default Python version request. uv searches for a `.python-version` file in the working directory and each of its parents. If none is found, uv will check the user-level configuration directory. Any of the request formats described above can be used, though use of a version number is recommended for interoperability with other tools.
A `.python-version` file can be created in the current directory with the [`uv python pin`](../reference/cli.md/#uv-python-pin) command. A global `.python-version` file can be created in the user configuration directory with the [`uv python pin --global`](../reference/cli.md/#uv-python-pin) command.

Discovery of `.python-version` files can be disabled with `--no-config`.

uv will not search for `.python-version` files beyond project or workspace boundaries (except the user configuration directory).

## Installing a Python version

uv bundles a list of downloadable CPython and PyPy distributions for macOS, Linux, and Windows.

!!! tip

    By default, Python versions are automatically downloaded as needed without using `uv python install`.

To install a Python version at a specific version:

```console
$ uv python install 3.12.3
```

To install the latest patch version:

```console
$ uv python install 3.12
```

To install a version that satisfies constraints:

```console
$ uv python install '>=3.8,<3.10'
```

To install multiple versions:

```console
$ uv python install 3.9 3.10 3.11
```

To install a specific implementation:

```console
$ uv python install pypy
```

All the [Python version request](#requesting-a-version) formats are supported except those that are used for requesting local interpreters such as a file path.

By default `uv python install` will verify that a managed Python version is installed or install the latest version. If a `.python-version` file is present, uv will install the Python version listed in the file. A project that requires multiple Python versions may define a `.python-versions` file. If present, uv will install all the Python versions listed in the file.

!!! important

    The available Python versions are frozen for each uv release. To install new Python versions, you may need to upgrade uv.
### Installing Python executables uv installs Python executables into your `PATH` by default, e.g., `uv python install 3.12` will install a Python executable into `~/.local/bin`, e.g., as `python3.12`. !!! tip If `~/.local/bin` is not in your `PATH`, you can add it with `uv tool update-shell`. To install `python` and `python3` executables, include the experimental `--default` option: ```console $ uv python install 3.12 --default ``` When installing Python executables, uv will only overwrite an existing executable if it is managed by uv — e.g., if `~/.local/bin/python3.12` exists already uv will not overwrite it without the `--force` flag. uv will update executables that it manages. However, it will prefer the latest patch version of each Python minor version by default. For example: ```console $ uv python install 3.12.7 # Adds `python3.12` to `~/.local/bin` $ uv python install 3.12.6 # Does not update `python3.12` $ uv python install 3.12.8 # Updates `python3.12` to point to 3.12.8 ``` ## Upgrading Python versions !!! important Support for upgrading Python versions is in _preview_. This means the behavior is experimental and subject to change. Upgrades are only supported for uv-managed Python versions. Upgrades are not currently supported for PyPy and GraalPy. uv allows transparently upgrading Python versions to the latest patch release, e.g., 3.13.4 to 3.13.5. uv does not allow transparently upgrading across minor Python versions, e.g., 3.12 to 3.13, because changing minor versions can affect dependency resolution. uv-managed Python versions can be upgraded to the latest supported patch release with the `python upgrade` command: To upgrade a Python version to the latest supported patch release: ```console $ uv python upgrade 3.12 ``` To upgrade all installed Python versions: ```console $ uv python upgrade ``` After an upgrade, uv will prefer the new version, but will retain the existing version as it may still be used by virtual environments. 
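The patch-preference behavior described above (the `python3.12` executable tracks the highest installed patch of its minor version) can be modeled with a short sketch; the version lists here are illustrative:

```python
def preferred_patch(installed: list[str], minor: str) -> str:
    """Return the highest installed patch release of the given minor version."""
    candidates = [
        version for version in installed if version.startswith(minor + ".")
    ]
    # Compare numerically, not lexically, so 3.12.10 sorts above 3.12.9.
    return max(candidates, key=lambda v: tuple(int(p) for p in v.split(".")))


installed = ["3.12.7", "3.12.6", "3.12.8", "3.11.9"]
print(preferred_patch(installed, "3.12"))  # 3.12.8
```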
If the Python version was installed with the `python-upgrade` [preview feature](./preview.md) enabled, e.g., `uv python install 3.12 --preview-features python-upgrade`, virtual environments using the Python version will be automatically upgraded to the new patch version. !!! note If the virtual environment was created _before_ opting in to the preview mode, it will not be included in the automatic upgrades. If a virtual environment was created with an explicitly requested patch version, e.g., `uv venv -p 3.10.8`, it will not be transparently upgraded to a new version. ### Minor version directories Automatic upgrades for virtual environments are implemented using a directory with the Python minor version, e.g.: ``` ~/.local/share/uv/python/cpython-3.12-macos-aarch64-none ``` which is a symbolic link (on Unix) or junction (on Windows) pointing to a specific patch version: ```console $ readlink ~/.local/share/uv/python/cpython-3.12-macos-aarch64-none ~/.local/share/uv/python/cpython-3.12.11-macos-aarch64-none ``` If this link is resolved by another tool, e.g., by canonicalizing the Python interpreter path, and used to create a virtual environment, it will not be automatically upgraded. ## Project Python versions uv will respect Python requirements defined in `requires-python` in the `pyproject.toml` file during project command invocations. The first Python version that is compatible with the requirement will be used, unless a version is otherwise requested, e.g., via a `.python-version` file or the `--python` flag. ## Viewing available Python versions To list installed and available Python versions: ```console $ uv python list ``` To filter the Python versions, provide a request, e.g., to show all Python 3.13 interpreters: ```console $ uv python list 3.13 ``` Or, to show all PyPy interpreters: ```console $ uv python list pypy ``` By default, downloads for other platforms and old patch versions are hidden. 
To view all versions: ```console $ uv python list --all-versions ``` To view Python versions for other platforms: ```console $ uv python list --all-platforms ``` To exclude downloads and only show installed Python versions: ```console $ uv python list --only-installed ``` See the [`uv python list`](../reference/cli.md#uv-python-list) reference for more details. ## Finding a Python executable To find a Python executable, use the `uv python find` command: ```console $ uv python find ``` By default, this will display the path to the first available Python executable. See the [discovery rules](#discovery-of-python-versions) for details about how executables are discovered. This interface also supports many [request formats](#requesting-a-version), e.g., to find a Python executable that has a version of 3.11 or newer: ```console $ uv python find '>=3.11' ``` By default, `uv python find` will include Python versions from virtual environments. If a `.venv` directory is found in the working directory or any of the parent directories or the `VIRTUAL_ENV` environment variable is set, it will take precedence over any Python executables on the `PATH`. To ignore virtual environments, use the `--system` flag: ```console $ uv python find --system ``` ## Discovery of Python versions When searching for a Python version, the following locations are checked: - Managed Python installations in the `UV_PYTHON_INSTALL_DIR`. - A Python interpreter on the `PATH` as `python`, `python3`, or `python3.x` on macOS and Linux, or `python.exe` on Windows. - On Windows, the Python interpreters in the Windows registry and Microsoft Store Python interpreters (see `py --list-paths`) that match the requested version. In some cases, uv allows using a Python version from a virtual environment. In this case, the virtual environment's interpreter will be checked for compatibility with the request before searching for an installation as described above. 
See the [pip-compatible virtual environment discovery](../pip/environments.md#discovery-of-python-environments) documentation for details. When performing discovery, non-executable files will be ignored. Each discovered executable is queried for metadata to ensure it meets the [requested Python version](#requesting-a-version). If the query fails, the executable will be skipped. If the executable satisfies the request, it is used without inspecting additional executables. When searching for a managed Python version, uv will prefer newer versions first. When searching for a system Python version, uv will use the first compatible version — not the newest version. If a Python version cannot be found on the system, uv will check for a compatible managed Python version download. ### Python pre-releases Python pre-releases will not be selected by default. Python pre-releases will only be used if there is no other available installation matching the request. For example, if only a pre-release version is available, it will be used, but otherwise a stable release version will be used. Similarly, if the path to a pre-release Python executable is provided, no other Python version will match the request, so the pre-release version will be used. If a pre-release Python version is available and matches the request, uv will not download a stable Python version instead. ## Disabling automatic Python downloads By default, uv will automatically download Python versions when needed. The [`python-downloads`](../reference/settings.md#python-downloads) option can be used to disable this behavior. By default, it is set to `automatic`; set to `manual` to only allow Python downloads during `uv python install`. !!! tip The `python-downloads` setting can be set in a [persistent configuration file](./configuration-files.md) to change the default behavior, or the `--no-python-downloads` flag can be passed to any uv command.
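For example, to only allow Python downloads during explicit installs, the `python-downloads` setting described above could be placed in a configuration file (a sketch):

```toml title="pyproject.toml"
[tool.uv]
python-downloads = "manual"
```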
## Requiring or disabling managed Python versions By default, uv will attempt to use Python versions found on the system and only download managed Python versions when necessary. To ignore system Python versions, and only use managed Python versions, use the `--managed-python` flag: ```console $ uv python list --managed-python ``` Similarly, to ignore managed Python versions and only use system Python versions, use the `--no-managed-python` flag: ```console $ uv python list --no-managed-python ``` To change uv's default behavior in a configuration file, use the [`python-preference` setting](#adjusting-python-version-preferences). ## Adjusting Python version preferences The [`python-preference`](../reference/settings.md#python-preference) setting determines whether to prefer using Python installations that are already present on the system, or those that are downloaded and installed by uv. By default, the `python-preference` is set to `managed` which prefers managed Python installations over system Python installations. However, system Python installations are still preferred over downloading a managed Python version. The following alternative options are available: - `only-managed`: Only use managed Python installations; never use system Python installations. Equivalent to `--managed-python`. - `system`: Prefer system Python installations over managed Python installations. - `only-system`: Only use system Python installations; never use managed Python installations. Equivalent to `--no-managed-python`. !!! note Automatic Python version downloads can be [disabled](#disabling-automatic-python-downloads) without changing the preference. ## Python implementation support uv supports the CPython, PyPy, and GraalPy Python implementations. If a Python implementation is not supported, uv will fail to discover its interpreter. 
The implementations may be requested with either the long or short name: - CPython: `cpython`, `cp` - PyPy: `pypy`, `pp` - GraalPy: `graalpy`, `gp` Implementation name requests are not case-sensitive. See the [Python version request](#requesting-a-version) documentation for more details on the supported formats. ## Managed Python distributions uv supports downloading and installing CPython and PyPy distributions. ### CPython distributions As Python does not publish official distributable CPython binaries, uv instead uses pre-built distributions from the Astral [`python-build-standalone`](https://github.com/astral-sh/python-build-standalone) project. `python-build-standalone` is also used in many other Python projects, like [Rye](https://github.com/astral-sh/rye), [Mise](https://mise.jdx.dev/lang/python.html), and [bazelbuild/rules_python](https://github.com/bazelbuild/rules_python). The uv Python distributions are self-contained, highly portable, and performant. While Python can be built from source, as in tools like `pyenv`, doing so requires preinstalled system dependencies, and creating optimized, performant builds (e.g., with PGO and LTO enabled) is very slow. These distributions have some behavior quirks, generally as a consequence of portability; see the [`python-build-standalone` quirks](https://gregoryszorc.com/docs/python-build-standalone/main/quirks.html) documentation for details. Additionally, some platforms may not be supported (e.g., distributions are not yet available for musl Linux on ARM). ### PyPy distributions PyPy distributions are provided by the PyPy project. ## Registration in the Windows registry On Windows, installation of managed Python versions will register them with the Windows registry as defined by [PEP 514](https://peps.python.org/pep-0514/).
After installation, the Python versions can be selected with the `py` launcher, e.g.: ```console $ uv python install 3.13.1 $ py -V:Astral/CPython3.13.1 ``` On uninstall, uv will remove the registry entry for the target version as well as any broken registry entries. # Resolution Resolution is the process of taking a list of requirements and converting them to a list of package versions that fulfill the requirements. Resolution requires recursively searching for compatible versions of packages, ensuring that the requested requirements are fulfilled and that the requirements of the requested packages are compatible. ## Dependencies Most projects and packages have dependencies. Dependencies are other packages that are necessary in order for the current package to work. A package defines its dependencies as _requirements_, roughly a combination of a package name and acceptable versions. The dependencies defined by the current project are called _direct dependencies_. The dependencies added by each dependency of the current project are called _indirect_ or _transitive dependencies_. !!! note See the [dependency specifiers page](https://packaging.python.org/en/latest/specifications/dependency-specifiers/) in the Python Packaging documentation for details about dependencies. ## Basic examples To help demonstrate the resolution process, consider the following dependencies: - The project depends on `foo` and `bar`. - `foo` has one version, 1.0.0: - `foo 1.0.0` depends on `lib>=1.0.0`. - `bar` has one version, 1.0.0: - `bar 1.0.0` depends on `lib>=2.0.0`. - `lib` has two versions, 1.0.0 and 2.0.0. Both versions have no dependencies. In this example, the resolver must find a set of package versions which satisfies the project requirements. Since there is only one version of both `foo` and `bar`, those will be used. The resolution must also include the transitive dependencies, so a version of `lib` must be chosen. 
`foo 1.0.0` allows all available versions of `lib`, but `bar 1.0.0` requires `lib>=2.0.0`, so `lib 2.0.0` must be used. In some resolutions, there may be more than one valid solution. Consider the following dependencies: - The project depends on `foo` and `bar`. - `foo` has two versions, 1.0.0 and 2.0.0: - `foo 1.0.0` has no dependencies. - `foo 2.0.0` depends on `lib==2.0.0`. - `bar` has two versions, 1.0.0 and 2.0.0: - `bar 1.0.0` has no dependencies. - `bar 2.0.0` depends on `lib==1.0.0`. - `lib` has two versions, 1.0.0 and 2.0.0. Both versions have no dependencies. In this example, some version of both `foo` and `bar` must be selected; however, determining which version requires considering the dependencies of each version of `foo` and `bar`. `foo 2.0.0` and `bar 2.0.0` cannot be installed together as they conflict on their required version of `lib`, so the resolver must select either `foo 1.0.0` (along with `bar 2.0.0`) or `bar 1.0.0` (along with `foo 2.0.0`). Both are valid solutions, and different resolution algorithms may yield either result. ## Platform markers Markers allow attaching an expression to requirements that indicates when the dependency should be used. For example, `bar ; python_version < "3.9"` indicates that `bar` should only be installed on Python 3.8 and earlier. Markers are used to adjust a package's dependencies based on the current environment or platform. For example, markers can be used to modify dependencies by operating system, CPU architecture, Python version, Python implementation, and more. !!! note See the [environment markers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/#environment-markers) section in the Python Packaging documentation for more details about markers. Markers are important for resolution because their values change the required dependencies.
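For example, markers can appear directly in a project's dependency declarations (a sketch; `bar` is the hypothetical package from the example above):

```toml title="pyproject.toml"
[project]
name = "example"
version = "0.1.0"
dependencies = [
    "bar ; python_version < '3.9'",
]
```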
Typically, Python package resolvers use the markers of the _current_ platform to determine which dependencies to use since the package is often being _installed_ on the current platform. However, for _locking_ dependencies this is problematic — the lockfile would only work for developers using the same platform the lockfile was created on. To solve this problem, platform-independent, or "universal" resolvers exist. uv supports both [platform-specific](#platform-specific-resolution) and [universal](#universal-resolution) resolution. ## Platform-specific resolution By default, uv's pip interface, i.e., [`uv pip compile`](../pip/compile.md), produces a resolution that is platform-specific, like `pip-tools`. There is no way to use platform-specific resolution in uv's project interface. uv also supports resolving for specific, alternate platforms and Python versions with the `--python-platform` and `--python-version` options. For example, if using Python 3.12 on macOS, `uv pip compile --python-platform linux --python-version 3.10 requirements.in` can be used to produce a resolution for Python 3.10 on Linux instead. Unlike universal resolution, during platform-specific resolution, the provided `--python-version` is the exact Python version to use, not a lower bound. !!! note Python's environment markers expose far more information about the current machine than can be expressed by a simple `--python-platform` argument. For example, the `platform_version` marker on macOS includes the time at which the kernel was built, which can (in theory) be encoded in package requirements. uv's resolver makes a best-effort attempt to generate a resolution that is compatible with any machine running on the target `--python-platform`, which should be sufficient for most use cases, but may lose fidelity for complex package and platform combinations. ## Universal resolution uv's lockfile (`uv.lock`) is created with a universal resolution and is portable across platforms.
This ensures that dependencies are locked for everyone working on the project, regardless of operating system, architecture, and Python version. The uv lockfile is created and modified by [project](../concepts/projects/index.md) commands such as `uv lock`, `uv sync`, and `uv add`. Universal resolution is also available in uv's pip interface, i.e., [`uv pip compile`](../pip/compile.md), with the `--universal` flag. The resulting requirements file will contain markers to indicate which platform each dependency is relevant for. During universal resolution, a package may be listed multiple times with different versions or URLs if different versions are needed for different platforms — the markers determine which version will be used. A universal resolution is often more constrained than a platform-specific resolution, since we need to take the requirements for all markers into account. During universal resolution, all required packages must be compatible with the _entire_ range of `requires-python` declared in the `pyproject.toml`. For example, if a project's `requires-python` is `>=3.8`, resolution will fail if all versions of a given dependency require Python 3.9 or later, since the dependency lacks a usable version for (e.g.) Python 3.8, the lower bound of the project's supported range. In other words, the project's `requires-python` must be a subset of the `requires-python` of all its dependencies. When selecting the compatible version for a given dependency, uv will ([by default](#multi-version-resolution)) attempt to choose the latest compatible version for each supported Python version. For example, if a project's `requires-python` is `>=3.8`, and the latest version of a dependency requires Python 3.9 or later, while all prior versions support Python 3.8, the resolver will select the latest version for users running Python 3.9 or later, and previous versions for users running Python 3.8.
When evaluating `requires-python` ranges for dependencies, uv only considers lower bounds and ignores upper bounds entirely. For example, `>=3.8, <4` is treated as `>=3.8`. Respecting upper bounds on `requires-python` often leads to formally correct but practically incorrect resolutions, as, e.g., resolvers will backtrack to the first published version that omits the upper bound (see: [`Requires-Python` upper limits](https://discuss.python.org/t/requires-python-upper-limits/12663)). ### Limited resolution environments By default, the universal resolver attempts to solve for all platforms and Python versions. If your project supports only a limited set of platforms or Python versions, you can constrain the set of solved platforms via the `environments` setting, which accepts a list of [PEP 508 environment markers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/#environment-markers). In other words, you can use the `environments` setting to _reduce_ the set of supported platforms. For example, to constrain the lockfile to macOS and Linux, and avoid solving for Windows: ```toml title="pyproject.toml" [tool.uv] environments = [ "sys_platform == 'darwin'", "sys_platform == 'linux'", ] ``` Or, to avoid solving for alternative Python implementations: ```toml title="pyproject.toml" [tool.uv] environments = [ "implementation_name == 'cpython'" ] ``` Entries in the `environments` setting must be disjoint (i.e., they must not overlap). For example, `sys_platform == 'darwin'` and `sys_platform == 'linux'` are disjoint, but `sys_platform == 'darwin'` and `python_version >= '3.9'` are not, since both could be true at the same time. ### Required environments In the Python ecosystem, packages can be published as source distributions, built distributions (wheels), or both; but to install a package, a built distribution is required. 
If a package lacks a built distribution, or lacks a distribution for the current platform or Python version (built distributions are often platform-specific), uv will attempt to build the package from source, then install the resulting built distribution. Some packages (like PyTorch) publish built distributions, but omit a source distribution. Such packages are _only_ installable on platforms for which a built distribution is available. For example, if a package publishes built distributions for Linux, but not macOS or Windows, then that package will _only_ be installable on Linux. Packages that lack source distributions cause problems for universal resolution, since there will typically be at least one platform or Python version for which the package is not installable. By default, uv requires each such package to include at least one wheel that is compatible with the target Python version. The `required-environments` setting can be used to ensure that the resulting resolution contains wheels for specific platforms, or fails if no such wheels are available. The setting accepts a list of [PEP 508 environment markers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/#environment-markers). While the `environments` setting _limits_ the set of environments that uv will consider when resolving dependencies, `required-environments` _expands_ the set of platforms that uv _must_ support when resolving dependencies. For example, `environments = ["sys_platform == 'darwin'"]` would limit uv to solving for macOS (and ignoring Linux and Windows). On the other hand, `required-environments = ["sys_platform == 'darwin'"]` would _require_ that any package without a source distribution include a wheel for macOS in order to be installable (and would fail if no such wheel is available). 
In practice, `required-environments` can be useful for declaring explicit support for non-latest platforms, since this often requires backtracking past the latest published versions of those packages. For example, to guarantee that any built distribution-only package includes support for Intel macOS: ```toml title="pyproject.toml" [tool.uv] required-environments = [ "sys_platform == 'darwin' and platform_machine == 'x86_64'" ] ``` ## Dependency preferences If a resolution output file exists, i.e., a uv lockfile (`uv.lock`) or a requirements output file (`requirements.txt`), uv will _prefer_ the dependency versions listed there. Similarly, if installing a package into a virtual environment, uv will prefer the already installed version if present. This means that locked or installed versions will not change unless an incompatible version is requested or an upgrade is explicitly requested with `--upgrade`. ## Resolution strategy By default, uv tries to use the latest version of each package. For example, `uv pip install flask>=2.0.0` will install the latest version of Flask, e.g., 3.0.0. If `flask>=2.0.0` is a dependency of the project, only `flask` 3.0.0 will be used. This is important, for example, because running tests will not check that the project is actually compatible with its stated lower bound of `flask` 2.0.0. With `--resolution lowest`, uv will install the lowest possible version for all dependencies, both direct and indirect (transitive). Alternatively, `--resolution lowest-direct` will use the lowest compatible versions for all direct dependencies, while using the latest compatible versions for all other dependencies. uv will always use the latest versions for build dependencies.
For example, given the following `requirements.in` file: ```python title="requirements.in" flask>=2.0.0 ``` Running `uv pip compile requirements.in` would produce the following `requirements.txt` file: ```python title="requirements.txt" # This file was autogenerated by uv via the following command: # uv pip compile requirements.in blinker==1.7.0 # via flask click==8.1.7 # via flask flask==3.0.0 itsdangerous==2.1.2 # via flask jinja2==3.1.2 # via flask markupsafe==2.1.3 # via # jinja2 # werkzeug werkzeug==3.0.1 # via flask ``` However, `uv pip compile --resolution lowest requirements.in` would instead produce: ```python title="requirements.txt" # This file was autogenerated by uv via the following command: # uv pip compile requirements.in --resolution lowest click==7.1.2 # via flask flask==2.0.0 itsdangerous==2.0.0 # via flask jinja2==3.0.0 # via flask markupsafe==2.0.0 # via jinja2 werkzeug==2.0.0 # via flask ``` When publishing libraries, it is recommended to separately run tests with `--resolution lowest` or `--resolution lowest-direct` in continuous integration to ensure compatibility with the declared lower bounds. ## Pre-release handling By default, uv will accept pre-release versions during dependency resolution in two cases: 1. If the package is a direct dependency, and its version specifiers include a pre-release specifier (e.g., `flask>=2.0.0rc1`). 1. If _all_ published versions of a package are pre-releases. If dependency resolution fails due to a transitive pre-release, uv will suggest using `--prerelease allow` to allow pre-releases for all dependencies. Alternatively, the transitive dependency can be added as a [constraint](#dependency-constraints) or direct dependency (i.e., in `requirements.in` or `pyproject.toml`) with a pre-release version specifier (e.g., `flask>=2.0.0rc1`) to opt in to pre-release support for that specific dependency.
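For example, to allow pre-releases for all dependencies in the pip interface, the `--prerelease` option mentioned above can be passed directly (a sketch):

```console
$ uv pip compile --prerelease allow requirements.in
```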
Pre-releases are [notoriously difficult](https://pubgrub-rs-guide.netlify.app/limitations/prerelease_versions) to model, and are a frequent source of bugs in other packaging tools. uv's pre-release handling is _intentionally_ limited and requires user opt-in for pre-releases to ensure correctness. For more details, see [Pre-release compatibility](../pip/compatibility.md#pre-release-compatibility). ## Multi-version resolution During universal resolution, a package may be listed multiple times with different versions or URLs within the same lockfile, since different versions may be needed for different platforms or Python versions. The `--fork-strategy` setting can be used to control how uv trades off between (1) minimizing the number of selected versions and (2) selecting the latest-possible version for each platform. The former leads to greater consistency across platforms, while the latter leads to use of newer package versions where possible. By default (`--fork-strategy requires-python`), uv will optimize for selecting the latest version of each package for each supported Python version, while minimizing the number of selected versions across platforms. For example, when resolving `numpy` with a Python requirement of `>=3.8`, uv would select the following versions: ```txt numpy==1.24.4 ; python_version == "3.8" numpy==2.0.2 ; python_version == "3.9" numpy==2.2.0 ; python_version >= "3.10" ``` This resolution reflects the fact that NumPy 2.2.0 and later require at least Python 3.10, while earlier versions are compatible with Python 3.8 and 3.9. Under `--fork-strategy fewest`, uv will instead minimize the number of selected versions for each package, preferring older versions that are compatible with a wider range of supported Python versions or platforms. For example, when in the scenario above, uv would select `numpy==1.24.4` for all Python versions, rather than upgrading to `numpy==2.0.2` for Python 3.9 and `numpy==2.2.0` for Python 3.10 and later. 
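The default strategy's outcome above can be sketched as a simple selection rule (illustrative Python, not uv's implementation): for each supported Python version, choose the newest package version whose Python requirement allows it.

```python
# Toy index, newest first: (numpy version, minimum supported Python).
# The values mirror the example above.
NUMPY = [("2.2.0", (3, 10)), ("2.0.2", (3, 9)), ("1.24.4", (3, 8))]

def pick(python: tuple[int, int]) -> str:
    """Select the newest release compatible with the given Python version."""
    for version, min_python in NUMPY:
        if python >= min_python:
            return version
    raise RuntimeError("no compatible version")

assert pick((3, 8)) == "1.24.4"
assert pick((3, 9)) == "2.0.2"
assert pick((3, 12)) == "2.2.0"
```

Under this rule, each Python version gets the newest compatible release, at the cost of selecting three distinct versions; the `fewest` strategy would instead collapse all three onto the single version compatible everywhere.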
## Dependency constraints Like pip, uv supports constraint files (`--constraint constraints.txt`), which narrow the set of acceptable versions for the given packages. Constraint files are similar to requirements files, but being listed as a constraint alone will not cause a package to be included in the resolution. Instead, constraints only take effect if a requested package is already pulled in as a direct or transitive dependency. Constraints are useful for reducing the range of available versions for a transitive dependency. They can also be used to keep a resolution in sync with some other set of resolved versions, regardless of which packages overlap between the two. ## Dependency overrides Dependency overrides allow bypassing unsuccessful or undesirable resolutions by overriding a package's declared dependencies. Overrides are a useful last resort for cases in which you _know_ that a dependency is compatible with a certain version of a package, despite the metadata indicating otherwise. For example, if a transitive dependency declares the requirement `pydantic>=1.0,<2.0`, but _does_ work with `pydantic>=2.0`, the user can override the declared dependency by including `pydantic>=1.0,<3` in the overrides, thereby allowing the resolver to choose a newer version of `pydantic`. Concretely, if `pydantic>=1.0,<3` is included as an override, uv will ignore all declared requirements on `pydantic`, replacing them with the override. In the above example, the `pydantic>=1.0,<2.0` requirement would be ignored completely, and would instead be replaced with `pydantic>=1.0,<3`. While constraints can only _reduce_ the set of acceptable versions for a package, overrides can _expand_ the set of acceptable versions, providing an escape hatch for erroneous upper version bounds. As with constraints, overrides do not add a dependency on the package and only take effect if the package is requested as a direct or transitive dependency.
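Continuing the `pydantic` example above, such an override could be declared with uv's `override-dependencies` setting (a sketch):

```toml title="pyproject.toml"
[tool.uv]
override-dependencies = ["pydantic>=1.0,<3"]
```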
In a `pyproject.toml`, use `tool.uv.override-dependencies` to define a list of overrides. In the pip-compatible interface, the `--override` option can be used to pass files with the same format as constraints files. If multiple overrides are provided for the same package, they must be differentiated with [markers](#platform-markers). If a package has a dependency with a marker, it is replaced unconditionally when using overrides — it does not matter if the marker evaluates to true or false. ## Dependency metadata During resolution, uv needs to resolve the metadata for each package it encounters, in order to determine its dependencies. This metadata is often available as a static file in the package index; however, for packages that only provide source distributions, the metadata may not be available upfront. In such cases, uv has to build the package to determine its metadata (e.g., by invoking `setup.py`). This can introduce a performance penalty during resolution. Further, it imposes the requirement that the package can be built on all platforms, which may not be true. For example, you may have a package that should only be built and installed on Linux, but doesn't build successfully on macOS or Windows. While uv can construct a perfectly valid lockfile for this scenario, doing so would require building the package, which would fail on non-Linux platforms. The `tool.uv.dependency-metadata` table can be used to provide static metadata for such dependencies upfront, thereby allowing uv to skip the build step and use the provided metadata instead. 
For example, to provide metadata for `chumpy` upfront, include its `dependency-metadata` in the `pyproject.toml`: ```toml [[tool.uv.dependency-metadata]] name = "chumpy" version = "0.70" requires-dist = ["numpy>=1.8.1", "scipy>=0.13.0", "six>=1.11.0"] ``` These declarations are intended for cases in which a package does _not_ declare static metadata upfront, though they are also useful for packages that require disabling build isolation. In such cases, it may be easier to declare the package metadata upfront, rather than creating a custom build environment prior to resolving the package. For example, you can declare the metadata for `flash-attn`, allowing uv to resolve without building the package from source (which itself requires installing `torch`): ```toml [project] name = "project" version = "0.1.0" requires-python = ">=3.12" dependencies = ["flash-attn"] [tool.uv.sources] flash-attn = { git = "https://github.com/Dao-AILab/flash-attention", tag = "v2.6.3" } [[tool.uv.dependency-metadata]] name = "flash-attn" version = "2.6.3" requires-dist = ["torch", "einops"] ``` Like dependency overrides, `tool.uv.dependency-metadata` can also be used for cases in which a package's metadata is incorrect or incomplete, or when a package is not available in the package index. While dependency overrides allow overriding the allowed versions of a package globally, metadata overrides allow overriding the declared metadata of a _specific package_. !!! note The `version` field in `tool.uv.dependency-metadata` is optional for registry-based dependencies (when omitted, uv will assume the metadata applies to all versions of the package), but _required_ for direct URL dependencies (like Git dependencies). Entries in the `tool.uv.dependency-metadata` table follow the [Metadata 2.3](https://packaging.python.org/en/latest/specifications/core-metadata/) specification, though only `name`, `version`, `requires-dist`, `requires-python`, and `provides-extra` are read by uv. 
The `version` field is also considered optional. If omitted, the metadata will be used for all versions of the specified package. ## Conflicting dependencies uv requires that all optional dependencies ("extras") declared by the project are compatible with each other and resolves all optional dependencies together when creating the lockfile. If optional dependencies declared in one extra are not compatible with those in another extra, uv will fail to resolve the requirements of the project with an error. To work around this, uv supports declaring conflicting extras. For example, consider two sets of optional dependencies that conflict with one another: ```toml title="pyproject.toml" [project.optional-dependencies] extra1 = ["numpy==2.1.2"] extra2 = ["numpy==2.0.0"] ``` If you run `uv lock` with the above dependencies, resolution will fail: ```console $ uv lock × No solution found when resolving dependencies: ╰─▶ Because myproject[extra2] depends on numpy==2.0.0 and myproject[extra1] depends on numpy==2.1.2, we can conclude that myproject[extra1] and myproject[extra2] are incompatible. And because your project requires myproject[extra1] and myproject[extra2], we can conclude that your project's requirements are unsatisfiable. ``` But if you specify that `extra1` and `extra2` are conflicting, uv will resolve them separately. Specify conflicts in the `tool.uv` section: ```toml title="pyproject.toml" [tool.uv] conflicts = [ [ { extra = "extra1" }, { extra = "extra2" }, ], ] ``` Now, running `uv lock` will succeed. Note, though, that you now cannot install both `extra1` and `extra2` at the same time: ```console $ uv sync --extra extra1 --extra extra2 Resolved 3 packages in 14ms error: extra `extra1`, extra `extra2` are incompatible with the declared conflicts: {`myproject[extra1]`, `myproject[extra2]`} ``` This error occurs because installing both `extra1` and `extra2` would result in installing two different versions of a package into the same environment.
The above strategy for dealing with conflicting extras also works with dependency groups:

```toml title="pyproject.toml"
[dependency-groups]
group1 = ["numpy==2.1.2"]
group2 = ["numpy==2.0.0"]

[tool.uv]
conflicts = [
    [
        { group = "group1" },
        { group = "group2" },
    ],
]
```

The only difference from conflicting extras is that you need to use the `group` key instead of `extra`.

## Lower bounds

By default, `uv add` adds lower bounds to dependencies and, when using uv to manage projects, uv will warn if direct dependencies don't have a lower bound.

Lower bounds are not critical in the "happy path", but they are important for cases where there are dependency conflicts. For example, consider a project that requires two packages whose dependencies conflict. The resolver needs to check all combinations of all versions within the constraints for the two packages; if all of them conflict, an error is reported because the dependencies are not satisfiable. If there are no lower bounds, the resolver can (and often will) backtrack down to the oldest version of a package. This is problematic not only because it's slow: the old version of the package often fails to build, and the resolver can end up picking a version that's old enough that it doesn't depend on the conflicting package, but that also doesn't work with your code.

Lower bounds are particularly critical when writing a library. It's important to declare the lowest version for each dependency that your library works with, and to validate that the bounds are correct by testing with [`--resolution lowest` or `--resolution lowest-direct`](#resolution-strategy). Otherwise, a user may receive an old, incompatible version of one of your library's dependencies and the library will fail with an unexpected error.
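As an illustrative sketch (the package names and versions here are hypothetical), lower-bounded direct dependencies in a `pyproject.toml` look like this:

```toml title="pyproject.toml"
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    # Lower bounds keep the resolver from backtracking to ancient releases
    "numpy>=1.24",
    "requests>=2.26.0",
]
```

Running `uv lock --resolution lowest-direct` then resolves with the oldest allowed version of each direct dependency, which is a quick way to check that the declared bounds actually work.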
## Reproducible resolutions

uv supports an `--exclude-newer` option to limit resolution to distributions published before a specific date, allowing reproduction of installations regardless of new package releases. The date may be specified as an [RFC 3339](https://www.rfc-editor.org/rfc/rfc3339.html) timestamp (e.g., `2006-12-02T02:07:43Z`) or as a local date (e.g., `2006-12-02`) in your system's configured time zone.

Note that the package index must support the `upload-time` field as specified in [PEP 700](https://peps.python.org/pep-0700/). If the field is not present for a given distribution, the distribution will be treated as unavailable. PyPI provides `upload-time` for all packages.

To ensure reproducibility, messages for unsatisfiable resolutions will not mention that distributions were excluded due to the `--exclude-newer` flag; newer distributions will simply be treated as if they do not exist.

!!! note

    The `--exclude-newer` option is only applied to packages that are read from a registry (as opposed to, e.g., Git dependencies). Further, when using the `uv pip` interface, uv will not downgrade previously installed packages unless the `--reinstall` flag is provided, in which case uv will perform a new resolution.

## Source distribution

[PEP 625](https://peps.python.org/pep-0625/) specifies that packages must distribute source distributions as gzip tarball (`.tar.gz`) archives. Prior to this specification, other archive formats were also allowed, and these need to be supported for backward compatibility. uv supports reading and extracting archives in the following formats:

- gzip tarball (`.tar.gz`, `.tgz`)
- bzip2 tarball (`.tar.bz2`, `.tbz`)
- xz tarball (`.tar.xz`, `.txz`)
- zstd tarball (`.tar.zst`)
- lzip tarball (`.tar.lz`)
- lzma tarball (`.tar.lzma`)
- zip (`.zip`)

## Lockfile versioning

The `uv.lock` file uses a versioned schema. The schema version is included in the `version` field of the lockfile.
Any given version of uv can read and write lockfiles with the same schema version, but will reject lockfiles with a greater schema version. For example, if your uv version supports schema v1, `uv lock` will error if it encounters an existing lockfile with schema v2. uv versions that support schema v2 _may_ be able to read lockfiles with schema v1 if the schema update was backwards-compatible. However, this is not guaranteed, and uv may exit with an error if it encounters a lockfile with an outdated schema version.

The schema version is considered part of the public API, and so is only bumped in minor releases, as a breaking change (see [Versioning](../reference/policies/versioning.md)). As such, all uv patch versions within a given minor uv release are guaranteed to have full lockfile compatibility. In other words, lockfiles may only be rejected across minor releases.

The `revision` field of the lockfile is used to track backwards-compatible changes to the lockfile, such as the addition of a new field to distributions. Changes to the revision will not cause older versions of uv to error.

## Learn more

For more details about the internals of the resolver, see the [resolver reference](../reference/resolver-internals.md) documentation.

# Tools

Tools are Python packages that provide command-line interfaces.

!!! note

    See the [tools guide](../guides/tools.md) for an introduction to working with the tools interface; this document discusses details of tool management.

## The `uv tool` interface

uv includes a dedicated interface for interacting with tools. Tools can be invoked without installation using `uv tool run`, in which case their dependencies are installed in a temporary virtual environment isolated from the current project.

Because it is very common to run tools without installing them, a `uvx` alias is provided for `uv tool run`; the two commands are exactly equivalent. For brevity, the documentation will mostly refer to `uvx` instead of `uv tool run`.
Tools can also be installed with `uv tool install`, in which case their executables are [available on the `PATH`](#the-path); an isolated virtual environment is still used, but it is not removed when the command completes.

## Execution vs installation

In most cases, executing a tool with `uvx` is more appropriate than installing the tool. Installing the tool is useful if you need the tool to be available to other programs on your system, e.g., if some script you do not control requires the tool, or if you are in a Docker image and want to make the tool available to users.

## Tool environments

When running a tool with `uvx`, a virtual environment is stored in the uv cache directory and is treated as disposable, i.e., if you run `uv cache clean` the environment will be deleted. The environment is only cached to reduce the overhead of repeated invocations. If the environment is removed, a new one will be created automatically.

When installing a tool with `uv tool install`, a virtual environment is created in the uv tools directory. The environment will not be removed unless the tool is uninstalled. If the environment is manually deleted, the tool will fail to run.

## Tool versions

Unless a specific version is requested, `uv tool install` will install the latest available version of the requested tool. `uvx` will use the latest available version of the requested tool _on the first invocation_. After that, `uvx` will use the cached version of the tool unless a different version is requested, the cache is pruned, or the cache is refreshed.

For example, to run a specific version of Ruff:

```console
$ uvx ruff@0.6.0 --version
ruff 0.6.0
```

A subsequent invocation of `uvx` will use the latest, not the cached, version.

```console
$ uvx ruff --version
ruff 0.6.2
```

But, if a new version of Ruff was released, it would not be used unless the cache was refreshed.
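Assuming uv's global `--refresh` flag here, the cached environment can also be refreshed explicitly to pick up a new release:

```console
$ uvx --refresh ruff --version
```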
To request the latest version of Ruff and refresh the cache, use the `@latest` suffix:

```console
$ uvx ruff@latest --version
ruff 0.6.2
```

Once a tool is installed with `uv tool install`, `uvx` will use the installed version by default. For example, after installing an older version of Ruff:

```console
$ uv tool install ruff==0.5.0
```

The version of `ruff` and `uvx ruff` is the same:

```console
$ ruff --version
ruff 0.5.0
$ uvx ruff --version
ruff 0.5.0
```

However, you can ignore the installed version by requesting the latest version explicitly, e.g.:

```console
$ uvx ruff@latest --version
ruff 0.6.2
```

Or, by using the `--isolated` flag, which will avoid refreshing the cache but ignore the installed version:

```console
$ uvx --isolated ruff --version
ruff 0.6.2
```

`uv tool install` will also respect the `{package}@{version}` and `{package}@latest` specifiers, as in:

```console
$ uv tool install ruff@latest
$ uv tool install ruff@0.6.0
```

## Tools directory

By default, the uv tools directory is named `tools` and is in the uv application state directory, e.g., `~/.local/share/uv/tools`. The location may be customized with the `UV_TOOL_DIR` environment variable.

To display the path to the tool installation directory:

```console
$ uv tool dir
```

Tool environments are placed in a directory with the same name as the tool package, e.g., `.../tools/<name>`.

!!! important

    Tool environments are _not_ intended to be mutated directly. It is strongly recommended never to mutate a tool environment manually, e.g., with a `pip` operation.

## Upgrading tools

Tool environments may be upgraded via `uv tool upgrade`, or re-created entirely via subsequent `uv tool install` operations.

To upgrade all packages in a tool environment:

```console
$ uv tool upgrade black
```

To upgrade a single package in a tool environment:

```console
$ uv tool upgrade black --upgrade-package click
```

Tool upgrades will respect the version constraints provided when installing the tool.
For example, `uv tool install 'black>=23,<24'` followed by `uv tool upgrade black` will upgrade Black to the latest version in the range `>=23,<24`.

To instead replace the version constraints, reinstall the tool with `uv tool install`:

```console
$ uv tool install 'black>=24'
```

Similarly, tool upgrades will retain the settings provided when installing the tool. For example, `uv tool install black --prerelease allow` followed by `uv tool upgrade black` will retain the `--prerelease allow` setting.

!!! note

    Tool upgrades will reinstall the tool executables, even if they have not changed.

To reinstall packages during upgrade, use the `--reinstall` and `--reinstall-package` options.

To reinstall all packages in a tool environment:

```console
$ uv tool upgrade black --reinstall
```

To reinstall a single package in a tool environment:

```console
$ uv tool upgrade black --reinstall-package click
```

## Including additional dependencies

Additional packages can be included during tool execution:

```console
$ uvx --with <extra-package> <tool>
```

And, during tool installation:

```console
$ uv tool install --with <extra-package> <tool-package>
```

The `--with` option can be provided multiple times to include additional packages.

The `--with` option supports package specifications, so a specific version can be requested:

```console
$ uvx --with <extra-package>==<version> <tool>
```

The `-w` shorthand can be used in place of the `--with` option:

```console
$ uvx -w <extra-package> <tool>
```

If the requested version conflicts with the requirements of the tool package, package resolution will fail and the command will error.

## Python versions

Each tool environment is linked to a specific Python version. This uses the same Python version [discovery logic](./python-versions.md#discovery-of-python-versions) as other virtual environments created by uv, but will ignore non-global Python version requests like `.python-version` files and the `requires-python` value from a `pyproject.toml`.

The `--python` option can be used to request a specific version.
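For example, to run a tool (Ruff here, purely as an illustration) on a specific interpreter version:

```console
$ uvx --python 3.12 ruff --version
```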
See the [Python version](./python-versions.md) documentation for more details.

If the Python version used by a tool is _uninstalled_, the tool environment will be broken and the tool may be unusable.

## Tool executables

Tool executables include all console entry points, script entry points, and binary scripts provided by a Python package. Tool executables are symlinked into the `bin` directory on Unix and copied on Windows.

### The `bin` directory

Executables are installed into the user `bin` directory following the XDG standard, e.g., `~/.local/bin`. Unlike other directory schemes in uv, the XDG standard is used on _all platforms_, notably including Windows and macOS, since there is no clear alternative location to place executables on these platforms.

The installation directory is determined from the first available environment variable:

- `$UV_TOOL_BIN_DIR`
- `$XDG_BIN_HOME`
- `$XDG_DATA_HOME/../bin`
- `$HOME/.local/bin`

Executables provided by dependencies of tool packages are not installed.

### The `PATH`

The `bin` directory must be in the `PATH` variable for tool executables to be available from the shell. If it is not in the `PATH`, a warning will be displayed. The `uv tool update-shell` command can be used to add the `bin` directory to the `PATH` in common shell configuration files.

### Overwriting executables

Installation of tools will not overwrite executables in the `bin` directory that were not previously installed by uv. For example, if `pipx` has been used to install a tool, `uv tool install` will fail. The `--force` flag can be used to override this behavior.

## Relationship to `uv run`

The invocation `uv tool run <tool>` (or `uvx <tool>`) is nearly equivalent to:

```console
$ uv run --no-project --with <tool> -- <tool>
```

However, there are a couple of notable differences when using uv's tool interface:

- The `--with` option is not needed; the required package is inferred from the command name.
- The temporary environment is cached in a dedicated location.
- The `--no-project` flag is not needed; tools are always run isolated from the project.
- If a tool is already installed, `uv tool run` will use the installed version, but `uv run` will not.

If the tool should not be isolated from the project, e.g., when running `pytest` or `mypy`, then `uv run` should be used instead of `uv tool run`.

# Features

uv provides essential features for Python development, from installing Python and hacking on simple scripts to working on large projects that support multiple Python versions and platforms.

uv's interface can be broken down into sections, which are usable independently or together.

## Python versions

Installing and managing Python itself.

- `uv python install`: Install Python versions.
- `uv python list`: View available Python versions.
- `uv python find`: Find an installed Python version.
- `uv python pin`: Pin the current project to use a specific Python version.
- `uv python uninstall`: Uninstall a Python version.

See the [guide on installing Python](../guides/install-python.md) to get started.

## Scripts

Executing standalone Python scripts, e.g., `example.py`.

- `uv run`: Run a script.
- `uv add --script`: Add a dependency to a script.
- `uv remove --script`: Remove a dependency from a script.

See the [guide on running scripts](../guides/scripts.md) to get started.

## Projects

Creating and working on Python projects, i.e., with a `pyproject.toml`.

- `uv init`: Create a new Python project.
- `uv add`: Add a dependency to the project.
- `uv remove`: Remove a dependency from the project.
- `uv sync`: Sync the project's dependencies with the environment.
- `uv lock`: Create a lockfile for the project's dependencies.
- `uv run`: Run a command in the project environment.
- `uv tree`: View the dependency tree for the project.
- `uv build`: Build the project into distribution archives.
- `uv publish`: Publish the project to a package index.

See the [guide on projects](../guides/projects.md) to get started.
## Tools

Running and installing tools published to Python package indexes, e.g., `ruff` or `black`.

- `uvx` / `uv tool run`: Run a tool in a temporary environment.
- `uv tool install`: Install a tool user-wide.
- `uv tool uninstall`: Uninstall a tool.
- `uv tool list`: List installed tools.
- `uv tool update-shell`: Update the shell to include tool executables.

See the [guide on tools](../guides/tools.md) to get started.

## The pip interface

Manually managing environments and packages, intended to be used in legacy workflows or cases where the high-level commands do not provide enough control.

Creating virtual environments (replacing `venv` and `virtualenv`):

- `uv venv`: Create a new virtual environment.

See the documentation on [using environments](../pip/environments.md) for details.

Managing packages in an environment (replacing [`pip`](https://github.com/pypa/pip) and [`pipdeptree`](https://github.com/tox-dev/pipdeptree)):

- `uv pip install`: Install packages into the current environment.
- `uv pip show`: Show details about an installed package.
- `uv pip freeze`: List installed packages and their versions.
- `uv pip check`: Check that the current environment has compatible packages.
- `uv pip list`: List installed packages.
- `uv pip uninstall`: Uninstall packages.
- `uv pip tree`: View the dependency tree for the environment.

See the documentation on [managing packages](../pip/packages.md) for details.

Locking packages in an environment (replacing [`pip-tools`](https://github.com/jazzband/pip-tools)):

- `uv pip compile`: Compile requirements into a lockfile.
- `uv pip sync`: Sync an environment with a lockfile.

See the documentation on [locking environments](../pip/compile.md) for details.

!!! important

    These commands do not exactly implement the interfaces and behavior of the tools they are based on. The further you stray from common workflows, the more likely you are to encounter differences.
Consult the [pip-compatibility guide](../pip/compatibility.md) for details.

## Utility

Managing and inspecting uv's state, such as the cache, storage directories, or performing a self-update:

- `uv cache clean`: Remove cache entries.
- `uv cache prune`: Remove outdated cache entries.
- `uv cache dir`: Show the uv cache directory path.
- `uv tool dir`: Show the uv tool directory path.
- `uv python dir`: Show the uv installed Python versions path.
- `uv self update`: Update uv to the latest version.

## Next steps

Read the [guides](../guides/index.md) for an introduction to each feature, check out the [concept](../concepts/index.md) pages for in-depth details about uv's features, or learn how to [get help](./help.md) if you run into any problems.

# First steps with uv

After [installing uv](./installation.md), you can check that uv is available by running the `uv` command:

```console
$ uv
An extremely fast Python package manager.

Usage: uv [OPTIONS] <COMMAND>

...
```

You should see a help menu listing the available commands.

## Next steps

Now that you've confirmed uv is installed, check out an [overview of features](./features.md), learn how to [get help](./help.md) if you run into any problems, or jump to the [guides](../guides/index.md) to start using uv.

# Getting help

## Help menus

The `--help` flag can be used to view the help menu for a command, e.g., for `uv`:

```console
$ uv --help
```

To view the help menu for a specific command, e.g., for `uv init`:

```console
$ uv init --help
```

When using the `--help` flag, uv displays a condensed help menu. To view a longer help menu for a command, use `uv help`:

```console
$ uv help
```

To view the long help menu for a specific command, e.g., for `uv init`:

```console
$ uv help init
```

When using the long help menu, uv will attempt to use `less` or `more` to "page" the output so it is not all displayed at once. To exit the pager, press `q`.
## Viewing the version

When seeking help, it's important to determine the version of uv that you're using; sometimes the problem is already solved in a newer version.

To check the installed version:

```console
$ uv self version
```

The following are also valid:

```console
$ uv --version      # Same output as `uv self version`
$ uv -V             # Will not include the build commit and date
```

!!! note

    Before uv 0.7.0, `uv version` was used instead of `uv self version`.

## Troubleshooting issues

The reference documentation contains a [troubleshooting guide](../reference/troubleshooting/index.md) for common issues.

## Open an issue on GitHub

The [issue tracker](https://github.com/astral-sh/uv/issues) on GitHub is a good place to report bugs and request features. Make sure to search for similar issues first, as it is common for someone else to encounter the same problem.

## Chat on Discord

Astral has a [Discord server](https://discord.com/invite/astral-sh), which is a great place to ask questions, learn more about uv, and engage with other community members.

# Getting started

To help you get started with uv, we'll cover a few important topics:

- [Installing uv](./installation.md)
- [First steps after installation](./first-steps.md)
- [An overview of uv's features](./features.md)
- [How to get help](./help.md)

Read on, or jump ahead to another section:

- Get going quickly with [guides](../guides/index.md) for common workflows.
- Learn more about the core [concepts](../concepts/index.md) in uv.
- Use the [reference](../reference/index.md) documentation to find details about something specific.

# Installing uv

## Installation methods

Install uv with our standalone installers or your package manager of choice.
### Standalone installer

uv provides a standalone installer to download and install uv:

=== "macOS and Linux"

    Use `curl` to download the script and execute it with `sh`:

    ```console
    $ curl -LsSf https://astral.sh/uv/install.sh | sh
    ```

    If your system doesn't have `curl`, you can use `wget`:

    ```console
    $ wget -qO- https://astral.sh/uv/install.sh | sh
    ```

    Request a specific version by including it in the URL:

    ```console
    $ curl -LsSf https://astral.sh/uv/0.8.4/install.sh | sh
    ```

=== "Windows"

    Use `irm` to download the script and execute it with `iex`:

    ```pwsh-session
    PS> powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
    ```

    Changing the [execution policy](https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_execution_policies?view=powershell-7.4#powershell-execution-policies) allows running a script from the internet.

    Request a specific version by including it in the URL:

    ```pwsh-session
    PS> powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/0.8.4/install.ps1 | iex"
    ```

!!! tip

    The installation script may be inspected before use:

    === "macOS and Linux"

        ```console
        $ curl -LsSf https://astral.sh/uv/install.sh | less
        ```

    === "Windows"

        ```pwsh-session
        PS> powershell -c "irm https://astral.sh/uv/install.ps1 | more"
        ```

Alternatively, the installer or binaries can be downloaded directly from [GitHub](#github-releases).

See the reference documentation on the [installer](../reference/installer.md) for details on customizing your uv installation.

### PyPI

For convenience, uv is published to [PyPI](https://pypi.org/project/uv/).

If installing from PyPI, we recommend installing uv into an isolated environment, e.g., with `pipx`:

```console
$ pipx install uv
```

However, `pip` can also be used:

```console
$ pip install uv
```

!!! note
    uv ships with prebuilt distributions (wheels) for many platforms; if a wheel is not available for a given platform, uv will be built from source, which requires a Rust toolchain. See the [contributing setup guide](https://github.com/astral-sh/uv/blob/main/CONTRIBUTING.md#setup) for details on building uv from source.

### Homebrew

uv is available in the core Homebrew packages.

```console
$ brew install uv
```

### WinGet

uv is available via [WinGet](https://winstall.app/apps/astral-sh.uv).

```console
$ winget install --id=astral-sh.uv -e
```

### Scoop

uv is available via [Scoop](https://scoop.sh/#/apps?q=uv).

```console
$ scoop install main/uv
```

### Docker

uv provides a Docker image at [`ghcr.io/astral-sh/uv`](https://github.com/astral-sh/uv/pkgs/container/uv). See our guide on [using uv in Docker](../guides/integration/docker.md) for more details.

### GitHub Releases

uv release artifacts can be downloaded directly from [GitHub Releases](https://github.com/astral-sh/uv/releases).

Each release page includes binaries for all supported platforms as well as instructions for using the standalone installer via `github.com` instead of `astral.sh`.

### Cargo

uv is available via Cargo, but must be built from Git rather than [crates.io](https://crates.io) due to its dependency on unpublished crates.

```console
$ cargo install --git https://github.com/astral-sh/uv uv
```

!!! note

    This method builds uv from source, which requires a compatible Rust toolchain.

## Upgrading uv

When uv is installed via the standalone installer, it can update itself on-demand:

```console
$ uv self update
```

!!! tip

    Updating uv will re-run the installer and can modify your shell profiles. To disable this behavior, set `UV_NO_MODIFY_PATH=1`.

When another installation method is used, self-updates are disabled. Use the package manager's upgrade method instead. For example, with `pip`:

```console
$ pip install --upgrade uv
```

## Shell autocompletion

!!! tip
    You can run `echo $SHELL` to help you determine your shell.

To enable shell autocompletion for uv commands, run one of the following:

=== "Bash"

    ```bash
    echo 'eval "$(uv generate-shell-completion bash)"' >> ~/.bashrc
    ```

=== "Zsh"

    ```bash
    echo 'eval "$(uv generate-shell-completion zsh)"' >> ~/.zshrc
    ```

=== "fish"

    ```bash
    echo 'uv generate-shell-completion fish | source' > ~/.config/fish/completions/uv.fish
    ```

=== "Elvish"

    ```bash
    echo 'eval (uv generate-shell-completion elvish | slurp)' >> ~/.elvish/rc.elv
    ```

=== "PowerShell / pwsh"

    ```powershell
    if (!(Test-Path -Path $PROFILE)) {
        New-Item -ItemType File -Path $PROFILE -Force
    }
    Add-Content -Path $PROFILE -Value '(& uv generate-shell-completion powershell) | Out-String | Invoke-Expression'
    ```

To enable shell autocompletion for uvx, run one of the following:

=== "Bash"

    ```bash
    echo 'eval "$(uvx --generate-shell-completion bash)"' >> ~/.bashrc
    ```

=== "Zsh"

    ```bash
    echo 'eval "$(uvx --generate-shell-completion zsh)"' >> ~/.zshrc
    ```

=== "fish"

    ```bash
    echo 'uvx --generate-shell-completion fish | source' > ~/.config/fish/completions/uvx.fish
    ```

=== "Elvish"

    ```bash
    echo 'eval (uvx --generate-shell-completion elvish | slurp)' >> ~/.elvish/rc.elv
    ```

=== "PowerShell / pwsh"

    ```powershell
    if (!(Test-Path -Path $PROFILE)) {
        New-Item -ItemType File -Path $PROFILE -Force
    }
    Add-Content -Path $PROFILE -Value '(& uvx --generate-shell-completion powershell) | Out-String | Invoke-Expression'
    ```

Then restart the shell or source the shell config file.

## Uninstallation

If you need to remove uv from your system, follow these steps:

1. Clean up stored data (optional):

    ```console
    $ uv cache clean
    $ rm -r "$(uv python dir)"
    $ rm -r "$(uv tool dir)"
    ```

    !!! tip

        Before removing the binaries, you may want to remove any data that uv has stored.
2. Remove the uv and uvx binaries:

    === "macOS and Linux"

        ```console
        $ rm ~/.local/bin/uv ~/.local/bin/uvx
        ```

    === "Windows"

        ```pwsh-session
        PS> rm $HOME\.local\bin\uv.exe
        PS> rm $HOME\.local\bin\uvx.exe
        ```

    !!! note

        Prior to 0.5.0, uv was installed into `~/.cargo/bin`. The binaries can be removed from there to uninstall. Upgrading from an older version will not automatically remove the binaries from `~/.cargo/bin`.

## Next steps

See the [first steps](./first-steps.md) or jump straight to the [guides](../guides/index.md) to start using uv.

# Guides overview

Check out one of the core guides to get started:

- [Installing Python versions](./install-python.md)
- [Running scripts and declaring dependencies](./scripts.md)
- [Running and installing applications as tools](./tools.md)
- [Creating and working on projects](./projects.md)
- [Building and publishing packages](./package.md)
- [Integrate uv with other software, e.g., Docker, GitHub, PyTorch, and more](./integration/index.md)

Or, explore the [concept documentation](../concepts/index.md) for a comprehensive breakdown of each feature.

---
title: Installing and managing Python
description:
  A guide to using uv to install Python, including requesting specific versions, automatic
  installation, viewing installed versions, and more.
---

# Installing Python

If Python is already installed on your system, uv will [detect and use](#using-existing-python-versions) it without configuration. However, uv can also install and manage Python versions. uv [automatically installs](#automatic-python-downloads) missing Python versions as needed; you don't need to install Python to get started.

## Getting started

To install the latest Python version:

```console
$ uv python install
```

!!! note

    Python does not publish official distributable binaries. As such, uv uses distributions from the Astral [`python-build-standalone`](https://github.com/astral-sh/python-build-standalone) project.
See the [Python distributions](../concepts/python-versions.md#managed-python-distributions) documentation for more details.

Once Python is installed, it will be used by `uv` commands automatically. uv also adds the installed version to your `PATH`:

```console
$ python3.13
```

uv only installs a _versioned_ executable by default. To install `python` and `python3` executables, include the experimental `--default` option:

```console
$ uv python install --default
```

!!! tip

    See the documentation on [installing Python executables](../concepts/python-versions.md#installing-python-executables) for more details.

## Installing a specific version

To install a specific Python version:

```console
$ uv python install 3.12
```

To install multiple Python versions:

```console
$ uv python install 3.11 3.12
```

To install an alternative Python implementation, e.g., PyPy:

```console
$ uv python install pypy@3.10
```

See the [`python install`](../concepts/python-versions.md#installing-a-python-version) documentation for more details.

## Reinstalling Python

To reinstall uv-managed Python versions, use `--reinstall`, e.g.:

```console
$ uv python install --reinstall
```

This will reinstall all previously installed Python versions. Improvements are constantly being added to the Python distributions, so reinstalling may resolve bugs even if the Python version does not change.

## Viewing Python installations

To view available and installed Python versions:

```console
$ uv python list
```

See the [`python list`](../concepts/python-versions.md#viewing-available-python-versions) documentation for more details.

## Automatic Python downloads

Python does not need to be explicitly installed to use uv. By default, uv will automatically download Python versions when they are required.
For example, the following would download Python 3.12 if it was not installed:

```console
$ uvx python@3.12 -c "print('hello world')"
```

Even if a specific Python version is not requested, uv will download the latest version on demand. For example, if there are no Python versions on your system, the following will install Python before creating a new virtual environment:

```console
$ uv venv
```

!!! tip

    Automatic Python downloads can be [easily disabled](../concepts/python-versions.md#disabling-automatic-python-downloads) if you want more control over when Python is downloaded.

## Using existing Python versions

uv will use existing Python installations if present on your system. There is no configuration necessary for this behavior: uv will use the system Python if it satisfies the requirements of the command invocation. See the [Python discovery](../concepts/python-versions.md#discovery-of-python-versions) documentation for details.

To force uv to use the system Python, provide the `--no-managed-python` flag. See the [Python version preference](../concepts/python-versions.md#requiring-or-disabling-managed-python-versions) documentation for more details.

## Upgrading Python versions

!!! important

    Support for upgrading Python patch versions is in _preview_. This means the behavior is experimental and subject to change.

To upgrade a Python version to the latest supported patch release:

```console
$ uv python upgrade 3.12
```

To upgrade all uv-managed Python versions:

```console
$ uv python upgrade
```

See the [`python upgrade`](../concepts/python-versions.md#upgrading-python-versions) documentation for more details.

## Next steps

To learn more about `uv python`, see the [Python version concept](../concepts/python-versions.md) page and the [command reference](../reference/cli.md#uv-python).

Or, read on to learn how to [run scripts](./scripts.md) and invoke Python with uv.
---
title: Using alternative package indexes
description: A guide to using alternative package indexes with uv, including Azure Artifacts, Google Artifact Registry, AWS CodeArtifact, and more.
---

# Using alternative package indexes

While uv uses the official Python Package Index (PyPI) by default, it also supports [alternative package indexes](../../concepts/indexes.md). Most alternative indexes require some form of authentication, which requires some initial setup.

!!! important

    If using the pip interface, please read the documentation on [using multiple indexes](../../pip/compatibility.md#packages-that-exist-on-multiple-indexes) in uv — the default behavior differs from pip's to prevent dependency confusion attacks, but this means that uv may not find the versions of a package that you'd expect.

## Azure Artifacts

uv can install packages from [Azure Artifacts](https://learn.microsoft.com/en-us/azure/devops/artifacts/start-using-azure-artifacts?view=azure-devops&tabs=nuget%2Cnugetserver), either by using a [Personal Access Token](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&tabs=Windows) (PAT) or by using the [`keyring`](https://github.com/jaraco/keyring) package.

To use Azure Artifacts, add the index to your project:

```toml title="pyproject.toml"
[[tool.uv.index]]
name = "private-registry"
url = "https://pkgs.dev.azure.com/<organization>/<project>/_packaging/<feed>/pypi/simple/"
```

### Authenticate with an Azure access token

If a personal access token (PAT) is available (e.g., [`$(System.AccessToken)` in an Azure pipeline](https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#systemaccesstoken)), credentials can be provided via the "Basic" HTTP authentication scheme. Include the PAT in the password field of the URL. A username must be included as well, but can be any string.
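To illustrate what the "Basic" scheme means here, the following sketch builds the `Authorization` header that results from such a URL. This is an illustration only (the helper name and the token value are placeholders, not part of uv), assuming the standard RFC 7617 encoding:

```python
import base64


def basic_auth_header(username: str, token: str) -> str:
    # HTTP Basic auth (RFC 7617): base64-encode "username:password" and
    # prefix with "Basic ". For Azure Artifacts, the PAT is the password.
    pair = f"{username}:{token}".encode()
    return "Basic " + base64.b64encode(pair).decode()


# The username can be any string; only the PAT is checked by the server.
print(basic_auth_header("dummy", "example-pat"))
```

Because the whole pair is base64-encoded, the arbitrary username travels alongside the token, which is why any username works.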
For example, with the token stored in the `$AZURE_ARTIFACTS_TOKEN` environment variable, set credentials for the index with:

```bash
export UV_INDEX_PRIVATE_REGISTRY_USERNAME=dummy
export UV_INDEX_PRIVATE_REGISTRY_PASSWORD="$AZURE_ARTIFACTS_TOKEN"
```

!!! note

    `PRIVATE_REGISTRY` should match the name of the index defined in your `pyproject.toml`.

### Authenticate with `keyring` and `artifacts-keyring`

You can also authenticate to Azure Artifacts using the [`keyring`](https://github.com/jaraco/keyring) package with the [`artifacts-keyring` plugin](https://github.com/Microsoft/artifacts-keyring). Because these two packages are required to authenticate to Azure Artifacts, they must be pre-installed from a source other than Artifacts.

The `artifacts-keyring` plugin wraps the [Azure Artifacts Credential Provider tool](https://github.com/microsoft/artifacts-credprovider). The credential provider supports a few different authentication modes, including interactive login — see the [tool's documentation](https://github.com/microsoft/artifacts-credprovider) for information on configuration.

uv only supports using the `keyring` package in [subprocess mode](../../reference/settings.md#keyring-provider). The `keyring` executable must be on the `PATH`, i.e., installed globally or in the active environment. The `keyring` CLI requires a username in the URL, and it must be `VssSessionToken`.

```bash
# Pre-install keyring and the Artifacts plugin from the public PyPI
uv tool install keyring --with artifacts-keyring

# Enable keyring authentication
export UV_KEYRING_PROVIDER=subprocess

# Set the username for the index
export UV_INDEX_PRIVATE_REGISTRY_USERNAME=VssSessionToken
```

!!! note

    The [`tool.uv.keyring-provider`](../../reference/settings.md#keyring-provider) setting can be used to enable keyring in your `uv.toml` or `pyproject.toml`. Similarly, the username for the index can be added directly to the index URL.
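As the `UV_INDEX_PRIVATE_REGISTRY_*` variables above suggest, the variable names are derived from the index name: it is uppercased and dashes become underscores. A small sketch of that rule (the helper function is ours, for illustration, not part of uv):

```python
def index_credential_vars(index_name: str) -> tuple[str, str]:
    # Credentials for a named index are read from UV_INDEX_<NAME>_USERNAME and
    # UV_INDEX_<NAME>_PASSWORD, where <NAME> is the index name uppercased with
    # dashes replaced by underscores.
    key = index_name.upper().replace("-", "_")
    return f"UV_INDEX_{key}_USERNAME", f"UV_INDEX_{key}_PASSWORD"


print(index_credential_vars("private-registry"))
# → ('UV_INDEX_PRIVATE_REGISTRY_USERNAME', 'UV_INDEX_PRIVATE_REGISTRY_PASSWORD')
```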
### Publishing packages to Azure Artifacts

If you also want to publish your own packages to Azure Artifacts, you can use `uv publish` as described in the [Building and publishing guide](../package.md).

First, add a `publish-url` to the index you want to publish packages to. For example:

```toml title="pyproject.toml" hl_lines="4"
[[tool.uv.index]]
name = "private-registry"
url = "https://pkgs.dev.azure.com/<organization>/<project>/_packaging/<feed>/pypi/simple/"
publish-url = "https://pkgs.dev.azure.com/<organization>/<project>/_packaging/<feed>/pypi/upload/"
```

Then, configure credentials (if not using keyring):

```console
$ export UV_PUBLISH_USERNAME=dummy
$ export UV_PUBLISH_PASSWORD="$AZURE_ARTIFACTS_TOKEN"
```

And publish the package:

```console
$ uv publish --index private-registry
```

To use `uv publish` without adding the `publish-url` to the project, you can set `UV_PUBLISH_URL`:

```console
$ export UV_PUBLISH_URL=https://pkgs.dev.azure.com/<organization>/<project>/_packaging/<feed>/pypi/upload/
$ uv publish
```

Note that this method is not preferred because uv cannot check whether the package is already published before uploading artifacts.

## Google Artifact Registry

uv can install packages from [Google Artifact Registry](https://cloud.google.com/artifact-registry/docs), either by using an access token or by using the [`keyring`](https://github.com/jaraco/keyring) package.

!!! note

    This guide assumes that the [`gcloud`](https://cloud.google.com/sdk/gcloud) CLI is installed and authenticated.

To use Google Artifact Registry, add the index to your project:

```toml title="pyproject.toml"
[[tool.uv.index]]
name = "private-registry"
url = "https://<region>-python.pkg.dev/<project>/<repository>/simple/"
```

### Authenticate with a Google access token

Credentials can be provided via the "Basic" HTTP authentication scheme. Include the access token in the password field of the URL. The username must be `oauth2accesstoken`; otherwise, authentication will fail.

Generate a token with `gcloud`:

```bash
export ARTIFACT_REGISTRY_TOKEN=$(
  gcloud auth application-default print-access-token
)
```

!!! note

    You might need to pass extra parameters (like `--project`) to properly generate the token; this is a basic example.

Then set credentials for the index with:

```bash
export UV_INDEX_PRIVATE_REGISTRY_USERNAME=oauth2accesstoken
export UV_INDEX_PRIVATE_REGISTRY_PASSWORD="$ARTIFACT_REGISTRY_TOKEN"
```

!!! note

    `PRIVATE_REGISTRY` should match the name of the index defined in your `pyproject.toml`.

### Authenticate with `keyring` and `keyrings.google-artifactregistry-auth`

You can also authenticate to Artifact Registry using the [`keyring`](https://github.com/jaraco/keyring) package with the [`keyrings.google-artifactregistry-auth` plugin](https://github.com/GoogleCloudPlatform/artifact-registry-python-tools). Because these two packages are required to authenticate to Artifact Registry, they must be pre-installed from a source other than Artifact Registry.

The `keyrings.google-artifactregistry-auth` plugin wraps the [gcloud CLI](https://cloud.google.com/sdk/gcloud) to generate short-lived access tokens, securely store them in the system keyring, and refresh them when they expire.

uv only supports using the `keyring` package in [subprocess mode](../../reference/settings.md#keyring-provider). The `keyring` executable must be on the `PATH`, i.e., installed globally or in the active environment. The `keyring` CLI requires a username in the URL, and it must be `oauth2accesstoken`.

```bash
# Pre-install keyring and the Artifact Registry plugin from the public PyPI
uv tool install keyring --with keyrings.google-artifactregistry-auth

# Enable keyring authentication
export UV_KEYRING_PROVIDER=subprocess

# Set the username for the index
export UV_INDEX_PRIVATE_REGISTRY_USERNAME=oauth2accesstoken
```

!!! note

    The [`tool.uv.keyring-provider`](../../reference/settings.md#keyring-provider) setting can be used to enable keyring in your `uv.toml` or `pyproject.toml`. Similarly, the username for the index can be added directly to the index URL.
### Publishing packages to Google Artifact Registry

If you also want to publish your own packages to Google Artifact Registry, you can use `uv publish` as described in the [Building and publishing guide](../package.md).

First, add a `publish-url` to the index you want to publish packages to. For example:

```toml title="pyproject.toml" hl_lines="4"
[[tool.uv.index]]
name = "private-registry"
url = "https://<region>-python.pkg.dev/<project>/<repository>/simple/"
publish-url = "https://<region>-python.pkg.dev/<project>/<repository>/"
```

Then, configure credentials (if not using keyring):

```console
$ export UV_PUBLISH_USERNAME=oauth2accesstoken
$ export UV_PUBLISH_PASSWORD="$ARTIFACT_REGISTRY_TOKEN"
```

And publish the package:

```console
$ uv publish --index private-registry
```

To use `uv publish` without adding the `publish-url` to the project, you can set `UV_PUBLISH_URL`:

```console
$ export UV_PUBLISH_URL=https://<region>-python.pkg.dev/<project>/<repository>/
$ uv publish
```

Note that this method is not preferred because uv cannot check whether the package is already published before uploading artifacts.

## AWS CodeArtifact

uv can install packages from [AWS CodeArtifact](https://docs.aws.amazon.com/codeartifact/latest/ug/using-python.html), either by using an access token or by using the [`keyring`](https://github.com/jaraco/keyring) package.

!!! note

    This guide assumes that the [`awscli`](https://aws.amazon.com/cli/) is installed and authenticated.

The index can be declared like so:

```toml title="pyproject.toml"
[[tool.uv.index]]
name = "private-registry"
url = "https://<domain>-<account-id>.d.codeartifact.<region>.amazonaws.com/pypi/<repository>/simple/"
```

### Authenticate with an AWS access token

Credentials can be provided via the "Basic" HTTP authentication scheme. Include the access token in the password field of the URL. The username must be `aws`; otherwise, authentication will fail.

Generate a token with `awscli`:

```bash
export AWS_CODEARTIFACT_TOKEN="$(
  aws codeartifact get-authorization-token \
    --domain <domain> \
    --domain-owner <account-id> \
    --query authorizationToken \
    --output text
)"
```

!!! note

    You might need to pass extra parameters (like `--region`) to properly generate the token; this is a basic example.

Then set credentials for the index with:

```bash
export UV_INDEX_PRIVATE_REGISTRY_USERNAME=aws
export UV_INDEX_PRIVATE_REGISTRY_PASSWORD="$AWS_CODEARTIFACT_TOKEN"
```

!!! note

    `PRIVATE_REGISTRY` should match the name of the index defined in your `pyproject.toml`.

### Authenticate with `keyring` and `keyrings.codeartifact`

You can also authenticate to AWS CodeArtifact using the [`keyring`](https://github.com/jaraco/keyring) package with the [`keyrings.codeartifact` plugin](https://github.com/jmkeyes/keyrings.codeartifact). Because these two packages are required to authenticate to CodeArtifact, they must be pre-installed from a source other than CodeArtifact.

The `keyrings.codeartifact` plugin wraps [boto3](https://pypi.org/project/boto3/) to generate short-lived access tokens, securely store them in the system keyring, and refresh them when they expire.

uv only supports using the `keyring` package in [subprocess mode](../../reference/settings.md#keyring-provider). The `keyring` executable must be on the `PATH`, i.e., installed globally or in the active environment. The `keyring` CLI requires a username in the URL, and it must be `aws`.

```bash
# Pre-install keyring and the AWS CodeArtifact plugin from the public PyPI
uv tool install keyring --with keyrings.codeartifact

# Enable keyring authentication
export UV_KEYRING_PROVIDER=subprocess

# Set the username for the index
export UV_INDEX_PRIVATE_REGISTRY_USERNAME=aws
```

!!! note

    The [`tool.uv.keyring-provider`](../../reference/settings.md#keyring-provider) setting can be used to enable keyring in your `uv.toml` or `pyproject.toml`. Similarly, the username for the index can be added directly to the index URL.
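In subprocess mode, the lookup amounts to invoking the `keyring` CLI with the index URL and username and reading the password from standard output. A rough sketch of that flow (the helper and the stubbed runner are ours, for illustration only, so that no keyring backend is required to try it):

```python
import subprocess
from typing import Callable, Optional


def keyring_password(url: str, username: str,
                     run: Optional[Callable[[list[str]], str]] = None) -> Optional[str]:
    # Invoke `keyring get <url> <username>` and treat stdout as the password;
    # return None if the lookup fails or yields nothing.
    def _run(cmd: list[str]) -> str:
        return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

    try:
        out = (run or _run)(["keyring", "get", url, username])
    except Exception:
        return None
    return out.strip() or None


# Exercise the flow with a stub runner instead of the real CLI:
stub = lambda cmd: "s3cret\n" if cmd[3] == "aws" else ""
print(keyring_password("https://example.test/pypi/repo/simple/", "aws", run=stub))
# → s3cret
```

This is why the username must appear in the URL or be set via the environment: without it, there is nothing to pass as the second argument to `keyring get`.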
### Publishing packages to AWS CodeArtifact

If you also want to publish your own packages to AWS CodeArtifact, you can use `uv publish` as described in the [Building and publishing guide](../package.md).

First, add a `publish-url` to the index you want to publish packages to. For example:

```toml title="pyproject.toml" hl_lines="4"
[[tool.uv.index]]
name = "private-registry"
url = "https://<domain>-<account-id>.d.codeartifact.<region>.amazonaws.com/pypi/<repository>/simple/"
publish-url = "https://<domain>-<account-id>.d.codeartifact.<region>.amazonaws.com/pypi/<repository>/"
```

Then, configure credentials (if not using keyring):

```console
$ export UV_PUBLISH_USERNAME=aws
$ export UV_PUBLISH_PASSWORD="$AWS_CODEARTIFACT_TOKEN"
```

And publish the package:

```console
$ uv publish --index private-registry
```

To use `uv publish` without adding the `publish-url` to the project, you can set `UV_PUBLISH_URL`:

```console
$ export UV_PUBLISH_URL=https://<domain>-<account-id>.d.codeartifact.<region>.amazonaws.com/pypi/<repository>/
$ uv publish
```

Note that this method is not preferred because uv cannot check whether the package is already published before uploading artifacts.

## JFrog Artifactory

uv can install packages from JFrog Artifactory, either by using a username and password or a JWT token.

To use it, add the index to your project:

```toml title="pyproject.toml"
[[tool.uv.index]]
name = "private-registry"
url = "https://<organization>.jfrog.io/artifactory/api/pypi/<repository>/simple"
```

### Authenticate with username and password

```console
$ export UV_INDEX_PRIVATE_REGISTRY_USERNAME="<username>"
$ export UV_INDEX_PRIVATE_REGISTRY_PASSWORD="<password>"
```

### Authenticate with a JWT token

```console
$ export UV_INDEX_PRIVATE_REGISTRY_USERNAME="<username>"
$ export UV_INDEX_PRIVATE_REGISTRY_PASSWORD="$JFROG_JWT_TOKEN"
```

!!! note

    Replace `PRIVATE_REGISTRY` in the environment variable names with the actual index name defined in your `pyproject.toml`.

### Publishing packages to JFrog Artifactory

Add a `publish-url` to your index definition:

```toml title="pyproject.toml"
[[tool.uv.index]]
name = "private-registry"
url = "https://<organization>.jfrog.io/artifactory/api/pypi/<repository>/simple"
publish-url = "https://<organization>.jfrog.io/artifactory/api/pypi/<repository>"
```

!!! important

    If you use `--token "$JFROG_TOKEN"` or `UV_PUBLISH_TOKEN` with JFrog, you will receive a 401 Unauthorized error, as JFrog requires an empty username, but uv passes `__token__` as the username when `--token` is used.

To authenticate, pass your token as the password and set the username to an empty string:

```console
$ uv publish --index private-registry -u "" -p "$JFROG_TOKEN"
```

Alternatively, you can set environment variables:

```console
$ export UV_PUBLISH_USERNAME=""
$ export UV_PUBLISH_PASSWORD="$JFROG_TOKEN"
$ uv publish --index private-registry
```

!!! note

    The publish environment variables (`UV_PUBLISH_USERNAME` and `UV_PUBLISH_PASSWORD`) do not include the index name.

---
title: Using uv with AWS Lambda
description: A complete guide to using uv with AWS Lambda to manage Python dependencies and deploy serverless functions via Docker containers or zip archives.
---

# Using uv with AWS Lambda

[AWS Lambda](https://aws.amazon.com/lambda/) is a serverless computing service that lets you run code without provisioning or managing servers.

You can use uv with AWS Lambda to manage your Python dependencies, build your deployment package, and deploy your Lambda functions.

!!! tip

    Check out the [`uv-aws-lambda-example`](https://github.com/astral-sh/uv-aws-lambda-example) project for an example of best practices when using uv to deploy an application to AWS Lambda.
## Getting started

To start, assume we have a minimal FastAPI application with the following structure:

```plaintext
project
├── pyproject.toml
└── app
    ├── __init__.py
    └── main.py
```

Where the `pyproject.toml` contains:

```toml title="pyproject.toml"
[project]
name = "uv-aws-lambda-example"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = [
    # FastAPI is a modern web framework for building APIs with Python.
    "fastapi",
    # Mangum is a library that adapts ASGI applications to AWS Lambda and API Gateway.
    "mangum",
]

[dependency-groups]
dev = [
    # In development mode, include the FastAPI development server.
    "fastapi[standard]>=0.115",
]
```

And the `main.py` file contains:

```python title="app/main.py"
import logging

from fastapi import FastAPI
from mangum import Mangum

logger = logging.getLogger()
logger.setLevel(logging.INFO)

app = FastAPI()
handler = Mangum(app)


@app.get("/")
async def root() -> str:
    return "Hello, world!"
```

We can run this application locally with:

```console
$ uv run fastapi dev
```

From there, opening http://127.0.0.1:8000/ in a web browser will display "Hello, world!"

## Deploying a Docker image

To deploy to AWS Lambda, we need to build a container image that includes the application code and dependencies in a single output directory.

We'll follow the principles outlined in the [Docker guide](./docker.md) (in particular, a multi-stage build) to ensure that the final image is as small and cache-friendly as possible.

In the first stage, we'll populate a single directory with all application code and dependencies. In the second stage, we'll copy this directory over to the final image, omitting the build tools and other unnecessary files.

```dockerfile title="Dockerfile"
FROM ghcr.io/astral-sh/uv:0.8.4 AS uv

# First, bundle the dependencies into the task root.
FROM public.ecr.aws/lambda/python:3.13 AS builder

# Enable bytecode compilation, to improve cold-start performance.
ENV UV_COMPILE_BYTECODE=1

# Disable installer metadata, to create a deterministic layer.
ENV UV_NO_INSTALLER_METADATA=1

# Enable copy mode to support bind mount caching.
ENV UV_LINK_MODE=copy

# Bundle the dependencies into the Lambda task root via `uv pip install --target`.
#
# Omit any local packages (`--no-emit-workspace`) and development dependencies (`--no-dev`).
# This ensures that the Docker layer cache is only invalidated when the `pyproject.toml` or `uv.lock`
# files change, but remains robust to changes in the application code.
RUN --mount=from=uv,source=/uv,target=/bin/uv \
    --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv export --frozen --no-emit-workspace --no-dev --no-editable -o requirements.txt && \
    uv pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

FROM public.ecr.aws/lambda/python:3.13

# Copy the runtime dependencies from the builder stage.
COPY --from=builder ${LAMBDA_TASK_ROOT} ${LAMBDA_TASK_ROOT}

# Copy the application code.
COPY ./app ${LAMBDA_TASK_ROOT}/app

# Set the AWS Lambda handler.
CMD ["app.main.handler"]
```

!!! tip

    To deploy to ARM-based AWS Lambda runtimes, replace `public.ecr.aws/lambda/python:3.13` with `public.ecr.aws/lambda/python:3.13-arm64`.

We can build the image with, e.g.:

```console
$ uv lock
$ docker build -t fastapi-app .
```

The core benefits of this Dockerfile structure are as follows:

1. **Minimal image size.** By using a multi-stage build, we can ensure that the final image only includes the application code and dependencies. For example, the uv binary itself is not included in the final image.
2. **Maximal cache reuse.** By installing application dependencies separately from the application code, we can ensure that the Docker layer cache is only invalidated when the dependencies change.
Concretely, rebuilding the image after modifying the application source code can reuse the cached layers, resulting in millisecond builds:

```console
 => [internal] load build definition from Dockerfile                           0.0s
 => => transferring dockerfile: 1.31kB                                         0.0s
 => [internal] load metadata for public.ecr.aws/lambda/python:3.13             0.3s
 => [internal] load metadata for ghcr.io/astral-sh/uv:latest                   0.3s
 => [internal] load .dockerignore                                              0.0s
 => => transferring context: 106B                                              0.0s
 => [uv 1/1] FROM ghcr.io/astral-sh/uv:latest@sha256:ea61e006cfec0e8d81fae901ad703e09d2c6cf1aa58abcb6507d124b50286f 0.0s
 => [builder 1/2] FROM public.ecr.aws/lambda/python:3.13@sha256:f5b51b377b80bd303fe8055084e2763336ea8920d12955b23ef 0.0s
 => [internal] load build context                                              0.0s
 => => transferring context: 185B                                              0.0s
 => CACHED [builder 2/2] RUN --mount=from=uv,source=/uv,target=/bin/uv --mount=type=cache,target=/root/.cache/u 0.0s
 => CACHED [stage-2 2/3] COPY --from=builder /var/task /var/task               0.0s
 => CACHED [stage-2 3/3] COPY ./app /var/task                                  0.0s
 => exporting to image                                                         0.0s
 => => exporting layers                                                        0.0s
 => => writing image sha256:6f8f9ef715a7cda466b677a9df4046ebbb90c8e88595242ade3b4771f547652d 0.0s
```

After building, we can push the image to [Elastic Container Registry (ECR)](https://aws.amazon.com/ecr/) with, e.g.:

```console
$ aws ecr get-login-password --region region | docker login --username AWS --password-stdin aws_account_id.dkr.ecr.region.amazonaws.com
$ docker tag fastapi-app:latest aws_account_id.dkr.ecr.region.amazonaws.com/fastapi-app:latest
$ docker push aws_account_id.dkr.ecr.region.amazonaws.com/fastapi-app:latest
```

Finally, we can deploy the image to AWS Lambda using the AWS Management Console or the AWS CLI, e.g.:

```console
$ aws lambda create-function \
    --function-name myFunction \
    --package-type Image \
    --code ImageUri=aws_account_id.dkr.ecr.region.amazonaws.com/fastapi-app:latest \
    --role arn:aws:iam::111122223333:role/my-lambda-role
```

Where the [execution role](https://docs.aws.amazon.com/lambda/latest/dg/lambda-intro-execution-role.html#permissions-executionrole-api) is created via:

```console
$ aws iam create-role \
    --role-name my-lambda-role \
    --assume-role-policy-document '{"Version": "2012-10-17", "Statement": [{ "Effect": "Allow", "Principal": {"Service": "lambda.amazonaws.com"}, "Action": "sts:AssumeRole"}]}'
```

Or, update an existing function with:

```console
$ aws lambda update-function-code \
    --function-name myFunction \
    --image-uri aws_account_id.dkr.ecr.region.amazonaws.com/fastapi-app:latest \
    --publish
```

To test the Lambda, we can invoke it via the AWS Management Console or the AWS CLI, e.g.:

```console
$ aws lambda invoke \
    --function-name myFunction \
    --payload file://event.json \
    --cli-binary-format raw-in-base64-out \
    response.json
{
    "StatusCode": 200,
    "ExecutedVersion": "$LATEST"
}
```

Where `event.json` contains the event payload to pass to the Lambda function:

```json title="event.json"
{
  "httpMethod": "GET",
  "path": "/",
  "requestContext": {},
  "version": "1.0"
}
```

And `response.json` contains the response from the Lambda function:

```json title="response.json"
{
  "statusCode": 200,
  "headers": {
    "content-length": "14",
    "content-type": "application/json"
  },
  "multiValueHeaders": {},
  "body": "\"Hello, world!\"",
  "isBase64Encoded": false
}
```

For details, see the [AWS Lambda documentation](https://docs.aws.amazon.com/lambda/latest/dg/python-image.html).

### Workspace support

If a project includes local dependencies (e.g., via [Workspaces](../../concepts/projects/workspaces.md)), those too must be included in the deployment package.

We'll start by extending the above example to include a dependency on a locally-developed library named `library`.
First, we'll create the library itself:

```console
$ uv init --lib library
$ uv add ./library
```

Running `uv init` within the `project` directory will automatically convert `project` to a workspace and add `library` as a workspace member:

```toml title="pyproject.toml"
[project]
name = "uv-aws-lambda-example"
version = "0.1.0"
requires-python = ">=3.13"
dependencies = [
    # FastAPI is a modern web framework for building APIs with Python.
    "fastapi",
    # A local library.
    "library",
    # Mangum is a library that adapts ASGI applications to AWS Lambda and API Gateway.
    "mangum",
]

[dependency-groups]
dev = [
    # In development mode, include the FastAPI development server.
    "fastapi[standard]",
]

[tool.uv.workspace]
members = ["library"]

[tool.uv.sources]
library = { workspace = true }
```

By default, `uv init --lib` will create a package that exports a `hello` function. We'll modify the application source code to call that function:

```python title="app/main.py"
import logging

from fastapi import FastAPI
from mangum import Mangum

from library import hello

logger = logging.getLogger()
logger.setLevel(logging.INFO)

app = FastAPI()
handler = Mangum(app)


@app.get("/")
async def root() -> str:
    return hello()
```

We can run the modified application locally with:

```console
$ uv run fastapi dev
```

And confirm that opening http://127.0.0.1:8000/ in a web browser displays "Hello from library!" (instead of "Hello, world!").

Finally, we'll update the Dockerfile to include the local library in the deployment package:

```dockerfile title="Dockerfile"
FROM ghcr.io/astral-sh/uv:0.8.4 AS uv

# First, bundle the dependencies into the task root.
FROM public.ecr.aws/lambda/python:3.13 AS builder

# Enable bytecode compilation, to improve cold-start performance.
ENV UV_COMPILE_BYTECODE=1

# Disable installer metadata, to create a deterministic layer.
ENV UV_NO_INSTALLER_METADATA=1

# Enable copy mode to support bind mount caching.
ENV UV_LINK_MODE=copy

# Bundle the dependencies into the Lambda task root via `uv pip install --target`.
#
# Omit any local packages (`--no-emit-workspace`) and development dependencies (`--no-dev`).
# This ensures that the Docker layer cache is only invalidated when the `pyproject.toml` or `uv.lock`
# files change, but remains robust to changes in the application code.
RUN --mount=from=uv,source=/uv,target=/bin/uv \
    --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv export --frozen --no-emit-workspace --no-dev --no-editable -o requirements.txt && \
    uv pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# If you have a workspace, copy it over and install it too.
#
# By omitting `--no-emit-workspace`, `library` will be copied into the task root. Using a separate
# `RUN` command ensures that all third-party dependencies are cached separately and remain
# robust to changes in the workspace.
RUN --mount=from=uv,source=/uv,target=/bin/uv \
    --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    --mount=type=bind,source=library,target=library \
    uv export --frozen --no-dev --no-editable -o requirements.txt && \
    uv pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

FROM public.ecr.aws/lambda/python:3.13

# Copy the runtime dependencies from the builder stage.
COPY --from=builder ${LAMBDA_TASK_ROOT} ${LAMBDA_TASK_ROOT}

# Copy the application code.
COPY ./app ${LAMBDA_TASK_ROOT}/app

# Set the AWS Lambda handler.
CMD ["app.main.handler"]
```

!!! tip

    To deploy to ARM-based AWS Lambda runtimes, replace `public.ecr.aws/lambda/python:3.13` with `public.ecr.aws/lambda/python:3.13-arm64`.

From there, we can build and deploy the updated image as before.

## Deploying a zip archive

AWS Lambda also supports deployment via zip archives.
For simple applications, zip archives can be a more straightforward and efficient deployment method than Docker images; however, zip archives are limited to [250 MB](https://docs.aws.amazon.com/lambda/latest/dg/python-package.html#python-package-create-update) in size.

Returning to the FastAPI example, we can bundle the application dependencies into a local directory for AWS Lambda via:

```console
$ uv export --frozen --no-dev --no-editable -o requirements.txt
$ uv pip install \
    --no-installer-metadata \
    --no-compile-bytecode \
    --python-platform x86_64-manylinux2014 \
    --python 3.13 \
    --target packages \
    -r requirements.txt
```

!!! tip

    To deploy to ARM-based AWS Lambda runtimes, replace `x86_64-manylinux2014` with `aarch64-manylinux2014`.

Following the [AWS Lambda documentation](https://docs.aws.amazon.com/lambda/latest/dg/python-package.html), we can then bundle these dependencies into a zip as follows:

```console
$ cd packages
$ zip -r ../package.zip .
$ cd ..
```

Finally, we can add the application code to the zip archive:

```console
$ zip -r package.zip app
```

We can then deploy the zip archive to AWS Lambda via the AWS Management Console or the AWS CLI, e.g.:

```console
$ aws lambda create-function \
    --function-name myFunction \
    --runtime python3.13 \
    --zip-file fileb://package.zip \
    --handler app.main.handler \
    --role arn:aws:iam::111122223333:role/service-role/my-lambda-role
```

Where the [execution role](https://docs.aws.amazon.com/lambda/latest/dg/lambda-intro-execution-role.html#permissions-executionrole-api) is created via:

```console
$ aws iam create-role \
    --role-name my-lambda-role \
    --assume-role-policy-document '{"Version": "2012-10-17", "Statement": [{ "Effect": "Allow", "Principal": {"Service": "lambda.amazonaws.com"}, "Action": "sts:AssumeRole"}]}'
```

Or, update an existing function with:

```console
$ aws lambda update-function-code \
    --function-name myFunction \
    --zip-file fileb://package.zip
```

!!! note

    By default, the AWS Management Console assumes a Lambda entrypoint of `lambda_function.lambda_handler`. If your application uses a different entrypoint, you'll need to modify it in the AWS Management Console. For example, the above FastAPI application uses `app.main.handler`.

To test the Lambda, we can invoke it via the AWS Management Console or the AWS CLI, e.g.:

```console
$ aws lambda invoke \
    --function-name myFunction \
    --payload file://event.json \
    --cli-binary-format raw-in-base64-out \
    response.json
{
    "StatusCode": 200,
    "ExecutedVersion": "$LATEST"
}
```

Where `event.json` contains the event payload to pass to the Lambda function:

```json title="event.json"
{
  "httpMethod": "GET",
  "path": "/",
  "requestContext": {},
  "version": "1.0"
}
```

And `response.json` contains the response from the Lambda function:

```json title="response.json"
{
  "statusCode": 200,
  "headers": {
    "content-length": "14",
    "content-type": "application/json"
  },
  "multiValueHeaders": {},
  "body": "\"Hello, world!\"",
  "isBase64Encoded": false
}
```

### Using a Lambda layer

AWS Lambda also supports the deployment of multiple composed [Lambda layers](https://docs.aws.amazon.com/lambda/latest/dg/python-layers.html) when working with zip archives. These layers are conceptually similar to layers in a Docker image, allowing you to separate application code from dependencies.

In particular, we can create a Lambda layer for application dependencies and attach it to the Lambda function, separate from the application code itself. This setup can improve cold-start performance for application updates, as the dependencies layer can be reused across deployments.

To create a Lambda layer, we'll follow similar steps, but create two separate zip archives: one for the application code and one for the application dependencies.

First, we'll create the dependency layer.
Lambda layers are expected to follow a slightly different structure, so we'll use `--prefix` rather than `--target`:

```console
$ uv export --frozen --no-dev --no-editable -o requirements.txt
$ uv pip install \
    --no-installer-metadata \
    --no-compile-bytecode \
    --python-platform x86_64-manylinux2014 \
    --python 3.13 \
    --prefix packages \
    -r requirements.txt
```

We'll then zip the dependencies in adherence with the expected layout for Lambda layers:

```console
$ mkdir python
$ cp -r packages/lib python/
$ zip -r layer_content.zip python
```

!!! tip

    To generate deterministic zip archives, consider passing the `-X` flag to `zip` to exclude extended attributes and file system metadata.

And publish the Lambda layer:

```console
$ aws lambda publish-layer-version --layer-name dependencies-layer \
    --zip-file fileb://layer_content.zip \
    --compatible-runtimes python3.13 \
    --compatible-architectures "x86_64"
```

We can then create the Lambda function as in the previous example, omitting the dependencies:

```console
$ # Zip the application code.
$ zip -r app.zip app

$ # Create the Lambda function.
$ aws lambda create-function \
    --function-name myFunction \
    --runtime python3.13 \
    --zip-file fileb://app.zip \
    --handler app.main.handler \
    --role arn:aws:iam::111122223333:role/service-role/my-lambda-role
```

Finally, we can attach the dependencies layer to the Lambda function, using the ARN returned by the `publish-layer-version` step:

```console
$ aws lambda update-function-configuration --function-name myFunction \
    --cli-binary-format raw-in-base64-out \
    --layers "arn:aws:lambda:region:111122223333:layer:dependencies-layer:1"
```

When the application dependencies change, the layer can be updated independently of the application by republishing the layer and updating the Lambda function configuration:

```console
$ # Update the dependencies in the layer.
$ aws lambda publish-layer-version --layer-name dependencies-layer \ --zip-file fileb://layer_content.zip \ --compatible-runtimes python3.13 \ --compatible-architectures "x86_64" $ # Update the Lambda function configuration. $ aws lambda update-function-configuration --function-name myFunction \ --cli-binary-format raw-in-base64-out \ --layers "arn:aws:lambda:region:111122223333:layer:dependencies-layer:2" ``` --- title: Using uv with dependency bots description: A guide to using uv with dependency bots like Renovate and Dependabot. --- # Dependency bots It is considered best practice to update dependencies regularly, to avoid exposure to vulnerabilities, limit incompatibilities between dependencies, and avoid complex upgrades from badly outdated versions. A variety of tools can help you stay up to date by creating automated pull requests. Several of them support uv, or have work underway to support it. ## Renovate uv is supported by [Renovate](https://github.com/renovatebot/renovate). ### `uv.lock` output Renovate uses the presence of a `uv.lock` file to determine that uv is used for managing dependencies, and will suggest upgrades to [project dependencies](../../concepts/projects/dependencies.md#project-dependencies), [optional dependencies](../../concepts/projects/dependencies.md#optional-dependencies) and [development dependencies](../../concepts/projects/dependencies.md#development-dependencies). Renovate will update both the `pyproject.toml` and `uv.lock` files.
The lockfile can also be refreshed on a regular basis (for instance to update transitive dependencies) by enabling the [`lockFileMaintenance`](https://docs.renovatebot.com/configuration-options/#lockfilemaintenance) option: ```jsx title="renovate.json5" { $schema: "https://docs.renovatebot.com/renovate-schema.json", lockFileMaintenance: { enabled: true, }, } ``` ### Inline script metadata Renovate supports updating dependencies defined using [inline script metadata](../scripts.md#declaring-script-dependencies). Since it cannot automatically detect which Python files use inline script metadata, their locations need to be explicitly defined using [`fileMatch`](https://docs.renovatebot.com/configuration-options/#filematch), like so: ```jsx title="renovate.json5" { $schema: "https://docs.renovatebot.com/renovate-schema.json", pep723: { fileMatch: [ "scripts/generate_docs\\.py", "scripts/run_server\\.py", ], }, } ``` ## Dependabot Dependabot has announced support for uv, but there are some use cases that are not yet working. See [astral-sh/uv#2512](https://github.com/astral-sh/uv/issues/2512) for updates. Dependabot supports updating `uv.lock` files. To enable it, add the uv `package-ecosystem` to your `updates` list in the `dependabot.yml`: ```yaml title="dependabot.yml" version: 2 updates: - package-ecosystem: "uv" directory: "/" schedule: interval: "weekly" ``` --- title: Using uv in Docker description: A complete guide to using uv in Docker to manage Python dependencies while optimizing build times and image size via multi-stage builds, intermediate layers, and more. --- # Using uv in Docker ## Getting started !!! tip Check out the [`uv-docker-example`](https://github.com/astral-sh/uv-docker-example) project for an example of best practices when using uv to build an application in Docker.
uv provides both _distroless_ Docker images, which are useful for [copying uv binaries](#installing-uv) into your own image builds, and images derived from popular base images, which are useful for using uv in a container. The distroless images do not contain anything but the uv binaries. In contrast, the derived images include an operating system with uv pre-installed. As an example, to run uv in a container using a Debian-based image: ```console $ docker run --rm -it ghcr.io/astral-sh/uv:debian uv --help ``` ### Available images The following distroless images are available: - `ghcr.io/astral-sh/uv:latest` - `ghcr.io/astral-sh/uv:{major}.{minor}.{patch}`, e.g., `ghcr.io/astral-sh/uv:0.8.4` - `ghcr.io/astral-sh/uv:{major}.{minor}`, e.g., `ghcr.io/astral-sh/uv:0.8` (the latest patch version) And the following derived images are available: - Based on `alpine:3.21`: - `ghcr.io/astral-sh/uv:alpine` - `ghcr.io/astral-sh/uv:alpine3.21` - Based on `debian:bookworm-slim`: - `ghcr.io/astral-sh/uv:debian-slim` - `ghcr.io/astral-sh/uv:bookworm-slim` - Based on `buildpack-deps:bookworm`: - `ghcr.io/astral-sh/uv:debian` - `ghcr.io/astral-sh/uv:bookworm` - Based on `python3.x-alpine`: - `ghcr.io/astral-sh/uv:python3.14-rc-alpine` - `ghcr.io/astral-sh/uv:python3.13-alpine` - `ghcr.io/astral-sh/uv:python3.12-alpine` - `ghcr.io/astral-sh/uv:python3.11-alpine` - `ghcr.io/astral-sh/uv:python3.10-alpine` - `ghcr.io/astral-sh/uv:python3.9-alpine` - `ghcr.io/astral-sh/uv:python3.8-alpine` - Based on `python3.x-bookworm`: - `ghcr.io/astral-sh/uv:python3.14-rc-bookworm` - `ghcr.io/astral-sh/uv:python3.13-bookworm` - `ghcr.io/astral-sh/uv:python3.12-bookworm` - `ghcr.io/astral-sh/uv:python3.11-bookworm` - `ghcr.io/astral-sh/uv:python3.10-bookworm` - `ghcr.io/astral-sh/uv:python3.9-bookworm` - `ghcr.io/astral-sh/uv:python3.8-bookworm` - Based on `python3.x-slim-bookworm`: - `ghcr.io/astral-sh/uv:python3.14-rc-bookworm-slim` - `ghcr.io/astral-sh/uv:python3.13-bookworm-slim` - 
`ghcr.io/astral-sh/uv:python3.12-bookworm-slim` - `ghcr.io/astral-sh/uv:python3.11-bookworm-slim` - `ghcr.io/astral-sh/uv:python3.10-bookworm-slim` - `ghcr.io/astral-sh/uv:python3.9-bookworm-slim` - `ghcr.io/astral-sh/uv:python3.8-bookworm-slim` As with the distroless image, each derived image is published with uv version tags as `ghcr.io/astral-sh/uv:{major}.{minor}.{patch}-{base}` and `ghcr.io/astral-sh/uv:{major}.{minor}-{base}`, e.g., `ghcr.io/astral-sh/uv:0.8.4-alpine`. In addition, starting with `0.8` each derived image also sets `UV_TOOL_BIN_DIR` to `/usr/local/bin` to allow `uv tool install` to work as expected with the default user. For more details, see the [GitHub Container](https://github.com/astral-sh/uv/pkgs/container/uv) page. ### Installing uv Use one of the above images with uv pre-installed or install uv by copying the binary from the official distroless Docker image: ```dockerfile title="Dockerfile" FROM python:3.12-slim-bookworm COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/ ``` Or, with the installer: ```dockerfile title="Dockerfile" FROM python:3.12-slim-bookworm # The installer requires curl (and certificates) to download the release archive RUN apt-get update && apt-get install -y --no-install-recommends curl ca-certificates # Download the latest installer ADD https://astral.sh/uv/install.sh /uv-installer.sh # Run the installer then remove it RUN sh /uv-installer.sh && rm /uv-installer.sh # Ensure the installed binary is on the `PATH` ENV PATH="/root/.local/bin/:$PATH" ``` Note this requires `curl` to be available. In either case, it is best practice to pin to a specific uv version, e.g., with: ```dockerfile COPY --from=ghcr.io/astral-sh/uv:0.8.4 /uv /uvx /bin/ ``` !!! tip While the Dockerfile example above pins to a specific tag, it's also possible to pin a specific SHA256. Pinning a specific SHA256 is considered best practice in environments that require reproducible builds as tags can be moved across different commit SHAs. 
```Dockerfile # e.g., using a hash from a previous release COPY --from=ghcr.io/astral-sh/uv@sha256:2381d6aa60c326b71fd40023f921a0a3b8f91b14d5db6b90402e65a635053709 /uv /uvx /bin/ ``` Or, with the installer: ```dockerfile ADD https://astral.sh/uv/0.8.4/install.sh /uv-installer.sh ``` ### Installing a project If you're using uv to manage your project, you can copy it into the image and install it: ```dockerfile title="Dockerfile" # Copy the project into the image ADD . /app # Sync the project into a new environment, asserting the lockfile is up to date WORKDIR /app RUN uv sync --locked ``` !!! important It is best practice to add `.venv` to a [`.dockerignore` file](https://docs.docker.com/build/concepts/context/#dockerignore-files) in your repository to prevent it from being included in image builds. The project virtual environment is dependent on your local platform and should be created from scratch in the image. Then, to start your application by default: ```dockerfile title="Dockerfile" # Presuming there is a `my_app` command provided by the project CMD ["uv", "run", "my_app"] ``` !!! tip It is best practice to use [intermediate layers](#intermediate-layers) separating installation of dependencies and the project itself to improve Docker image build times. See a complete example in the [`uv-docker-example` project](https://github.com/astral-sh/uv-docker-example/blob/main/Dockerfile). ### Using the environment Once the project is installed, you can either _activate_ the project virtual environment by placing its binary directory at the front of the path: ```dockerfile title="Dockerfile" ENV PATH="/app/.venv/bin:$PATH" ``` Or, you can use `uv run` for any commands that require the environment: ```dockerfile title="Dockerfile" RUN uv run some_script.py ``` !!! 
tip Alternatively, the [`UV_PROJECT_ENVIRONMENT` setting](../../concepts/projects/config.md#project-environment-path) can be set before syncing to install to the system Python environment and skip environment activation entirely. ### Using installed tools To use installed tools, ensure the [tool bin directory](../../concepts/tools.md#the-bin-directory) is on the path: ```dockerfile title="Dockerfile" ENV PATH=/root/.local/bin:$PATH RUN uv tool install cowsay ``` ```console $ docker run -it $(docker build -q .) /bin/bash -c "cowsay -t hello" _____ | hello | ===== \ \ ^__^ (oo)\_______ (__)\ )\/\ ||----w | || || ``` !!! note The tool bin directory's location can be determined by running the `uv tool dir --bin` command in the container. Alternatively, it can be set to a constant location: ```dockerfile title="Dockerfile" ENV UV_TOOL_BIN_DIR=/opt/uv-bin/ ``` ### Installing Python in ARM musl images While uv will attempt to [install a compatible Python version](../install-python.md) if no such version is available in the image, uv does not yet support installing Python for musl Linux on ARM. For example, if you are using an Alpine Linux base image on an ARM machine, you may need to add it with the system package manager: ```shell apk add --no-cache python3~=3.12 ``` ## Developing in a container When developing, it's useful to mount the project directory into a container. With this setup, changes to the project can be immediately reflected in a containerized service without rebuilding the image. However, it is important _not_ to include the project virtual environment (`.venv`) in the mount, because the virtual environment is platform specific and the one built for the image should be kept. 
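Whether building images or mounting the project for development, the key is to keep the host's `.venv` out of the container. For image builds, a minimal `.dockerignore` along these lines (a sketch, extend it with your own entries) covers the recommendation made earlier:

```text title=".dockerignore"
# Exclude the project virtual environment from the build context;
# it is platform-specific and should be recreated inside the image.
.venv
```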
### Mounting the project with `docker run` Bind mount the project (in the working directory) to `/app` while retaining the `.venv` directory with an [anonymous volume](https://docs.docker.com/engine/storage/#volumes): ```console $ docker run --rm --volume .:/app --volume /app/.venv [...] ``` !!! tip The `--rm` flag is included to ensure the container and anonymous volume are cleaned up when the container exits. See a complete example in the [`uv-docker-example` project](https://github.com/astral-sh/uv-docker-example/blob/main/run.sh). ### Configuring `watch` with `docker compose` When using Docker compose, more sophisticated tooling is available for container development. The [`watch`](https://docs.docker.com/compose/file-watch/#compose-watch-versus-bind-mounts) option allows for greater granularity than is practical with a bind mount and supports triggering updates to the containerized service when files change. !!! note This feature requires Compose 2.22.0 which is bundled with Docker Desktop 4.24. Configure `watch` in your [Docker compose file](https://docs.docker.com/compose/compose-application-model/#the-compose-file) to mount the project directory without syncing the project virtual environment and to rebuild the image when the configuration changes: ```yaml title="compose.yaml" services: example: build: . # ... develop: # Create a `watch` configuration to update the app # watch: # Sync the working directory with the `/app` directory in the container - action: sync path: . target: /app # Exclude the project virtual environment ignore: - .venv/ # Rebuild the image on changes to the `pyproject.toml` - action: rebuild path: ./pyproject.toml ``` Then, run `docker compose watch` to run the container with the development setup. See a complete example in the [`uv-docker-example` project](https://github.com/astral-sh/uv-docker-example/blob/main/compose.yml). 
## Optimizations ### Compiling bytecode Compiling Python source files to bytecode is typically desirable for production images as it tends to improve startup time (at the cost of increased installation time). To enable bytecode compilation, use the `--compile-bytecode` flag: ```dockerfile title="Dockerfile" RUN uv sync --compile-bytecode ``` Alternatively, you can set the `UV_COMPILE_BYTECODE` environment variable to ensure that all commands within the Dockerfile compile bytecode: ```dockerfile title="Dockerfile" ENV UV_COMPILE_BYTECODE=1 ``` ### Caching A [cache mount](https://docs.docker.com/build/guide/mounts/#add-a-cache-mount) can be used to improve performance across builds: ```dockerfile title="Dockerfile" ENV UV_LINK_MODE=copy RUN --mount=type=cache,target=/root/.cache/uv \ uv sync ``` Changing the default [`UV_LINK_MODE`](../../reference/settings.md#link-mode) silences warnings about not being able to use hard links since the cache and sync target are on separate file systems. If you're not mounting the cache, image size can be reduced by using the `--no-cache` flag or setting `UV_NO_CACHE`. !!! note The cache directory's location can be determined by running the `uv cache dir` command in the container. Alternatively, the cache can be set to a constant location: ```dockerfile title="Dockerfile" ENV UV_CACHE_DIR=/opt/uv-cache/ ``` ### Intermediate layers If you're using uv to manage your project, you can improve build times by moving your transitive dependency installation into its own layer via the `--no-install` options. `uv sync --no-install-project` will install the dependencies of the project but not the project itself. Since the project changes frequently, but its dependencies are generally static, this can be a big time saver. 
```dockerfile title="Dockerfile" # Install uv FROM python:3.12-slim COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/ # Change the working directory to the `app` directory WORKDIR /app # Install dependencies RUN --mount=type=cache,target=/root/.cache/uv \ --mount=type=bind,source=uv.lock,target=uv.lock \ --mount=type=bind,source=pyproject.toml,target=pyproject.toml \ uv sync --locked --no-install-project # Copy the project into the image ADD . /app # Sync the project RUN --mount=type=cache,target=/root/.cache/uv \ uv sync --locked ``` Note that the `pyproject.toml` is required to identify the project root and name, but the project _contents_ are not copied into the image until the final `uv sync` command. !!! tip If you're using a [workspace](../../concepts/projects/workspaces.md), then use the `--no-install-workspace` flag, which excludes the project _and_ any workspace members. If you want to remove specific packages from the sync, use `--no-install-package <name>`. ### Non-editable installs By default, uv installs projects and workspace members in editable mode, such that changes to the source code are immediately reflected in the environment. `uv sync` and `uv run` both accept a `--no-editable` flag, which instructs uv to install the project in non-editable mode, removing any dependency on the source code. In the context of a multi-stage Docker image, `--no-editable` can be used to include the project in the synced virtual environment from one stage, then copy the virtual environment alone (and not the source code) into the final image.
For example: ```dockerfile title="Dockerfile" # Install uv FROM python:3.12-slim AS builder COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/ # Change the working directory to the `app` directory WORKDIR /app # Install dependencies RUN --mount=type=cache,target=/root/.cache/uv \ --mount=type=bind,source=uv.lock,target=uv.lock \ --mount=type=bind,source=pyproject.toml,target=pyproject.toml \ uv sync --locked --no-install-project --no-editable # Copy the project into the intermediate image ADD . /app # Sync the project RUN --mount=type=cache,target=/root/.cache/uv \ uv sync --locked --no-editable FROM python:3.12-slim # Copy the environment, but not the source code COPY --from=builder --chown=app:app /app/.venv /app/.venv # Run the application CMD ["/app/.venv/bin/hello"] ``` ### Using uv temporarily If uv isn't needed in the final image, the binary can be mounted in each invocation: ```dockerfile title="Dockerfile" RUN --mount=from=ghcr.io/astral-sh/uv,source=/uv,target=/bin/uv \ uv sync ``` ## Using the pip interface ### Installing a package The system Python environment is safe to use in this context, since a container is already isolated.
The `--system` flag can be used to install in the system environment: ```dockerfile title="Dockerfile" RUN uv pip install --system ruff ``` To use the system Python environment by default, set the `UV_SYSTEM_PYTHON` variable: ```dockerfile title="Dockerfile" ENV UV_SYSTEM_PYTHON=1 ``` Alternatively, a virtual environment can be created and activated: ```dockerfile title="Dockerfile" RUN uv venv /opt/venv # Use the virtual environment automatically ENV VIRTUAL_ENV=/opt/venv # Place entry points in the environment at the front of the path ENV PATH="/opt/venv/bin:$PATH" ``` When using a virtual environment, the `--system` flag should be omitted from uv invocations: ```dockerfile title="Dockerfile" RUN uv pip install ruff ``` ### Installing requirements To install requirements files, copy them into the container: ```dockerfile title="Dockerfile" COPY requirements.txt . RUN uv pip install -r requirements.txt ``` ### Installing a project When installing a project alongside requirements, it is best practice to separate copying the requirements from the rest of the source code. This allows the dependencies of the project (which do not change often) to be cached separately from the project itself (which changes very frequently). ```dockerfile title="Dockerfile" COPY pyproject.toml . RUN uv pip install -r pyproject.toml COPY . . RUN uv pip install -e . ``` ## Verifying image provenance The Docker images are signed during the build process to provide proof of their origin. These attestations can be used to verify that an image was produced from an official channel. 
For example, you can verify the attestations with the [GitHub CLI tool `gh`](https://cli.github.com/): ```console $ gh attestation verify --owner astral-sh oci://ghcr.io/astral-sh/uv:latest Loaded digest sha256:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx for oci://ghcr.io/astral-sh/uv:latest Loaded 1 attestation from GitHub API The following policy criteria will be enforced: - OIDC Issuer must match:................... https://token.actions.githubusercontent.com - Source Repository Owner URI must match:... https://github.com/astral-sh - Predicate type must match:................ https://slsa.dev/provenance/v1 - Subject Alternative Name must match regex: (?i)^https://github.com/astral-sh/ ✓ Verification succeeded! sha256:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx was attested by: REPO PREDICATE_TYPE WORKFLOW astral-sh/uv https://slsa.dev/provenance/v1 .github/workflows/build-docker.yml@refs/heads/main ``` This tells you that the specific Docker image was built by the official uv GitHub release workflow and hasn't been tampered with since. GitHub attestations build on the [sigstore.dev infrastructure](https://www.sigstore.dev/). As such you can also use the [`cosign` command](https://github.com/sigstore/cosign) to verify the attestation blob against the (multi-platform) manifest for `uv`: ```console $ REPO=astral-sh/uv $ gh attestation download --repo $REPO oci://ghcr.io/${REPO}:latest Wrote attestations to file sha256:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.jsonl. 
Any previous content has been overwritten The trusted metadata is now available at sha256:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.jsonl $ docker buildx imagetools inspect ghcr.io/${REPO}:latest --format "{{json .Manifest}}" > manifest.json $ cosign verify-blob-attestation \ --new-bundle-format \ --bundle "$(jq -r .digest manifest.json).jsonl" \ --certificate-oidc-issuer="https://token.actions.githubusercontent.com" \ --certificate-identity-regexp="^https://github\.com/${REPO}/.*" \ <(jq -j '.|del(.digest,.size)' manifest.json) Verified OK ``` !!! tip These examples use `latest`, but best practice is to verify the attestation for a specific version tag, e.g., `ghcr.io/astral-sh/uv:0.8.4`, or (even better) the specific image digest, such as `ghcr.io/astral-sh/uv:0.5.27@sha256:5adf09a5a526f380237408032a9308000d14d5947eafa687ad6c6a2476787b4f`. --- title: Using uv with FastAPI description: A guide to using uv with FastAPI to manage Python dependencies, run applications, and deploy with Docker. --- # Using uv with FastAPI [FastAPI](https://github.com/fastapi/fastapi) is a modern, high-performance Python web framework. You can use uv to manage your FastAPI project, including installing dependencies, managing environments, running FastAPI applications, and more. !!! note You can view the source code for this guide in the [uv-fastapi-example](https://github.com/astral-sh/uv-fastapi-example) repository. 
## Migrating an existing FastAPI project As an example, consider the sample application defined in the [FastAPI documentation](https://fastapi.tiangolo.com/tutorial/bigger-applications/), structured as follows: ```plaintext project └── app ├── __init__.py ├── main.py ├── dependencies.py ├── routers │ ├── __init__.py │ ├── items.py │ └── users.py └── internal ├── __init__.py └── admin.py ``` To use uv with this application, inside the `project` directory run: ```console $ uv init --app ``` This creates a [project with an application layout](../../concepts/projects/init.md#applications) and a `pyproject.toml` file. Then, add a dependency on FastAPI: ```console $ uv add fastapi --extra standard ``` You should now have the following structure: ```plaintext project ├── pyproject.toml └── app ├── __init__.py ├── main.py ├── dependencies.py ├── routers │ ├── __init__.py │ ├── items.py │ └── users.py └── internal ├── __init__.py └── admin.py ``` And the contents of the `pyproject.toml` file should look something like this: ```toml title="pyproject.toml" [project] name = "uv-fastapi-example" version = "0.1.0" description = "FastAPI project" readme = "README.md" requires-python = ">=3.12" dependencies = [ "fastapi[standard]", ] ``` From there, you can run the FastAPI application with: ```console $ uv run fastapi dev ``` `uv run` will automatically resolve and lock the project dependencies (i.e., create a `uv.lock` alongside the `pyproject.toml`), create a virtual environment, and run the command in that environment. Test the app by opening http://127.0.0.1:8000/?token=jessica in a web browser. ## Deployment To deploy the FastAPI application with Docker, you can use the following `Dockerfile`: ```dockerfile title="Dockerfile" FROM python:3.12-slim # Install uv. COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/ # Copy the application into the container. COPY . /app # Install the application dependencies. WORKDIR /app RUN uv sync --frozen --no-cache # Run the application. 
CMD ["/app/.venv/bin/fastapi", "run", "app/main.py", "--port", "80", "--host", "0.0.0.0"] ``` Build the Docker image with: ```console $ docker build -t fastapi-app . ``` Run the Docker container locally with: ```console $ docker run -p 8000:80 fastapi-app ``` Navigate to http://127.0.0.1:8000/?token=jessica in your browser to verify that the app is running correctly. !!! tip For more on using uv with Docker, see the [Docker guide](./docker.md). --- title: Using uv in GitHub Actions description: A guide to using uv in GitHub Actions, including installation, setting up Python, installing dependencies, and more. --- # Using uv in GitHub Actions ## Installation For use with GitHub Actions, we recommend the official [`astral-sh/setup-uv`](https://github.com/astral-sh/setup-uv) action, which installs uv, adds it to PATH, (optionally) persists the cache, and more, with support for all uv-supported platforms. To install the latest version of uv: ```yaml title="example.yml" hl_lines="11-12" name: Example jobs: uv-example: name: python runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: Install uv uses: astral-sh/setup-uv@v6 ``` It is considered best practice to pin to a specific uv version, e.g., with: ```yaml title="example.yml" hl_lines="14 15" name: Example jobs: uv-example: name: python runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: Install uv uses: astral-sh/setup-uv@v6 with: # Install a specific version of uv. version: "0.8.4" ``` ## Setting up Python Python can be installed with the `python install` command: ```yaml title="example.yml" hl_lines="14 15" name: Example jobs: uv-example: name: python runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: Install uv uses: astral-sh/setup-uv@v6 - name: Set up Python run: uv python install ``` This will respect the Python version pinned in the project. Alternatively, the official GitHub `setup-python` action can be used. 
This can be faster, because GitHub caches the Python versions alongside the runner. Set the [`python-version-file`](https://github.com/actions/setup-python/blob/main/docs/advanced-usage.md#using-the-python-version-file-input) option to use the pinned version for the project: ```yaml title="example.yml" hl_lines="14 15 16 17" name: Example jobs: uv-example: name: python runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: "Set up Python" uses: actions/setup-python@v5 with: python-version-file: ".python-version" - name: Install uv uses: astral-sh/setup-uv@v6 ``` Or, specify the `pyproject.toml` file to ignore the pin and use the latest version compatible with the project's `requires-python` constraint: ```yaml title="example.yml" hl_lines="17" name: Example jobs: uv-example: name: python runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: "Set up Python" uses: actions/setup-python@v5 with: python-version-file: "pyproject.toml" - name: Install uv uses: astral-sh/setup-uv@v6 ``` ## Multiple Python versions When using a matrix to test multiple Python versions, set the Python version using `astral-sh/setup-uv`, which will override the Python version specification in the `pyproject.toml` or `.python-version` files: ```yaml title="example.yml" hl_lines="17 18" jobs: build: name: continuous-integration runs-on: ubuntu-latest strategy: matrix: python-version: - "3.10" - "3.11" - "3.12" steps: - uses: actions/checkout@v4 - name: Install uv and set the python version uses: astral-sh/setup-uv@v6 with: python-version: ${{ matrix.python-version }} ``` If not using the `setup-uv` action, you can set the `UV_PYTHON` environment variable: ```yaml title="example.yml" hl_lines="12" jobs: build: name: continuous-integration runs-on: ubuntu-latest strategy: matrix: python-version: - "3.10" - "3.11" - "3.12" env: UV_PYTHON: ${{ matrix.python-version }} steps: - uses: actions/checkout@v4 ``` ## Syncing and running Once uv and Python are installed, the project 
can be installed with `uv sync` and commands can be run in the environment with `uv run`: ```yaml title="example.yml" hl_lines="17-22" name: Example jobs: uv-example: name: python runs-on: ubuntu-latest steps: - uses: actions/checkout@v4 - name: Install uv uses: astral-sh/setup-uv@v6 - name: Install the project run: uv sync --locked --all-extras --dev - name: Run tests # For example, using `pytest` run: uv run pytest tests ``` !!! tip The [`UV_PROJECT_ENVIRONMENT` setting](../../concepts/projects/config.md#project-environment-path) can be used to install to the system Python environment instead of creating a virtual environment. ## Caching It may improve CI times to store uv's cache across workflow runs. The [`astral-sh/setup-uv`](https://github.com/astral-sh/setup-uv) action has built-in support for persisting the cache: ```yaml title="example.yml" - name: Enable caching uses: astral-sh/setup-uv@v6 with: enable-cache: true ``` Alternatively, you can manage the cache manually with the `actions/cache` action: ```yaml title="example.yml" jobs: install_job: env: # Configure a constant location for the uv cache UV_CACHE_DIR: /tmp/.uv-cache steps: # ... set up Python and uv ... - name: Restore uv cache uses: actions/cache@v4 with: path: /tmp/.uv-cache key: uv-${{ runner.os }}-${{ hashFiles('uv.lock') }} restore-keys: | uv-${{ runner.os }}-${{ hashFiles('uv.lock') }} uv-${{ runner.os }} # ... install packages, run tests, etc ... - name: Minimize uv cache run: uv cache prune --ci ``` The `uv cache prune --ci` command is used to reduce the size of the cache and is optimized for CI. Its effect on performance is dependent on the packages being installed. !!! tip If using `uv pip`, use `requirements.txt` instead of `uv.lock` in the cache key. !!!
note [post-job-hook]: https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/running-scripts-before-or-after-a-job When using non-ephemeral, self-hosted runners, the default cache directory can grow unbounded. In this case, it may not be optimal to share the cache between jobs. Instead, move the cache inside the GitHub Workspace and remove it once the job finishes using a [Post Job Hook][post-job-hook]. ```yaml install_job: env: # Configure a relative location for the uv cache UV_CACHE_DIR: ${{ github.workspace }}/.cache/uv ``` Using a post job hook requires setting the `ACTIONS_RUNNER_HOOK_JOB_STARTED` environment variable on the self-hosted runner to the path of a cleanup script such as the one shown below. ```sh title="clean-uv-cache.sh" #!/usr/bin/env sh uv cache clean ``` ## Using `uv pip` If using the `uv pip` interface instead of the uv project interface, uv requires a virtual environment by default. To allow installing packages into the system environment, use the `--system` flag on all `uv` invocations or set the `UV_SYSTEM_PYTHON` variable. The `UV_SYSTEM_PYTHON` variable can be defined at different scopes. Opt in for the entire workflow by defining it at the top level: ```yaml title="example.yml" env: UV_SYSTEM_PYTHON: 1 jobs: ... ``` Or, opt in for a specific job in the workflow: ```yaml title="example.yml" jobs: install_job: env: UV_SYSTEM_PYTHON: 1 ... ``` Or, opt in for a specific step in a job: ```yaml title="example.yml" steps: - name: Install requirements run: uv pip install -r requirements.txt env: UV_SYSTEM_PYTHON: 1 ``` To opt out again, the `--no-system` flag can be used in any uv invocation. ## Private repos If your project has [dependencies](../../concepts/projects/dependencies.md#git) on private GitHub repositories, you will need to configure a [personal access token (PAT)][PAT] to allow uv to fetch them.
After creating a PAT that has read access to the private repositories, add it as a [repository secret]. Then, you can use the [`gh`](https://cli.github.com/) CLI (which is installed in GitHub Actions runners by default) to configure a [credential helper for Git](../../concepts/authentication.md#git-credential-helpers) to use the PAT for queries to repositories hosted on `github.com`. For example, if you called your repository secret `MY_PAT`: ```yaml title="example.yml" steps: - name: Register the personal access token run: echo "${{ secrets.MY_PAT }}" | gh auth login --with-token - name: Configure the Git credential helper run: gh auth setup-git ``` [PAT]: https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens [repository secret]: https://docs.github.com/en/actions/security-for-github-actions/security-guides/using-secrets-in-github-actions#creating-secrets-for-a-repository --- title: Using uv in GitLab CI/CD description: A guide to using uv in GitLab CI/CD, including installation, setting up Python, installing dependencies, and more. --- # Using uv in GitLab CI/CD ## Using the uv image Astral provides [Docker images](docker.md#available-images) with uv preinstalled. Select a variant that is suitable for your workflow. ```yaml title="gitlab-ci.yml" variables: UV_VERSION: "0.5" PYTHON_VERSION: "3.12" BASE_LAYER: bookworm-slim # GitLab CI creates a separate mountpoint for the build directory, # so we need to copy instead of using hard links. UV_LINK_MODE: copy uv: image: ghcr.io/astral-sh/uv:$UV_VERSION-python$PYTHON_VERSION-$BASE_LAYER script: # your `uv` commands ``` !!! note If you are using a distroless image, you have to specify the entrypoint: ```yaml uv: image: name: ghcr.io/astral-sh/uv:$UV_VERSION entrypoint: [""] # ... ``` ## Caching Persisting the uv cache between workflow runs can improve performance. 
```yaml
uv-install:
  variables:
    UV_CACHE_DIR: .uv-cache
  cache:
    - key:
        files:
          - uv.lock
      paths:
        - $UV_CACHE_DIR
  script:
    # Your `uv` commands
    - uv cache prune --ci
```

See the [GitLab caching documentation](https://docs.gitlab.com/ee/ci/caching/) for more details on configuring caching.

Using `uv cache prune --ci` at the end of the job is recommended to reduce cache size. See the [uv cache documentation](../../concepts/cache.md#caching-in-continuous-integration) for more details.

## Using `uv pip`

If using the `uv pip` interface instead of the uv project interface, uv requires a virtual environment by default. To allow installing packages into the system environment, use the `--system` flag on all uv invocations or set the `UV_SYSTEM_PYTHON` variable.

The `UV_SYSTEM_PYTHON` variable can be defined at different scopes. You can read more about how [variables and their precedence work in GitLab](https://docs.gitlab.com/ee/ci/variables/).

Opt in for the entire workflow by defining it at the top level:

```yaml title="gitlab-ci.yml"
variables:
  UV_SYSTEM_PYTHON: 1

# [...]
```

To opt out again, the `--no-system` flag can be used in any uv invocation.

When persisting the cache, you may want to use `requirements.txt` or `pyproject.toml` as your cache key files instead of `uv.lock`.

# Integration guides

Learn how to integrate uv with other software:

- [Using in Docker images](./docker.md)
- [Using with Jupyter notebooks](./jupyter.md)
- [Using with marimo notebooks](./marimo.md)
- [Using with pre-commit](./pre-commit.md)
- [Using in GitHub Actions](./github.md)
- [Using in GitLab CI/CD](./gitlab.md)
- [Using with alternative package indexes](./alternative-indexes.md)
- [Installing PyTorch](./pytorch.md)
- [Building a FastAPI application](./fastapi.md)
- [Using with AWS Lambda](./aws-lambda.md)

Or, explore the [concept documentation](../../concepts/index.md) for a comprehensive breakdown of each feature.
--- title: Using uv with Jupyter description: A complete guide to using uv with Jupyter notebooks for interactive computing, data analysis, and visualization, including kernel management and virtual environment integration. --- # Using uv with Jupyter The [Jupyter](https://jupyter.org/) notebook is a popular tool for interactive computing, data analysis, and visualization. You can use Jupyter with uv in a few different ways, either to interact with a project, or as a standalone tool. ## Using Jupyter within a project If you're working within a [project](../../concepts/projects/index.md), you can start a Jupyter server with access to the project's virtual environment via the following: ```console $ uv run --with jupyter jupyter lab ``` By default, `jupyter lab` will start the server at [http://localhost:8888/lab](http://localhost:8888/lab). Within a notebook, you can import your project's modules as you would in any other file in the project. For example, if your project depends on `requests`, `import requests` will import `requests` from the project's virtual environment. If you're looking for read-only access to the project's virtual environment, then there's nothing more to it. However, if you need to install additional packages from within the notebook, there are a few extra details to consider. ### Creating a kernel If you need to install packages from within the notebook, we recommend creating a dedicated kernel for your project. Kernels enable the Jupyter server to run in one environment, with individual notebooks running in their own, separate environments. In the context of uv, we can create a kernel for a project while installing Jupyter itself in an isolated environment, as in `uv run --with jupyter jupyter lab`. Creating a kernel for the project ensures that the notebook is hooked up to the correct environment, and that any packages installed from within the notebook are installed into the project's virtual environment. 
To create a kernel, you'll need to install `ipykernel` as a development dependency:

```console
$ uv add --dev ipykernel
```

Then, you can create the kernel for `project` with:

```console
$ uv run ipython kernel install --user --env VIRTUAL_ENV $(pwd)/.venv --name=project
```

From there, start the server with:

```console
$ uv run --with jupyter jupyter lab
```

When creating a notebook, select the `project` kernel from the dropdown. Then use `!uv add pydantic` to add `pydantic` to the project's dependencies, or `!uv pip install pydantic` to install `pydantic` into the project's virtual environment without persisting the change to the project's `pyproject.toml` or `uv.lock` files. Either command will make `import pydantic` work within the notebook.

### Installing packages without a kernel

If you don't want to create a kernel, you can still install packages from within the notebook. However, there are a few caveats to consider. Though `uv run --with jupyter` runs in an isolated environment, within the notebook itself, `!uv add` and related commands will modify the _project's_ environment, even without a kernel.

For example, running `!uv add pydantic` from within a notebook will add `pydantic` to the project's dependencies and virtual environment, such that `import pydantic` will work immediately, without further configuration or a server restart.

However, since the Jupyter server is the "active" environment, `!uv pip install` will install packages into _Jupyter's_ environment, not the project environment. Such dependencies will persist for the lifetime of the Jupyter server, but may disappear on subsequent `jupyter` invocations.

If you're working with a notebook that relies on pip (e.g., via the `%pip` magic), you can include pip in your project's virtual environment by running `uv venv --seed` prior to starting the Jupyter server.
For example, given: ```console $ uv venv --seed $ uv run --with jupyter jupyter lab ``` Subsequent `%pip install` invocations within the notebook will install packages into the project's virtual environment. However, such modifications will _not_ be reflected in the project's `pyproject.toml` or `uv.lock` files. ## Using Jupyter as a standalone tool If you ever need ad hoc access to a notebook (i.e., to run a Python snippet interactively), you can start a Jupyter server at any time with `uv tool run jupyter lab`. This will run a Jupyter server in an isolated environment. ## Using Jupyter with a non-project environment If you need to run Jupyter in a virtual environment that isn't associated with a [project](../../concepts/projects/index.md) (e.g., has no `pyproject.toml` or `uv.lock`), you can do so by adding Jupyter to the environment directly. For example: === "macOS and Linux" ```console $ uv venv --seed $ uv pip install pydantic $ uv pip install jupyterlab $ .venv/bin/jupyter lab ``` === "Windows" ```pwsh-session PS> uv venv --seed PS> uv pip install pydantic PS> uv pip install jupyterlab PS> .venv\Scripts\jupyter lab ``` From here, `import pydantic` will work within the notebook, and you can install additional packages via `!uv pip install`, or even `!pip install`. ## Using Jupyter from VS Code You can also engage with Jupyter notebooks from within an editor like VS Code. To connect a uv-managed project to a Jupyter notebook within VS Code, we recommend creating a kernel for the project, as in the following: ```console # Create a project. $ uv init project # Move into the project directory. $ cd project # Add ipykernel as a dev dependency. $ uv add --dev ipykernel # Open the project in VS Code. $ code . ``` Once the project directory is open in VS Code, you can create a new Jupyter notebook by selecting "Create: New Jupyter Notebook" from the command palette. 
When prompted to select a kernel, choose "Python Environments" and select the virtual environment you created earlier (e.g., `.venv/bin/python` on macOS and Linux, or `.venv\Scripts\python` on Windows). !!! note VS Code requires `ipykernel` to be present in the project environment. If you'd prefer to avoid adding `ipykernel` as a dev dependency, you can install it directly into the project environment with `uv pip install ipykernel`. If you need to manipulate the project's environment from within the notebook, you may need to add `uv` as an explicit development dependency: ```console $ uv add --dev uv ``` From there, you can use `!uv add pydantic` to add `pydantic` to the project's dependencies, or `!uv pip install pydantic` to install `pydantic` into the project's virtual environment without updating the project's `pyproject.toml` or `uv.lock` files. --- title: Using uv with marimo description: A complete guide to using uv with marimo notebooks for interactive computing, script execution, and data apps. --- # Using uv with marimo [marimo](https://github.com/marimo-team/marimo) is an open-source Python notebook that blends interactive computing with the reproducibility and reusability of traditional software, letting you version with Git, run as scripts, and share as apps. Because marimo notebooks are stored as pure Python scripts, they are able to integrate tightly with uv. You can readily use marimo as a standalone tool, as self-contained scripts, in projects, and in non-project environments. 
## Using marimo as a standalone tool For ad-hoc access to marimo notebooks, start a marimo server at any time in an isolated environment with: ```console $ uvx marimo edit ``` Start a specific notebook with: ```console $ uvx marimo edit my_notebook.py ``` ## Using marimo with inline script metadata Because marimo notebooks are stored as Python scripts, they can encapsulate their own dependencies using inline script metadata, via uv's [support for scripts](../../guides/scripts.md). For example, to add `numpy` as a dependency to your notebook, use this command: ```console $ uv add --script my_notebook.py numpy ``` To interactively edit a notebook containing inline script metadata, use: ```console $ uvx marimo edit --sandbox my_notebook.py ``` marimo will automatically use uv to start your notebook in an isolated virtual environment with your script's dependencies. Packages installed from the marimo UI will automatically be added to the notebook's script metadata. You can optionally run these notebooks as Python scripts, without opening an interactive session: ```console $ uv run my_notebook.py ``` ## Using marimo within a project If you're working within a [project](../../concepts/projects/index.md), you can start a marimo notebook with access to the project's virtual environment via the following command (assuming marimo is a project dependency): ```console $ uv run marimo edit my_notebook.py ``` To make additional packages available to your notebook, either add them to your project with `uv add`, or use marimo's built-in package installation UI, which will invoke `uv add` on your behalf. If marimo is not a project dependency, you can still run a notebook with the following command: ```console $ uv run --with marimo marimo edit my_notebook.py ``` This will let you import your project's modules while editing your notebook. 
However, packages installed via marimo's UI when running in this way will not be added to your project, and may disappear on subsequent marimo invocations. ## Using marimo in a non-project environment To run marimo in a virtual environment that isn't associated with a [project](../../concepts/projects/index.md), add marimo to the environment directly: ```console $ uv venv $ uv pip install numpy $ uv pip install marimo $ uv run marimo edit ``` From here, `import numpy` will work within the notebook, and marimo's UI installer will add packages to the environment with `uv pip install` on your behalf. ## Running marimo notebooks as scripts Regardless of how your dependencies are managed (with inline script metadata, within a project, or with a non-project environment), you can run marimo notebooks as scripts with: ```console $ uv run my_notebook.py ``` This executes your notebook as a Python script, without opening an interactive session in your browser. --- title: Using uv with pre-commit description: A guide to using uv with pre-commit to automatically update lock files, export requirements, and compile requirements files. --- # Using uv in pre-commit An official pre-commit hook is provided at [`astral-sh/uv-pre-commit`](https://github.com/astral-sh/uv-pre-commit). To use uv with pre-commit, add one of the following examples to the `repos` list in the `.pre-commit-config.yaml`. To make sure your `uv.lock` file is up to date even if your `pyproject.toml` file was changed: ```yaml title=".pre-commit-config.yaml" repos: - repo: https://github.com/astral-sh/uv-pre-commit # uv version. rev: 0.8.4 hooks: - id: uv-lock ``` To keep a `requirements.txt` file in sync with your `uv.lock` file: ```yaml title=".pre-commit-config.yaml" repos: - repo: https://github.com/astral-sh/uv-pre-commit # uv version. 
rev: 0.8.4 hooks: - id: uv-export ``` To compile requirements files: ```yaml title=".pre-commit-config.yaml" repos: - repo: https://github.com/astral-sh/uv-pre-commit # uv version. rev: 0.8.4 hooks: # Compile requirements - id: pip-compile args: [requirements.in, -o, requirements.txt] ``` To compile alternative requirements files, modify `args` and `files`: ```yaml title=".pre-commit-config.yaml" repos: - repo: https://github.com/astral-sh/uv-pre-commit # uv version. rev: 0.8.4 hooks: # Compile requirements - id: pip-compile args: [requirements-dev.in, -o, requirements-dev.txt] files: ^requirements-dev\.(in|txt)$ ``` To run the hook over multiple files at the same time, add additional entries: ```yaml title=".pre-commit-config.yaml" repos: - repo: https://github.com/astral-sh/uv-pre-commit # uv version. rev: 0.8.4 hooks: # Compile requirements - id: pip-compile name: pip-compile requirements.in args: [requirements.in, -o, requirements.txt] - id: pip-compile name: pip-compile requirements-dev.in args: [requirements-dev.in, -o, requirements-dev.txt] files: ^requirements-dev\.(in|txt)$ ``` --- title: Using uv with PyTorch description: A guide to using uv with PyTorch, including installing PyTorch, configuring per-platform and per-accelerator builds, and more. --- # Using uv with PyTorch The [PyTorch](https://pytorch.org/) ecosystem is a popular choice for deep learning research and development. You can use uv to manage PyTorch projects and PyTorch dependencies across different Python versions and environments, even controlling for the choice of accelerator (e.g., CPU-only vs. CUDA). !!! note Some of the features outlined in this guide require uv version 0.5.3 or later. We recommend upgrading prior to configuring PyTorch. ## Installing PyTorch From a packaging perspective, PyTorch has a few uncommon characteristics: - Many PyTorch wheels are hosted on a dedicated index, rather than the Python Package Index (PyPI). 
As such, installing PyTorch often requires configuring a project to use the PyTorch index.

- PyTorch produces distinct builds for each accelerator (e.g., CPU-only, CUDA). Since there's no standardized mechanism for specifying these accelerators when publishing or installing, PyTorch encodes them in the local version specifier. As such, PyTorch versions will often look like `2.5.1+cpu`, `2.5.1+cu121`, etc.
- Builds for different accelerators are published to different indexes. For example, the `+cpu` builds are published on https://download.pytorch.org/whl/cpu, while the `+cu121` builds are published on https://download.pytorch.org/whl/cu121.

As such, the necessary packaging configuration will vary depending on both the platforms you need to support and the accelerators you want to enable.

To start, consider the following (default) configuration, which would be generated by running `uv init --python 3.12` followed by `uv add torch torchvision`. In this case, PyTorch would be installed from PyPI, which hosts CPU-only wheels for Windows and macOS, and GPU-accelerated wheels on Linux (targeting CUDA 12.6):

```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
  "torch>=2.7.0",
  "torchvision>=0.22.0",
]
```

!!! tip "Supported Python versions"

    At time of writing, PyTorch does not yet publish wheels for Python 3.14; as such, projects with `requires-python = ">=3.14"` may fail to resolve. See the [compatibility matrix](https://github.com/pytorch/pytorch/blob/main/RELEASE.md#release-compatibility-matrix).

This is a valid configuration for projects that want to use CPU builds on Windows and macOS, and CUDA-enabled builds on Linux. However, if you need to support different platforms or accelerators, you'll need to configure the project accordingly.

## Using a PyTorch index

In some cases, you may want to use a specific PyTorch variant across all platforms. For example, you may want to use the CPU-only builds on Linux too.
In such cases, the first step is to add the relevant PyTorch index to your `pyproject.toml`: === "CPU-only" ```toml [[tool.uv.index]] name = "pytorch-cpu" url = "https://download.pytorch.org/whl/cpu" explicit = true ``` === "CUDA 11.8" ```toml [[tool.uv.index]] name = "pytorch-cu118" url = "https://download.pytorch.org/whl/cu118" explicit = true ``` === "CUDA 12.6" ```toml [[tool.uv.index]] name = "pytorch-cu126" url = "https://download.pytorch.org/whl/cu126" explicit = true ``` === "CUDA 12.8" ```toml [[tool.uv.index]] name = "pytorch-cu128" url = "https://download.pytorch.org/whl/cu128" explicit = true ``` === "ROCm6" ```toml [[tool.uv.index]] name = "pytorch-rocm" url = "https://download.pytorch.org/whl/rocm6.3" explicit = true ``` === "Intel GPUs" ```toml [[tool.uv.index]] name = "pytorch-xpu" url = "https://download.pytorch.org/whl/xpu" explicit = true ``` We recommend the use of `explicit = true` to ensure that the index is _only_ used for `torch`, `torchvision`, and other PyTorch-related packages, as opposed to generic dependencies like `jinja2`, which should continue to be sourced from the default index (PyPI). Next, update the `pyproject.toml` to point `torch` and `torchvision` to the desired index: === "CPU-only" ```toml [tool.uv.sources] torch = [ { index = "pytorch-cpu" }, ] torchvision = [ { index = "pytorch-cpu" }, ] ``` === "CUDA 11.8" PyTorch doesn't publish CUDA builds for macOS. As such, we gate on `sys_platform` to instruct uv to use the PyTorch index on Linux and Windows, but fall back to PyPI on macOS: ```toml [tool.uv.sources] torch = [ { index = "pytorch-cu118", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] torchvision = [ { index = "pytorch-cu118", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] ``` === "CUDA 12.6" PyTorch doesn't publish CUDA builds for macOS. 
As such, we gate on `sys_platform` to instruct uv to limit the PyTorch index to Linux and Windows, falling back to PyPI on macOS: ```toml [tool.uv.sources] torch = [ { index = "pytorch-cu126", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] torchvision = [ { index = "pytorch-cu126", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] ``` === "CUDA 12.8" PyTorch doesn't publish CUDA builds for macOS. As such, we gate on `sys_platform` to instruct uv to limit the PyTorch index to Linux and Windows, falling back to PyPI on macOS: ```toml [tool.uv.sources] torch = [ { index = "pytorch-cu128", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] torchvision = [ { index = "pytorch-cu128", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] ``` === "ROCm6" PyTorch doesn't publish ROCm6 builds for macOS or Windows. As such, we gate on `sys_platform` to instruct uv to limit the PyTorch index to Linux, falling back to PyPI on macOS and Windows: ```toml [tool.uv.sources] torch = [ { index = "pytorch-rocm", marker = "sys_platform == 'linux'" }, ] torchvision = [ { index = "pytorch-rocm", marker = "sys_platform == 'linux'" }, ] # ROCm6 support relies on `pytorch-triton-rocm`, which should also be installed from the PyTorch index # (and included in `project.dependencies`). pytorch-triton-rocm = [ { index = "pytorch-rocm", marker = "sys_platform == 'linux'" }, ] ``` === "Intel GPUs" PyTorch doesn't publish Intel GPU builds for macOS. 
As such, we gate on `sys_platform` to instruct uv to limit the PyTorch index to Linux and Windows, falling back to PyPI on macOS: ```toml [tool.uv.sources] torch = [ { index = "pytorch-xpu", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] torchvision = [ { index = "pytorch-xpu", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] # Intel GPU support relies on `pytorch-triton-xpu`, which should also be installed from the PyTorch index # (and included in `project.dependencies`). pytorch-triton-xpu = [ { index = "pytorch-xpu", marker = "sys_platform == 'linux' or sys_platform == 'win32'" }, ] ``` As a complete example, the following project would use PyTorch's CPU-only builds on all platforms: ```toml [project] name = "project" version = "0.1.0" requires-python = ">=3.12.0" dependencies = [ "torch>=2.7.0", "torchvision>=0.22.0", ] [tool.uv.sources] torch = [ { index = "pytorch-cpu" }, ] torchvision = [ { index = "pytorch-cpu" }, ] [[tool.uv.index]] name = "pytorch-cpu" url = "https://download.pytorch.org/whl/cpu" explicit = true ``` ## Configuring accelerators with environment markers In some cases, you may want to use CPU-only builds in one environment (e.g., macOS and Windows), and CUDA-enabled builds in another (e.g., Linux). With `tool.uv.sources`, you can use environment markers to specify the desired index for each platform. 
For example, the following configuration would use PyTorch's CUDA-enabled builds on Linux, and CPU-only builds on all other platforms (e.g., macOS and Windows): ```toml [project] name = "project" version = "0.1.0" requires-python = ">=3.12.0" dependencies = [ "torch>=2.7.0", "torchvision>=0.22.0", ] [tool.uv.sources] torch = [ { index = "pytorch-cpu", marker = "sys_platform != 'linux'" }, { index = "pytorch-cu128", marker = "sys_platform == 'linux'" }, ] torchvision = [ { index = "pytorch-cpu", marker = "sys_platform != 'linux'" }, { index = "pytorch-cu128", marker = "sys_platform == 'linux'" }, ] [[tool.uv.index]] name = "pytorch-cpu" url = "https://download.pytorch.org/whl/cpu" explicit = true [[tool.uv.index]] name = "pytorch-cu128" url = "https://download.pytorch.org/whl/cu128" explicit = true ``` Similarly, the following configuration would use PyTorch's AMD GPU builds on Linux, and CPU-only builds on Windows and macOS (by way of falling back to PyPI): ```toml [project] name = "project" version = "0.1.0" requires-python = ">=3.12.0" dependencies = [ "torch>=2.7.0", "torchvision>=0.22.0", "pytorch-triton-rocm>=3.3.0 ; sys_platform == 'linux'", ] [tool.uv.sources] torch = [ { index = "pytorch-rocm", marker = "sys_platform == 'linux'" }, ] torchvision = [ { index = "pytorch-rocm", marker = "sys_platform == 'linux'" }, ] pytorch-triton-rocm = [ { index = "pytorch-rocm", marker = "sys_platform == 'linux'" }, ] [[tool.uv.index]] name = "pytorch-rocm" url = "https://download.pytorch.org/whl/rocm6.3" explicit = true ``` Or, for Intel GPU builds: ```toml [project] name = "project" version = "0.1.0" requires-python = ">=3.12.0" dependencies = [ "torch>=2.7.0", "torchvision>=0.22.0", "pytorch-triton-xpu>=3.3.0 ; sys_platform == 'win32' or sys_platform == 'linux'", ] [tool.uv.sources] torch = [ { index = "pytorch-xpu", marker = "sys_platform == 'win32' or sys_platform == 'linux'" }, ] torchvision = [ { index = "pytorch-xpu", marker = "sys_platform == 'win32' or 
sys_platform == 'linux'" },
]
pytorch-triton-xpu = [
  { index = "pytorch-xpu", marker = "sys_platform == 'win32' or sys_platform == 'linux'" },
]

[[tool.uv.index]]
name = "pytorch-xpu"
url = "https://download.pytorch.org/whl/xpu"
explicit = true
```

## Configuring accelerators with optional dependencies

In some cases, you may want to use CPU-only builds in one configuration and CUDA-enabled builds in another, with the choice toggled by a user-provided extra (e.g., `uv sync --extra cpu` vs. `uv sync --extra cu128`).

With `tool.uv.sources`, you can use extra markers to specify the desired index for each enabled extra. For example, the following configuration would use PyTorch's CPU-only builds for `uv sync --extra cpu` and CUDA-enabled builds for `uv sync --extra cu128`:

```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12.0"
dependencies = []

[project.optional-dependencies]
cpu = [
  "torch>=2.7.0",
  "torchvision>=0.22.0",
]
cu128 = [
  "torch>=2.7.0",
  "torchvision>=0.22.0",
]

[tool.uv]
conflicts = [
  [
    { extra = "cpu" },
    { extra = "cu128" },
  ],
]

[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-cu128", extra = "cu128" },
]
torchvision = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-cu128", extra = "cu128" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu128"
url = "https://download.pytorch.org/whl/cu128"
explicit = true
```

!!! note

    Since GPU-accelerated builds aren't available on macOS, the above configuration will fail to install on macOS when the `cu128` extra is enabled.

## The `uv pip` interface

While the above examples are focused on uv's project interface (`uv lock`, `uv sync`, `uv run`, etc.), PyTorch can also be installed via the `uv pip` interface.
PyTorch itself offers a [dedicated interface](https://pytorch.org/get-started/locally/) to determine the appropriate pip command to run for a given target configuration. For example, you can install stable, CPU-only PyTorch on Linux with: ```shell $ pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu ``` To use the same workflow with uv, replace `pip3` with `uv pip`: ```shell $ uv pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu ``` ## Automatic backend selection uv supports automatic selection of the appropriate PyTorch index via the `--torch-backend=auto` command-line argument (or the `UV_TORCH_BACKEND=auto` environment variable), as in: ```shell $ # With a command-line argument. $ uv pip install torch --torch-backend=auto $ # With an environment variable. $ UV_TORCH_BACKEND=auto uv pip install torch ``` When enabled, uv will query for the installed CUDA driver, AMD GPU versions, and Intel GPU presence, then use the most-compatible PyTorch index for all relevant packages (e.g., `torch`, `torchvision`, etc.). If no such GPU is found, uv will fall back to the CPU-only index. uv will continue to respect existing index configuration for any packages outside the PyTorch ecosystem. You can also select a specific backend (e.g., CUDA 12.6) with `--torch-backend=cu126` (or `UV_TORCH_BACKEND=cu126`): ```shell $ # With a command-line argument. $ uv pip install torch torchvision --torch-backend=cu126 $ # With an environment variable. $ UV_TORCH_BACKEND=cu126 uv pip install torch torchvision ``` On Windows, Intel GPU (XPU) is not automatically selected with `--torch-backend=auto`, but you can manually specify it using `--torch-backend=xpu`: ```shell $ # Manual selection for Intel GPU. $ uv pip install torch torchvision --torch-backend=xpu ``` At present, `--torch-backend` is only available in the `uv pip` interface. 
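Whichever interface you use, the accelerator variant ends up encoded in the installed version's local tag, as described above (e.g., `2.5.1+cpu`, `2.5.1+cu121`). As a rough sketch (not a uv or PyTorch API, just string handling), you can split a version string on `+` to see which build is present:

```python
def torch_accelerator(version: str) -> str:
    """Extract the accelerator tag from a PyTorch version string.

    PyTorch encodes the accelerator in the local version specifier,
    e.g. "2.5.1+cu121" is a CUDA 12.1 build. Wheels that carry no
    local tag are treated here as the default build.
    """
    _, sep, local = version.partition("+")
    return local if sep else "default"

print(torch_accelerator("2.5.1+cpu"))    # cpu
print(torch_accelerator("2.5.1+cu121"))  # cu121
print(torch_accelerator("2.7.0"))        # default
```

In practice, you would inspect `torch.__version__` after installation to confirm that the expected backend was selected.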
# Migration guides

Learn how to migrate from other tools to uv:

- [Migrate from pip to uv projects](./pip-to-project.md)

!!! note

    Other guides, such as migrating from another project management tool, or from pip to `uv pip`, are not yet available. See [#5200](https://github.com/astral-sh/uv/issues/5200) to track progress.

Or, explore the [integration guides](../integration/index.md) to learn how to use uv with other software.

# Migrating from pip to a uv project

This guide will discuss converting from a `pip` and `pip-tools` workflow centered on `requirements` files to uv's project workflow using a `pyproject.toml` and `uv.lock` file.

!!! note

    If you're looking to migrate from `pip` and `pip-tools` to uv's drop-in interface, or from an existing workflow where you're already using a `pyproject.toml`, those guides are not yet written. See [#5200](https://github.com/astral-sh/uv/issues/5200) to track progress.

We'll start with an overview of developing with `pip`, then discuss migrating to uv.

!!! tip

    If you're familiar with the ecosystem, you can jump ahead to the [requirements file import](#importing-requirements-files) instructions.

## Understanding pip workflows

### Project dependencies

When you want to use a package in your project, you need to install it first. `pip` supports imperative installation of packages, e.g.:

```console
$ pip install fastapi
```

This installs the package into the environment that `pip` is installed in. This may be a virtual environment, or the global environment of your system's Python installation.

Then, you can run a Python script that requires the package:

```python title="example.py"
import fastapi
```

It's best practice to create a virtual environment for each project, to avoid mixing packages between them. For example:

```console
$ python -m venv .venv
$ source .venv/bin/activate
$ pip ...
```

We will revisit this topic in the [project environments section](#project-environments) below.
### Requirements files

When sharing projects with others, it's useful to declare all the packages you require upfront. `pip` supports installing requirements from a file, e.g.:

```python title="requirements.txt"
fastapi
```

```console
$ pip install -r requirements.txt
```

Notice above that `fastapi` is not "locked" to a specific version — each person working on the project may have a different version of `fastapi` installed.

`pip-tools` was created to improve this experience. When using `pip-tools`, requirements files specify both the dependencies for your project and lock dependencies to a specific version — the file extension is used to differentiate between the two. For example, if you require `fastapi` and `pydantic`, you'd specify these in a `requirements.in` file:

```python title="requirements.in"
fastapi
pydantic>2
```

Notice there's a version constraint on `pydantic` — this means only `pydantic` versions later than `2.0.0` can be used. In contrast, `fastapi` does not have a version constraint — any version can be used.

These dependencies can be compiled into a `requirements.txt` file:

```console
$ pip-compile requirements.in -o requirements.txt
```

```python title="requirements.txt"
annotated-types==0.7.0
    # via pydantic
anyio==4.8.0
    # via starlette
fastapi==0.115.11
    # via -r requirements.in
idna==3.10
    # via anyio
pydantic==2.10.6
    # via
    #   -r requirements.in
    #   fastapi
pydantic-core==2.27.2
    # via pydantic
sniffio==1.3.1
    # via anyio
starlette==0.46.1
    # via fastapi
typing-extensions==4.12.2
    # via
    #   fastapi
    #   pydantic
    #   pydantic-core
```

Here, all the version constraints are _exact_. Only a single version of each package can be used. The above example was generated with `uv pip compile`, but could also be generated with `pip-compile` from `pip-tools`.
Though less common, the `requirements.txt` can also be generated using `pip freeze`, by first installing the input dependencies into the environment, then exporting the installed versions:

```console
$ pip install -r requirements.in
$ pip freeze > requirements.txt
```

```python title="requirements.txt"
annotated-types==0.7.0
anyio==4.8.0
fastapi==0.115.11
idna==3.10
pydantic==2.10.6
pydantic-core==2.27.2
sniffio==1.3.1
starlette==0.46.1
typing-extensions==4.12.2
```

After compiling dependencies into a locked set of versions, these files are committed to version control and distributed with the project.

Then, when someone wants to use the project, they install from the requirements file:

```console
$ pip install -r requirements.txt
```

### Development dependencies

The requirements file format can only describe a single set of dependencies at once. This means if you have additional _groups_ of dependencies, such as development dependencies, they need separate files. For example, we'll create a `-dev` dependency file:

```python title="requirements-dev.in"
-r requirements.in
-c requirements.txt
pytest
```

Notice the base requirements are included with `-r requirements.in`. This ensures your development environment considers _all_ of the dependencies together. The `-c requirements.txt` _constrains_ the package versions to ensure that the `requirements-dev.txt` uses the same versions as `requirements.txt`.

!!! note

    It's common to use `-r requirements.txt` directly instead of using both `-r requirements.in` and `-c requirements.txt`. There's no difference in the resulting package versions, but using both files produces annotations which allow you to determine which dependencies are _direct_ (annotated with `-r requirements.in`) and which are _indirect_ (only annotated with `-c requirements.txt`).
The compiled development dependencies look like: ```python title="requirements-dev.txt" annotated-types==0.7.0 # via # -c requirements.txt # pydantic anyio==4.8.0 # via # -c requirements.txt # starlette fastapi==0.115.11 # via # -c requirements.txt # -r requirements.in idna==3.10 # via # -c requirements.txt # anyio iniconfig==2.0.0 # via pytest packaging==24.2 # via pytest pluggy==1.5.0 # via pytest pydantic==2.10.6 # via # -c requirements.txt # -r requirements.in # fastapi pydantic-core==2.27.2 # via # -c requirements.txt # pydantic pytest==8.3.5 # via -r requirements-dev.in sniffio==1.3.1 # via # -c requirements.txt # anyio starlette==0.46.1 # via # -c requirements.txt # fastapi typing-extensions==4.12.2 # via # -c requirements.txt # fastapi # pydantic # pydantic-core ``` As with the base dependency files, these are committed to version control and distributed with the project. When someone wants to work on the project, they'll install from the requirements file: ```console $ pip install -r requirements-dev.txt ``` ### Platform-specific dependencies When compiling dependencies with `pip` or `pip-tools`, the result is only usable on the same platform as it is generated on. This poses a problem for projects which need to be usable on multiple platforms, such as Windows and macOS. For example, take a simple dependency: ```python title="requirements.in" tqdm ``` On Linux, this compiles to: ```python title="requirements-linux.txt" tqdm==4.67.1 # via -r requirements.in ``` While on Windows, this compiles to: ```python title="requirements-win.txt" colorama==0.4.6 # via tqdm tqdm==4.67.1 # via -r requirements.in ``` `colorama` is a Windows-only dependency of `tqdm`. When using `pip` and `pip-tools`, a project needs to declare a requirements lock file for each supported platform. !!! 
note uv's resolver can compile dependencies for multiple platforms at once (see ["universal resolution"](../../concepts/resolution.md#universal-resolution)), allowing you to use a single `requirements.txt` for all platforms: ```console $ uv pip compile --universal requirements.in ``` ```python title="requirements.txt" colorama==0.4.6 ; sys_platform == 'win32' # via tqdm tqdm==4.67.1 # via -r requirements.in ``` This resolution mode is also used when using a `pyproject.toml` and `uv.lock`. ## Migrating to a uv project ### The `pyproject.toml` The `pyproject.toml` is a standardized file for Python project metadata. It replaces `requirements.in` files, allowing you to represent arbitrary groups of project dependencies. It also provides a centralized location for metadata about your project, such as the build system or tool settings. For example, the `requirements.in` and `requirements-dev.in` files above can be translated to a `pyproject.toml` as follows: ```toml title="pyproject.toml" [project] name = "example" version = "0.0.1" dependencies = [ "fastapi", "pydantic>2" ] [dependency-groups] dev = ["pytest"] ``` We'll discuss the commands necessary to automate these imports below. ### The uv lockfile uv uses a lockfile (`uv.lock`) to lock package versions. The format of this file is specific to uv, allowing uv to support advanced features. It replaces `requirements.txt` files. The lockfile will be automatically created and populated when adding dependencies, but you can explicitly create it with `uv lock`. Unlike `requirements.txt` files, the `uv.lock` file can represent arbitrary groups of dependencies, so multiple files are not needed to lock development dependencies. The uv lockfile is always [universal](../../concepts/resolution.md#universal-resolution), so multiple files are not needed to [lock dependencies for each platform](#platform-specific-dependencies). 
This ensures that all developers are using consistent, locked versions of dependencies regardless of their machine. The uv lockfile also supports concepts like [pinning packages to specific indexes](../../concepts/indexes.md#pinning-a-package-to-an-index), which is not representable in `requirements.txt` files. !!! tip If you only need to lock for a subset of platforms, use the [`tool.uv.environments`](../../concepts/resolution.md#limited-resolution-environments) setting to limit the resolution and lockfile. To learn more, see the [lockfile](../../concepts/projects/layout.md#the-lockfile) documentation. ### Importing requirements files First, create a `pyproject.toml` if you have not already: ```console $ uv init ``` Then, the easiest way to import requirements is with `uv add`: ```console $ uv add -r requirements.in ``` However, there is some nuance to this transition. Notice we used the `requirements.in` file, which does not pin to exact versions of packages, so uv will solve for new versions of these packages. You may want to continue using your previously locked versions from your `requirements.txt` so that, when switching over to uv, none of your dependency versions change. The solution is to add your locked versions as _constraints_. uv supports passing these to `uv add` to preserve the locked versions: ```console $ uv add -r requirements.in -c requirements.txt ``` Your existing versions will be retained when producing a `uv.lock` file. #### Importing platform-specific constraints If your platform-specific dependencies have been compiled into separate files, you can still transition to a universal lockfile. However, you cannot just use `-c` to specify constraints from your existing platform-specific `requirements.txt` files, because they do not include markers describing the environment and will consequently conflict. To add the necessary markers, use `uv pip compile` to convert your existing files. 
For example, given the following: ```python title="requirements-win.txt" colorama==0.4.6 # via tqdm tqdm==4.67.1 # via -r requirements.in ``` The markers can be added with: ```console $ uv pip compile requirements.in -o requirements-win.txt --python-platform windows --no-strip-markers ``` Notice the resulting output includes a Windows marker on `colorama`: ```python title="requirements-win.txt" colorama==0.4.6 ; sys_platform == 'win32' # via tqdm tqdm==4.67.1 # via -r requirements.in ``` When using `-o`, uv will constrain the versions to match the existing output file, if it can. Markers can be added for other platforms by changing the `--python-platform` and `-o` values for each requirements file you need to import, e.g., to `linux` and `macos`. Once each `requirements.txt` file has been transformed, the dependencies can be imported to the `pyproject.toml` and `uv.lock` with `uv add`: ```console $ uv add -r requirements.in -c requirements-win.txt -c requirements-linux.txt ``` #### Importing development dependency files As discussed in the [development dependencies](#development-dependencies) section, it's common to have groups of dependencies for development purposes. To import development dependencies, use the `--dev` flag during `uv add`: ```console $ uv add --dev -r requirements-dev.in -c requirements-dev.txt ``` If the `requirements-dev.in` includes the parent `requirements.in` via `-r`, it will need to be stripped to avoid adding the base requirements to the `dev` dependency group. The following example uses `sed` to strip lines that start with `-r`, then pipes the result to `uv add`: ```console $ sed '/^-r /d' requirements-dev.in | uv add --dev -r - -c requirements-dev.txt ``` In addition to the `dev` dependency group, uv supports arbitrary group names. 
For example, if you also have a dedicated set of dependencies for building your documentation, those can be imported to a `docs` group: ```console $ uv add -r requirements-docs.in -c requirements-docs.txt --group docs ``` ### Project environments Unlike `pip`, uv is not centered around the concept of an "active" virtual environment. Instead, uv uses a dedicated virtual environment for each project in a `.venv` directory. This environment is automatically managed, so when you run a command, like `uv add`, the environment is synced with the project dependencies. The preferred way to execute commands in the environment is with `uv run`, e.g.: ```console $ uv run pytest ``` Prior to every `uv run` invocation, uv will verify that the lockfile is up-to-date with the `pyproject.toml`, and that the environment is up-to-date with the lockfile, keeping your project in-sync without the need for manual intervention. `uv run` guarantees that your command is run in a consistent, locked environment. The project environment can also be explicitly created with `uv sync`, e.g., for use with editors. !!! note When in projects, uv will prefer a `.venv` in the project directory and ignore the active environment as declared by the `VIRTUAL_ENV` variable by default. You can opt-in to using the active environment with the `--active` flag. To learn more, see the [project environment](../../concepts/projects/layout.md#the-project-environment) documentation. ## Next steps Now that you've migrated to uv, take a look at the [project concept](../../concepts/projects/index.md) page for more details about uv projects. --- title: Building and publishing a package description: A guide to using uv to build and publish Python packages to a package index, like PyPI. --- # Building and publishing a package uv supports building Python packages into source and binary distributions via `uv build` and uploading them to a registry with `uv publish`. 
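As background for the sections below, the wheel (binary distribution) filenames that `uv build` produces encode compatibility information in a standard format. A sketch of the fields, assuming a simple name with no build-number segment (note that distribution names in wheel filenames are normalized to use underscores):

```python
# Split a wheel filename into its standard fields:
# {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
def parse_wheel_filename(filename: str) -> dict[str, str]:
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,      # e.g. "py3" for any Python 3
        "abi": abi_tag,            # "none" means no compiled extension ABI
        "platform": platform_tag,  # "any" means pure Python, any OS
    }

print(parse_wheel_filename("hello_world-0.1.0-py3-none-any.whl"))
```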
## Preparing your project for packaging Before attempting to publish your project, you'll want to make sure it's ready to be packaged for distribution. If your project does not include a `[build-system]` definition in the `pyproject.toml`, uv will not build it by default. This means that your project may not be ready for distribution. Read more about the effect of declaring a build system in the [project concept](../concepts/projects/config.md#build-systems) documentation. !!! note If you have internal packages that you do not want to be published, you can mark them as private: ```toml [project] classifiers = ["Private :: Do Not Upload"] ``` This classifier makes PyPI reject the package if you attempt to upload it. It does not affect security or privacy settings on alternative registries. We also recommend only generating [per-project PyPI API tokens](https://pypi.org/help/#apitoken): without a PyPI token matching the project, it can't be accidentally published. ## Building your package Build your package with `uv build`: ```console $ uv build ``` By default, `uv build` will build the project in the current directory, and place the built artifacts in a `dist/` subdirectory. Alternatively, `uv build <SRC>` will build the package in the specified directory, while `uv build --package <PACKAGE>` will build the specified package within the current workspace. !!! info By default, `uv build` respects `tool.uv.sources` when resolving build dependencies from the `build-system.requires` section of the `pyproject.toml`. When publishing a package, we recommend running `uv build --no-sources` to ensure that the package builds correctly when `tool.uv.sources` is disabled, as is the case when using other build tools, like [`pypa/build`](https://github.com/pypa/build). ## Updating your version The `uv version` command provides conveniences for updating the version of your package before you publish it. [See the project docs for reading your package's version](./projects.md#managing-version). 
To update to an exact version, provide it as a positional argument: ```console $ uv version 1.0.0 hello-world 0.7.0 => 1.0.0 ``` To preview the change without updating the `pyproject.toml`, use the `--dry-run` flag: ```console $ uv version 2.0.0 --dry-run hello-world 1.0.0 => 2.0.0 $ uv version hello-world 1.0.0 ``` To increase the version of your package semantically, use the `--bump` option: ```console $ uv version --bump minor hello-world 1.2.3 => 1.3.0 ``` The `--bump` option supports the following common version components: `major`, `minor`, `patch`, `stable`, `alpha`, `beta`, `rc`, `post`, and `dev`. When provided more than once, the components will be applied in order, from largest (`major`) to smallest (`dev`). To move from a stable to a pre-release version, bump one of the major, minor, or patch components in addition to the pre-release component: ```console $ uv version --bump patch --bump beta hello-world 1.3.0 => 1.3.1b1 $ uv version --bump major --bump alpha hello-world 1.3.0 => 2.0.0a1 ``` When moving from a pre-release to a new pre-release version, just bump the relevant pre-release component: ```console $ uv version --bump beta hello-world 1.3.1b1 => 1.3.1b2 ``` When moving from a pre-release to a stable version, the `stable` option can be used to clear the pre-release component: ```console $ uv version --bump stable hello-world 1.3.1b2 => 1.3.1 ``` !!! info By default, when `uv version` modifies the project it will perform a lock and sync. To prevent locking and syncing, use `--frozen`, or, to just prevent syncing, use `--no-sync`. ## Publishing your package Publish your package with `uv publish`: ```console $ uv publish ``` Set a PyPI token with `--token` or `UV_PUBLISH_TOKEN`, or set a username with `--username` or `UV_PUBLISH_USERNAME` and password with `--password` or `UV_PUBLISH_PASSWORD`. For publishing to PyPI from GitHub Actions, you don't need to set any credentials. 
Instead, [add a trusted publisher to the PyPI project](https://docs.pypi.org/trusted-publishers/adding-a-publisher/). !!! note PyPI no longer supports publishing with a username and password; instead, you need to generate a token. Using a token is equivalent to setting `--username __token__` and using the token as the password. If you're using a custom index through `[[tool.uv.index]]`, add `publish-url` and use `uv publish --index <name>`. For example: ```toml [[tool.uv.index]] name = "testpypi" url = "https://test.pypi.org/simple/" publish-url = "https://test.pypi.org/legacy/" explicit = true ``` !!! note When using `uv publish --index <name>`, the `pyproject.toml` must be present, i.e., you need to have a checkout step in a publish CI job. Even though `uv publish` retries failed uploads, it can happen that publishing fails in the middle, with some files uploaded and some files still missing. With PyPI, you can retry the exact same command; existing identical files will be ignored. With other registries, use `--check-url <index URL>` with the index URL (not the publishing URL) the packages belong to. When using `--index`, the index URL is used as the check URL. uv will skip uploading files that are identical to files in the registry, and it will also handle raced parallel uploads. Note that existing files need to match exactly with those previously uploaded to the registry; this avoids accidentally publishing source distributions and wheels with different contents for the same version. ## Installing your package Test that the package can be installed and imported with `uv run`: ```console $ uv run --with <PACKAGE> --no-project -- python -c "import <PACKAGE>" ``` The `--no-project` flag is used to avoid installing the package from your local project directory. !!! tip If you have recently installed the package, you may need to include the `--refresh-package <PACKAGE>` option to avoid using a cached version of the package. 
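The same import smoke test can also be expressed with the standard library's `importlib`, e.g., inside a test suite. A sketch — `example_pkg` is a hypothetical module name standing in for your package's import name:

```python
import importlib.util

# Check whether a top-level module can be imported, without importing it.
def can_import(module_name: str) -> bool:
    return importlib.util.find_spec(module_name) is not None

print(can_import("json"))         # True: the stdlib is always importable
print(can_import("example_pkg"))  # hypothetical name; False unless installed
```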
## Next steps To learn more about publishing packages, check out the [PyPA guides](https://packaging.python.org/en/latest/guides/section-build-and-publish/) on building and publishing. Or, read on for [guides](./integration/index.md) on integrating uv with other software. --- title: Working on projects description: A guide to using uv to create and manage Python projects, including adding dependencies, running commands, and building publishable distributions. --- # Working on projects uv supports managing Python projects, which define their dependencies in a `pyproject.toml` file. ## Creating a new project You can create a new Python project using the `uv init` command: ```console $ uv init hello-world $ cd hello-world ``` Alternatively, you can initialize a project in the working directory: ```console $ mkdir hello-world $ cd hello-world $ uv init ``` uv will create the following files: ```text ├── .gitignore ├── .python-version ├── README.md ├── main.py └── pyproject.toml ``` The `main.py` file contains a simple "Hello world" program. Try it out with `uv run`: ```console $ uv run main.py Hello from hello-world! ``` ## Project structure A project consists of a few important parts that work together and allow uv to manage your project. In addition to the files created by `uv init`, uv will create a virtual environment and `uv.lock` file in the root of your project the first time you run a project command, i.e., `uv run`, `uv sync`, or `uv lock`. A complete listing would look like: ```text . 
├── .venv │   ├── bin │   ├── lib │   └── pyvenv.cfg ├── .python-version ├── README.md ├── main.py ├── pyproject.toml └── uv.lock ``` ### `pyproject.toml` The `pyproject.toml` contains metadata about your project: ```toml title="pyproject.toml" [project] name = "hello-world" version = "0.1.0" description = "Add your description here" readme = "README.md" dependencies = [] ``` You'll use this file to specify dependencies, as well as details about the project such as its description or license. You can edit this file manually, or use commands like `uv add` and `uv remove` to manage your project from the terminal. !!! tip See the official [`pyproject.toml` guide](https://packaging.python.org/en/latest/guides/writing-pyproject-toml/) for more details on getting started with the `pyproject.toml` format. You'll also use this file to specify uv [configuration options](../concepts/configuration-files.md) in a [`[tool.uv]`](../reference/settings.md) section. ### `.python-version` The `.python-version` file contains the project's default Python version. This file tells uv which Python version to use when creating the project's virtual environment. ### `.venv` The `.venv` folder contains your project's virtual environment, a Python environment that is isolated from the rest of your system. This is where uv will install your project's dependencies. See the [project environment](../concepts/projects/layout.md#the-project-environment) documentation for more details. ### `uv.lock` `uv.lock` is a cross-platform lockfile that contains exact information about your project's dependencies. Unlike the `pyproject.toml` which is used to specify the broad requirements of your project, the lockfile contains the exact resolved versions that are installed in the project environment. This file should be checked into version control, allowing for consistent and reproducible installations across machines. 
`uv.lock` is a human-readable TOML file but is managed by uv and should not be edited manually. See the [lockfile](../concepts/projects/layout.md#the-lockfile) documentation for more details. ## Managing dependencies You can add dependencies to your `pyproject.toml` with the `uv add` command. This will also update the lockfile and project environment: ```console $ uv add requests ``` You can also specify version constraints or alternative sources: ```console $ # Specify a version constraint $ uv add 'requests==2.31.0' $ # Add a git dependency $ uv add git+https://github.com/psf/requests ``` If you're migrating from a `requirements.txt` file, you can use `uv add` with the `-r` flag to add all dependencies from the file: ```console $ # Add all dependencies from `requirements.txt`. $ uv add -r requirements.txt -c constraints.txt ``` To remove a package, you can use `uv remove`: ```console $ uv remove requests ``` To upgrade a package, run `uv lock` with the `--upgrade-package` flag: ```console $ uv lock --upgrade-package requests ``` The `--upgrade-package` flag will attempt to update the specified package to the latest compatible version, while keeping the rest of the lockfile intact. See the documentation on [managing dependencies](../concepts/projects/dependencies.md) for more details. ## Managing version The `uv version` command can be used to read your package's version. To get the version of your package, run `uv version`: ```console $ uv version hello-world 0.7.0 ``` To get the version without the package name, use the `--short` option: ```console $ uv version --short 0.7.0 ``` To get version information in a JSON format, use the `--output-format json` option: ```console $ uv version --output-format json { "package_name": "hello-world", "version": "0.7.0", "commit_info": null } ``` See the [publishing guide](./package.md#updating-your-version) for details on updating your package version. 
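The JSON output is convenient for scripting; for example, a release workflow can extract the version field. A sketch using the payload from the example above:

```python
import json

# Parse the output of `uv version --output-format json` and pull out
# the fields a release script might need.
payload = '{"package_name": "hello-world", "version": "0.7.0", "commit_info": null}'
info = json.loads(payload)
print(info["version"])      # 0.7.0
print(info["commit_info"])  # None (null in the JSON)
```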
## Running commands `uv run` can be used to run arbitrary scripts or commands in your project environment. Prior to every `uv run` invocation, uv will verify that the lockfile is up-to-date with the `pyproject.toml`, and that the environment is up-to-date with the lockfile, keeping your project in-sync without the need for manual intervention. `uv run` guarantees that your command is run in a consistent, locked environment. For example, to use `flask`: ```console $ uv add flask $ uv run -- flask run -p 3000 ``` Or, to run a script: ```python title="example.py" # Require a project dependency import flask print("hello world") ``` ```console $ uv run example.py ``` Alternatively, you can use `uv sync` to manually update the environment then activate it before executing a command: === "macOS and Linux" ```console $ uv sync $ source .venv/bin/activate $ flask run -p 3000 $ python example.py ``` === "Windows" ```pwsh-session PS> uv sync PS> .venv\Scripts\activate PS> flask run -p 3000 PS> python example.py ``` !!! note The virtual environment must be active to run scripts and commands in the project without `uv run`. Virtual environment activation differs per shell and platform. See the documentation on [running commands and scripts](../concepts/projects/run.md) in projects for more details. ## Building distributions `uv build` can be used to build source distributions and binary distributions (wheel) for your project. By default, `uv build` will build the project in the current directory, and place the built artifacts in a `dist/` subdirectory: ```console $ uv build $ ls dist/ hello-world-0.1.0-py3-none-any.whl hello-world-0.1.0.tar.gz ``` See the documentation on [building projects](../concepts/projects/build.md) for more details. ## Next steps To learn more about working on projects with uv, see the [projects concept](../concepts/projects/index.md) page and the [command reference](../reference/cli.md#uv). 
Or, read on to learn how to [build and publish your project to a package index](./package.md). --- title: Running scripts description: A guide to using uv to run Python scripts, including support for inline dependency metadata, reproducible scripts, and more. --- # Running scripts A Python script is a file intended for standalone execution, e.g., with `python