vllm.envs ¶
__getattr__ ¶
__getattr__(name: str)
Gets environment variables lazily.
NOTE: After enable_envs_cache() is invoked (which happens after service initialization), all environment variables are cached.
Source code in vllm/envs.py
compile_factors ¶
Return env vars used for torch.compile cache keys.
Start with every known vLLM env var; drop entries in ignored_factors; hash everything else. This keeps the cache key aligned across workers.
Source code in vllm/envs.py
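The factoring described above can be sketched as follows. This is an illustrative re-implementation, not vLLM's actual code: the parameter names (`known`, `ignored_factors`) and the choice of SHA-256 are assumptions.

```python
import hashlib

def compile_factors(known: dict[str, str], ignored_factors: set[str]) -> str:
    """Hash every known env var except the ignored ones into a stable key.

    Sorting the items first makes the key deterministic across workers,
    regardless of dict insertion order.
    """
    factors = {k: v for k, v in sorted(known.items()) if k not in ignored_factors}
    blob = repr(factors).encode()
    return hashlib.sha256(blob).hexdigest()
```

Because ignored entries are dropped before hashing, two workers that differ only in an ignored variable still produce the same cache key.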
disable_envs_cache ¶
Resets the environment variables cache. It can be used to isolate environments between unit tests.
Source code in vllm/envs.py
enable_envs_cache ¶
Enables caching of environment variables. This is useful for performance reasons, as it avoids the need to re-evaluate environment variables on every call.
NOTE: Currently, it's invoked after service initialization to reduce runtime overhead. This also means that environment variables should NOT be updated after the service is initialized.
Source code in vllm/envs.py
env_list_with_choices ¶
env_list_with_choices(
env_name: str,
default: list[str],
choices: list[str] | Callable[[], list[str]],
case_sensitive: bool = True,
) -> Callable[[], list[str]]
Create a lambda that validates an environment variable containing comma-separated values against allowed choices.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
env_name | str | Name of the environment variable | required |
default | list[str] | Default list of values if not set | required |
choices | list[str] | Callable[[], list[str]] | List of valid string options or callable that returns list | required |
case_sensitive | bool | Whether validation should be case sensitive | True |
Returns:
| Type | Description |
|---|---|
Callable[[], list[str]] | Lambda function for environment_variables dict that returns list of strings
Source code in vllm/envs.py
env_set_with_choices ¶
env_set_with_choices(
env_name: str,
default: list[str],
choices: list[str] | Callable[[], list[str]],
case_sensitive: bool = True,
) -> Callable[[], set[str]]
Creates a lambda that validates an environment variable containing comma-separated values against allowed choices and returns them as a set.
Source code in vllm/envs.py
env_with_choices ¶
env_with_choices(
env_name: str,
default: str | None,
choices: list[str] | Callable[[], list[str]],
case_sensitive: bool = True,
) -> Callable[[], str | None]
Create a lambda that validates an environment variable against allowed choices.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
env_name | str | Name of the environment variable | required |
default | str | None | Default value if not set (can be None) | required |
choices | list[str] | Callable[[], list[str]] | List of valid string options or callable that returns list | required |
case_sensitive | bool | Whether validation should be case sensitive | True |
Returns:
| Type | Description |
|---|---|
Callable[[], str | None] | Lambda function for environment_variables dict |
Source code in vllm/envs.py
get_env_or_set_default ¶
Create a lambda that returns an environment variable value if set, or generates and sets a default value using the provided factory function.
Source code in vllm/envs.py
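A minimal sketch of the described behavior, assuming the factory produces a string; the parameter name `factory` and the write-back to `os.environ` (so child processes inherit the same value) are assumptions, not guaranteed details of the real helper.

```python
import os
from typing import Callable

def get_env_or_set_default(
    env_name: str, factory: Callable[[], str]
) -> Callable[[], str]:
    def getter() -> str:
        if env_name not in os.environ:
            # Generate the default once and persist it in the environment.
            os.environ[env_name] = factory()
        return os.environ[env_name]
    return getter
```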
get_vllm_port ¶
get_vllm_port() -> int | None
Get the port from VLLM_PORT environment variable.
Returns:
| Type | Description |
|---|---|
int | None | The port number as an integer if VLLM_PORT is set, None otherwise. |
Raises:
| Type | Description |
|---|---|
ValueError | If VLLM_PORT is a URI, which suggests a Kubernetes service discovery issue. |