diff --git a/README.md b/README.md
index 9cc66c62..e1affe8d 100644
--- a/README.md
+++ b/README.md
@@ -15,8 +15,9 @@ vLLM Hardware Plugin for Intel® Gaudi®
 
 ---
 *Latest News* 🔥
+- [2025/12] Version 0.12.0 is now available, built on [vLLM 0.12.0](https://github.com/vllm-project/vllm/releases/tag/v0.12.0) and fully compatible with [Intel® Gaudi® v1.23.0](https://docs.habana.ai/en/v1.23.0/Release_Notes/GAUDI_Release_Notes.html).
+
 - [2025/11] The 0.11.2 release introduces the production-ready version of the vLLM Hardware Plugin for Intel® Gaudi® v1.22.2. The plugin is an alternative to the [vLLM fork](https://github.com/HabanaAI/vllm-fork), which reaches end of life with this release and will be deprecated in v1.24.0, remaining functional only for legacy use cases. We strongly encourage all fork users to begin planning their migration to the plugin. For more information about this release, see the [Release Notes](docs/release_notes.md).
-- [2025/06] We introduced an early developer preview of the vLLM Hardware Plugin for Intel® Gaudi®, which is not yet intended for general use.
 
 ---
 
diff --git a/docs/getting_started/installation.md b/docs/getting_started/installation.md
index 6d737733..3fda7498 100644
--- a/docs/getting_started/installation.md
+++ b/docs/getting_started/installation.md
@@ -18,7 +18,7 @@ Before you start, ensure that your environment meets the following requirements:
 
 - Python 3.10
 - Intel® Gaudi® 2 or 3 AI accelerator
-- Intel® Gaudi® software version 1.21.0 or later
+- Intel® Gaudi® software version {{ VERSION }} or later
 
 Additionally, ensure that the Gaudi execution environment is properly
 set up. If it is not, complete the setup by using the [Gaudi Installation
diff --git a/docs/getting_started/quickstart/quickstart.md b/docs/getting_started/quickstart/quickstart.md
index d4be03bd..4313c78f 100644
--- a/docs/getting_started/quickstart/quickstart.md
+++ b/docs/getting_started/quickstart/quickstart.md
@@ -26,7 +26,7 @@ Before you start, ensure that your environment meets the following requirements:
 - Ubuntu 22.04 or 24.04
 - Python 3.10
 - Intel® Gaudi® 2 or 3 AI accelerator
-- Intel® Gaudi® software version 1.21.0 or later
+- Intel® Gaudi® software version {{ VERSION }} or later
 
 Additionally, ensure that the Intel® Gaudi® execution environment is properly
 set up. If it is not, complete the setup by following the [Installation
diff --git a/docs/release_notes.md b/docs/release_notes.md
index 4a3a5fc2..ad407cc3 100644
--- a/docs/release_notes.md
+++ b/docs/release_notes.md
@@ -2,6 +2,14 @@
 
 This document provides an overview of the features, changes, and fixes introduced in each release of the vLLM Hardware Plugin for Intel® Gaudi®.
 
+## 0.12.0
+
+This release upgrades the plugin to [vLLM 0.12.0](https://github.com/vllm-project/vllm/releases/tag/v0.12.0) and adds support for [Intel® Gaudi® v1.23.0](https://docs.habana.ai/en/v1.23.0/Release_Notes/GAUDI_Release_Notes.html).
+
+### Changes
+
+Upgraded PyTorch to version 2.9.0, which requires updating existing environments.
+
 ## 0.11.2
 
 This version is based on [vLLM 0.11.2](https://github.com/vllm-project/vllm/releases/tag/v0.11.2) and supports [Intel® Gaudi® v1.22.2](https://docs.habana.ai/en/v1.22.2/Release_Notes/GAUDI_Release_Notes.html).
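
The release notes above call out the PyTorch 2.9.0 upgrade as the change most likely to require action on existing setups. As a minimal sketch (not part of the diff), assuming only that the standard `torch` package is importable in the target Gaudi environment, a quick check like the following can confirm whether an environment already provides the version that plugin 0.12.0 expects:

```python
# Hypothetical pre-upgrade check for an existing Gaudi environment.
# Assumes only that the standard `torch` package is importable; the
# exact build shipped with the Gaudi software stack may carry a local
# version suffix (e.g. "2.9.0+...").
import torch

EXPECTED = "2.9.0"  # PyTorch version named in the 0.12.0 release notes

installed = torch.__version__
if installed.split("+")[0].startswith(EXPECTED):
    print(f"PyTorch {installed}: matches the version expected by plugin 0.12.0")
else:
    print(f"PyTorch {installed}: update the environment to {EXPECTED} before installing plugin 0.12.0")
```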