Releases: deepset-ai/hayhooks
v1.4.0
Streaming updates
We've made some notable improvements to streaming:
- Better streaming performance on concurrent requests
- Support streaming from multiple components - docs
- Using the `on_pipeline_end` callback - docs (see the sketch after this list)
- Accessing pipeline intermediate outputs from streaming helpers - docs
- Support for hybrid streaming on async pipelines with sync-only components - docs
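Below is a minimal sketch of wiring up the new callback. It assumes `streaming_generator` accepts `pipeline` and `pipeline_run_args` as in the examples further down, that the callback is passed via an `on_pipeline_end` keyword receiving the final pipeline result, and that the deployed pipeline has a `prompt_builder` component; check the linked docs for the exact signature.

```python
from pathlib import Path

from haystack import Pipeline
from hayhooks import BasePipelineWrapper, get_last_user_message, streaming_generator


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # "chat_pipeline.yml" is an illustrative file name for a serialized pipeline
        # containing a "prompt_builder" component and a streaming-capable LLM.
        pipeline_yaml = (Path(__file__).parent / "chat_pipeline.yml").read_text()
        self.pipeline = Pipeline.loads(pipeline_yaml)

    def run_chat_completion(self, model: str, messages: list[dict], body: dict):
        question = get_last_user_message(messages)

        # Hypothetical callback invoked once streaming has finished, receiving the
        # final pipeline result (the exact signature may differ - see the docs).
        def on_pipeline_end(result: dict) -> None:
            print("Pipeline finished with outputs:", list(result.keys()))

        return streaming_generator(
            pipeline=self.pipeline,
            pipeline_run_args={"prompt_builder": {"question": question}},
            on_pipeline_end=on_pipeline_end,  # assumed keyword argument
        )
```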
What's Changed
- Add support to automatic "hybrid" streaming by @mpangrazzi in #182
- Docs update by @mpangrazzi in #183
- chore: Use builtins for list and dict where possible by @sjrl in #186
Full Changelog: v1.3.0...v1.4.0
v1.3.0
What's Changed
- Add `include_outputs_from` support on pipelines execution by @mpangrazzi in #180
- Change Hayhooks docs branding by @bilgeyucel in #181
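For context, `include_outputs_from` is the standard Haystack `Pipeline.run` parameter that surfaces intermediate component outputs alongside the final result; how Hayhooks exposes it is described in #180 and the docs. A small self-contained sketch of the underlying behaviour (component names and prompt are illustrative):

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

pipeline = Pipeline()
pipeline.add_component(
    "prompt_builder",
    ChatPromptBuilder(template=[ChatMessage.from_user("Answer briefly: {{ question }}")]),
)
pipeline.add_component("llm", OpenAIChatGenerator())
pipeline.connect("prompt_builder.prompt", "llm.messages")

# include_outputs_from makes the result include the named components' outputs
# (here, the rendered prompt) in addition to the final LLM reply.
result = pipeline.run(
    {"prompt_builder": {"question": "What is Hayhooks?"}},
    include_outputs_from={"prompt_builder"},
)
print(result["llm"]["replies"][0].text)
print(result["prompt_builder"]["prompt"])
```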
Full Changelog: v1.2.0...v1.3.0
v1.2.0
New features
We now support streaming from multiple components; see the related docs for more details!
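A rough sketch of what this enables: a wrapper whose pipeline contains two streaming-capable LLM components, both of which stream through `streaming_generator`. Component names, prompts, and the helper's keyword arguments follow the patterns used in the other examples here and are illustrative; see the linked docs for the supported setup.

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from hayhooks import BasePipelineWrapper, get_last_user_message, streaming_generator


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # Two streaming-capable LLMs: one drafts an answer, one polishes it.
        pipe = Pipeline()
        pipe.add_component(
            "draft_prompt",
            ChatPromptBuilder(template=[ChatMessage.from_user("Draft a short answer: {{ question }}")]),
        )
        pipe.add_component("draft_llm", OpenAIChatGenerator())
        pipe.add_component(
            "refine_prompt",
            ChatPromptBuilder(template=[ChatMessage.from_user("Polish this draft: {{ draft }}")]),
        )
        pipe.add_component("refine_llm", OpenAIChatGenerator())
        pipe.connect("draft_prompt.prompt", "draft_llm.messages")
        pipe.connect("draft_llm.replies", "refine_prompt.draft")
        pipe.connect("refine_prompt.prompt", "refine_llm.messages")
        self.pipeline = pipe

    def run_chat_completion(self, model: str, messages: list[dict], body: dict):
        question = get_last_user_message(messages)
        # Chunks from both draft_llm and refine_llm are forwarded to the client
        # in the order the pipeline produces them.
        return streaming_generator(
            pipeline=self.pipeline,
            pipeline_run_args={"draft_prompt": {"question": question}},
        )
```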
What's Changed
- Support streaming from multiple pipeline components by @mpangrazzi in #178
Full Changelog: v1.1.0...v1.2.0
v1.1.0
What's Changed
- Hayhooks documentation website by @mpangrazzi in #167
- fix docs to use `deepset/hayhooks:main` Docker image by @anakin87 in #169
- docs: add warning about OpenAIChatGenerator when using Open WebUI events by @anakin87 in #170
- chore: Remove email address from pyproject.toml by @julian-risch in #171
- Update haystack-ai dep version by @mpangrazzi in #172
- chore: Update license.md with additional legal information by @julian-risch in #173
- Ensure UTF-8 is properly supported by @mpangrazzi in #175
- Fix docs links in README.md by @sjrl in #177
- feat: When using `streaming_generator` return final pipeline result as a final streaming chunk by @sjrl in #162
New Contributors
- @julian-risch made their first contribution in #171
- @sjrl made their first contribution in #177
Full Changelog: v1.0.1...v1.1.0
v1.0.1
What's Changed
- Fix CLI command for upload files managing correctly file objs by @mpangrazzi in #165
Full Changelog: v1.0.0...v1.0.1
v1.0.0
New features
- We have rewritten support for deploying YAML-serialized Haystack agents and pipelines.
Breaking changes
- We've removed the `hayhooks pipeline deploy` command in favour of `hayhooks pipeline deploy-files` (for `PipelineWrapper`-based deployments) and `hayhooks pipeline deploy-yaml` (for deploying YAML-serialized pipelines).
Extra
- We made linting and type checking more consistent with Haystack's.
- We iterated on the Docker image, reducing its size.
What's Changed
- Docker image improvements by @mpangrazzi in #153
- Docker image improvements by @mpangrazzi in #154
- Raise a `PipelineWrapperError` if `run_api` or `run_api_async` have no return type by @mpangrazzi in #152 (see the sketch after this list)
- chore: adopt ruff rules from Haystack and enforce them by @anakin87 in #155
- chore: set required_variables in Prompt Builders to reduce noise in logs by @anakin87 in #158
- ci: make mypy run on PRs + fix some types by @anakin87 in #157
- Make mypy ignore errors in tests.* modules by @mpangrazzi in #160
- Breaking: Remove legacy `hayhooks pipeline deploy`; introduce `hayhooks pipeline deploy-yaml` to deploy YAML pipelines with required inputs/outputs fields by @mpangrazzi in #161
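For the return-type change in #152, this is roughly what a compliant wrapper looks like: the explicit annotation on `run_api` (or `run_api_async`) is what gets checked. A minimal sketch, assuming an illustrative serialized pipeline with `prompt_builder` and `llm` components:

```python
from pathlib import Path

from haystack import Pipeline
from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # "pipeline.yml" is an illustrative serialized pipeline file.
        self.pipeline = Pipeline.loads((Path(__file__).parent / "pipeline.yml").read_text())

    # run_api / run_api_async must declare a return type (here `-> str`);
    # otherwise Hayhooks raises a PipelineWrapperError (#152).
    def run_api(self, question: str) -> str:
        result = self.pipeline.run({"prompt_builder": {"question": question}})
        return result["llm"]["replies"][0].text
```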
Full Changelog: v0.10.1...v1.0.0
v0.10.1
What's Changed
- chore: fix typos in README by @anakin87 in #147
- Fix RAG example by @mpangrazzi in #148
- Force `media_type` on `Form` to make Swagger use `multipart/form-data` content-type by @mpangrazzi in #149
- Add a /status endpoint to MCP server by @mpangrazzi in #150
- Refactored imports to lower import time and speedup CLI by @mpangrazzi in #151
Full Changelog: v0.10.0...v0.10.1
v0.10.0
Agents support
Agents are now first-class citizens in Hayhooks. Check how easy it is to deploy an async Agent!
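A minimal sketch of what an async Agent deployment can look like, wrapping a tool-less Haystack `Agent` in a `PipelineWrapper`. It assumes your Haystack version provides `Agent.run_async`; the prompt is illustrative and tools are omitted for brevity:

```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from hayhooks import BasePipelineWrapper


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # A tool-less Agent for brevity; pass tools=[...] in a real deployment.
        self.agent = Agent(
            chat_generator=OpenAIChatGenerator(),
            system_prompt="You are a helpful assistant.",
        )
        self.agent.warm_up()

    async def run_api_async(self, question: str) -> str:
        # Agent.run_async is assumed to be available in your Haystack version.
        result = await self.agent.run_async(messages=[ChatMessage.from_user(question)])
        return result["messages"][-1].text
```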
Support for open-webui events
You can now trigger open-webui events to improve the UX when deploying pipelines or agents.
Check the full code example!
Added hooks for intercepting Tool calls
- We've also added hooks to intercept tool calls (so you can send events from there as well!).
Check the full code example!
Example index
You can now get a better overview of the existing examples here.
MCP
We've added support for the Streamable HTTP transport to the Hayhooks MCP Server.
What's Changed
- Add Streamable HTTP MCP transport by @mpangrazzi in #128
- feat: Read hayhooks version locally from package metadata by @vblagoje in #127
- Better `Agent` and open-webui events support by @mpangrazzi in #132
- Updated documentation for open-webui events integration by @mpangrazzi in #137
- chore: fix examples in README by @anakin87 in #138
- Add specific README for events example by @mpangrazzi in #141
Full Changelog: v0.9.1...v0.10.0
v0.9.1
What's Changed
- Rename HAYHOOKS_ADDITIONAL_PYTHONPATH to HAYHOOKS_ADDITIONAL_PYTHON_PATH by @mpangrazzi in #126
- Fix request model creation and MCP execution for pipeline wrappers with only `run_api_async` implemented by @mpangrazzi in #125
Full Changelog: v0.9.0...v0.9.1
v0.9.0
Async support for pipeline wrappers
You can now implement `run_api_async` and `run_chat_completion_async` methods in your pipeline wrapper.
You can also make use of `async_streaming_generator` if you need to stream pipeline output from async methods.
For example:
```python
from pathlib import Path
from typing import AsyncGenerator, List

from haystack import AsyncPipeline
from haystack.dataclasses import ChatMessage
from hayhooks import (
    get_last_user_message,
    BasePipelineWrapper,
    async_streaming_generator,
)

SYSTEM_MESSAGE = "You are a helpful assistant that can answer questions about the world."


class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        pipeline_yaml = (Path(__file__).parent / "question_answer.yml").read_text()
        self.pipeline = AsyncPipeline.loads(pipeline_yaml)

    async def run_api_async(self, question: str) -> str:
        result = await self.pipeline.run_async(
            {
                "prompt_builder": {
                    "template": [
                        ChatMessage.from_system(SYSTEM_MESSAGE),
                        ChatMessage.from_user(question),
                    ]
                }
            }
        )
        return result["llm"]["replies"][0].text

    async def run_chat_completion_async(self, model: str, messages: List[dict], body: dict) -> AsyncGenerator:
        question = get_last_user_message(messages)
        return async_streaming_generator(
            pipeline=self.pipeline,
            pipeline_run_args={
                "prompt_builder": {
                    "template": [
                        ChatMessage.from_system(SYSTEM_MESSAGE),
                        ChatMessage.from_user(question),
                    ]
                },
            },
        )
```
Full example is available here.
What's Changed
- Better example video resolution on Hayhooks Core MCP Tools docs by @mpangrazzi in #121
- Add support for async pipeline wrapper methods by @mpangrazzi in #122
Full Changelog: v0.8.0...v0.9.0

