Setting a custom http_client fails with unexpected keyword argument when using ChatAnthropic #30146

Open · 5 tasks done
zhanghao-AI opened this issue Mar 7, 2025 · 5 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature investigate Flagged for investigation.

Comments

@zhanghao-AI

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import httpx
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-3-7-sonnet-20250219",
    max_tokens=4096,
    thinking={
        "type": "enabled",
        "budget_tokens": 1024,
    },
    http_client=httpx.Client(),
)

Error Message and Stack Trace (if applicable)

File "test_claude37_langchain.py", line 46, in <module>
llm_result = llm.invoke(prompt_result)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "\Lib\site-packages\langchain_core\language_models\chat_models.py", line 285, in invoke
self.generate_prompt(
File "\Lib\site-packages\langchain_core\language_models\chat_models.py", line 861, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "\Lib\site-packages\langchain_core\language_models\chat_models.py", line 691, in generate
self._generate_with_cache(
File "\Lib\site-packages\langchain_core\language_models\chat_models.py", line 926, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "\Lib\site-packages\langchain_anthropic\chat_models.py", line 948, in _generate
data = self._client.messages.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "\Lib\site-packages\anthropic\_utils\_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
TypeError: Messages.create() got an unexpected keyword argument 'http_client'

Description

When I use ChatAnthropic, I want to customize the http_client, which ChatOpenAI supports. But with ChatAnthropic it fails with the error above. Does ChatAnthropic not handle this parameter? I can set it when using the anthropic SDK directly. Do you plan to add this parameter in the next version?
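A minimal sketch of the failure mechanism, using stand-in functions rather than the real SDK: the traceback shows `self._client.messages.create(**payload)`, so an unrecognized constructor kwarg like `http_client` appears to leak into the per-request payload, while `Messages.create()` only accepts request-level parameters. The function below is illustrative, not the actual anthropic signature:

```python
def messages_create(*, model, max_tokens, messages, **request_opts):
    """Stand-in for Messages.create(): accepts request-level options only."""
    allowed = {"temperature", "thinking", "stop_sequences"}
    for key in request_opts:
        if key not in allowed:
            raise TypeError(
                f"Messages.create() got an unexpected keyword argument {key!r}"
            )
    return {"model": model, "content": "ok"}

# Client-level settings such as http_client belong on the SDK client object;
# leaking one into the request payload reproduces the reported error.
payload = {
    "model": "claude-3-7-sonnet-20250219",
    "max_tokens": 4096,
    "messages": [],
    "http_client": object(),  # leaked client-level kwarg
}
try:
    messages_create(**payload)
except TypeError as exc:
    print(exc)
```

The fix would be for `ChatAnthropic` to consume `http_client` when it constructs the `anthropic.Anthropic` client (as `ChatOpenAI` does for `openai.OpenAI`), rather than letting it reach the request.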

System Info

System Information

OS: Windows
OS Version: 10.0.22631
Python Version: 3.11.11 | packaged by Anaconda, Inc. | (main, Dec 11 2024, 16:34:19) [MSC v.1929 64 bit (AMD64)]

Package Information

langchain_core: 0.3.41
langchain: 0.3.20
langchain_community: 0.3.19
langsmith: 0.1.147
langchain_anthropic: 0.3.9
langchain_elasticsearch: 0.3.2
langchain_experimental: 0.3.4
langchain_google_genai: 2.0.10
langchain_openai: 0.3.7
langchain_pinecone: 0.2.3
langchain_tests: 0.3.12
langchain_text_splitters: 0.3.6
langchainhub: 0.1.21
langgraph_sdk: 0.1.53

Optional packages not installed

langserve

Other Dependencies

aiohttp<3.11,>=3.10: Installed. No version info available.
aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
anthropic<1,>=0.47.0: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
dataclasses-json<0.7,>=0.5.7: Installed. No version info available.
elasticsearch[vectorstore-mmr]: Installed. No version info available.
filetype: 1.2.0
google-generativeai: 0.8.4
httpx: 0.28.1
httpx-sse<1.0.0,>=0.4.0: Installed. No version info available.
httpx<1,>=0.25.0: Installed. No version info available.
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.34: Installed. No version info available.
langchain-core<1.0.0,>=0.3.35: Installed. No version info available.
langchain-core<1.0.0,>=0.3.39: Installed. No version info available.
langchain-core<1.0.0,>=0.3.41: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-tests<1.0.0,>=0.3.7: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.6: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langchain<1.0.0,>=0.3.20: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy<2.0.0,>=1.24.0;: Installed. No version info available.
numpy<2.0.0,>=1.26.4: Installed. No version info available.
numpy<3,>=1.26.2: Installed. No version info available.
numpy<3,>=1.26.2;: Installed. No version info available.
openai<2.0.0,>=1.58.1: Installed. No version info available.
orjson: 3.10.15
packaging: 24.2
packaging<25,>=23.2: Installed. No version info available.
pinecone<6.0.0,>=5.4.0: Installed. No version info available.
pydantic: 2.10.6
pydantic-settings<3.0.0,>=2.4.0: Installed. No version info available.
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
pytest-asyncio<1,>=0.20: Installed. No version info available.
pytest-socket<1,>=0.6.0: Installed. No version info available.
pytest<9,>=7: Installed. No version info available.
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
SQLAlchemy<3,>=1.4: Installed. No version info available.
syrupy<5,>=4: Installed. No version info available.
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
tiktoken<1,>=0.7: Installed. No version info available.
types-requests: 2.32.0.20250301
typing-extensions>=4.7: Installed. No version info available.

@langcarl langcarl bot added the investigate Flagged for investigation. label Mar 7, 2025
@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Mar 7, 2025
@chakravarthik27

Hi @zhanghao-AI ,

are you working on this feature?

@zhanghao-AI
Author

> Hi @zhanghao-AI ,
>
> are you working on this feature?

I encountered this problem when I used LangChain to call the claude-3-7-sonnet-20250219 model, and it has not been solved yet. Do you have any solution?

@chakravarthik27

No, but I will fix this issue as soon as possible.

@zhouruiliangxian


TypeError Traceback (most recent call last)
Cell In[1], line 2
1 from langchain_openai import ChatOpenAI
----> 2 llm = ChatOpenAI(
3 temperature=0.95,
4 model="GLM-4-Air",
5 openai_api_key="",
6 openai_api_base="https://open.bigmodel.cn/api/paas/v4/",
7 )

File d:\soft\anaconda\envs\langchain\Lib\site-packages\langchain_core\load\serializable.py:125, in Serializable.__init__(self, *args, **kwargs)
123 def __init__(self, *args: Any, **kwargs: Any) -> None:
124 """"""
--> 125 super().__init__(*args, **kwargs)

[... skipping hidden 1 frame]

File d:\soft\anaconda\envs\langchain\Lib\site-packages\langchain_openai\chat_models\base.py:527, in BaseChatOpenAI.validate_environment(self)
525 self.http_client = httpx.Client(proxy=self.openai_proxy)
526 sync_specific = {"http_client": self.http_client}
--> 527 self.root_client = openai.OpenAI(**client_params, **sync_specific) # type: ignore[arg-type]
528 self.root_client = openai.OpenAI(**client_params) # type: ignore[arg-type]
529 self.client = self.root_client.chat.completions

File d:\soft\anaconda\envs\langchain\Lib\site-packages\openai\_client.py:123, in OpenAI.__init__(self, api_key, organization, project, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
120 if base_url is None:
121 base_url = f"https://api.openai.com/v1"
--> 123 super().__init__(
124 version=version,
125 base_url=base_url,
126 max_retries=max_retries,
127 timeout=timeout,
128 http_client=http_client,
129 custom_headers=default_headers,
130 custom_query=default_query,
131 _strict_response_validation=_strict_response_validation,
132 )
134 self._default_stream_cls = Stream
136 self.completions = resources.Completions(self)

File d:\soft\anaconda\envs\langchain\Lib\site-packages\openai\_base_client.py:856, in SyncAPIClient.__init__(self, version, base_url, max_retries, timeout, transport, proxies, limits, http_client, custom_headers, custom_query, _strict_response_validation)
839 raise TypeError(
840 f"Invalid http_client argument; Expected an instance of httpx.Client but got {type(http_client)}"
841 )
843 super().__init__(
844 version=version,
845 limits=limits,
(...) 854 _strict_response_validation=_strict_response_validation,
855 )
--> 856 self._client = http_client or SyncHttpxClientWrapper(
857 base_url=base_url,
858 # cast to a valid type because mypy doesn't understand our type narrowing
859 timeout=cast(Timeout, timeout),
860 proxies=proxies,
861 transport=transport,
862 limits=limits,
863 follow_redirects=True,
864 )

File d:\soft\anaconda\envs\langchain\Lib\site-packages\openai\_base_client.py:754, in _DefaultHttpxClient.__init__(self, **kwargs)
752 kwargs.setdefault("limits", DEFAULT_CONNECTION_LIMITS)
753 kwargs.setdefault("follow_redirects", True)
--> 754 super().__init__(**kwargs)

TypeError: Client.__init__() got an unexpected keyword argument 'proxies'

I also ran into this problem. What is the cause?

@chakravarthik27

> proxies

This is a dependency issue between the OpenAI SDK and httpx: httpx 0.28 removed the deprecated `proxies` argument from `httpx.Client`, while older OpenAI SDK releases still pass it when building their default client. To fix it, either upgrade the OpenAI SDK or pin httpx below 0.28 (e.g. `pip install "httpx<0.28"`).
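A minimal stand-in (not the real httpx or openai classes) of this version mismatch: the new-style client accepts `proxy` but not `proxies`, while the old SDK's default-client path still forwards `proxies` unconditionally:

```python
class NewStyleClient:
    """Stand-in for httpx.Client >= 0.28: accepts `proxy`, not `proxies`."""

    def __init__(self, *, proxy=None, timeout=None, follow_redirects=True):
        self.proxy = proxy
        self.timeout = timeout
        self.follow_redirects = follow_redirects


def build_default_client(**kwargs):
    """Stand-in for the older SDK's default-client construction, which
    always forwarded `proxies` to the httpx client constructor."""
    return NewStyleClient(proxies=None, **kwargs)


try:
    build_default_client(timeout=30)
except TypeError as exc:
    print(exc)  # ... got an unexpected keyword argument 'proxies'
```

Either side of the pin removes the mismatch: a newer OpenAI SDK stops passing `proxies`, and an older httpx still accepts it.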

References:
https://community.openai.com/t/error-with-openai-1-56-0-client-init-got-an-unexpected-keyword-argument-proxies/1040332/12

3 participants