
Hangs with remote Ollama #952

Open
axibo-reiner opened this issue Mar 16, 2025 · 2 comments
Labels
bug Something isn't working

Comments

@axibo-reiner

Describe the bug
When running against a remote Ollama server instead of a local one, smart_scraper_graph.run() hangs and never returns (it was also left running overnight, just in case).

  • usage on the remote machine goes up, so the requests are reaching it
  • same model on both the remote and local machine
  • tested with all v1.4.x versions
  • tested different temperatures
  • checked with and without the base_url parameter when running locally (both work fine)
  • tested with and without headless
  • Ollama v0.6.1 on both machines

To Reproduce

from scrapegraphai.graphs import SmartScraperGraph
import json

graph_config = {
    "llm": {
        "model": "ollama/llama3.2",
        "max_tokens": 8192,
        "base_url": "http://192.168.1.1:11434",
        "format": "json",
    },
    "verbose": True,
    "headless": False,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="I want a list of all the links of the issues on the page",
    source="https://github.com/ScrapeGraphAI/Scrapegraph-ai/issues",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(json.dumps(result, indent=4))
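As a sanity check before running the graph, the remote endpoint can be probed directly. The helper below is a hypothetical addition (not part of scrapegraph-ai): it builds the URL for Ollama's /api/tags model-listing endpoint and reports whether the server answers at all.

```python
import urllib.request
import urllib.error


def ollama_tags_url(base_url: str) -> str:
    """Join the configured base_url with Ollama's /api/tags endpoint."""
    return base_url.rstrip("/") + "/api/tags"


def ollama_reachable(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the Ollama server answers the tags request in time."""
    try:
        with urllib.request.urlopen(ollama_tags_url(base_url), timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Calling ollama_reachable("http://192.168.1.1:11434") before constructing the graph separates "server unreachable" from "model call stalls".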

Expected behavior
run() should return the scraped result (a list of issue links), just as it does when pointed at a local Ollama server.


dosubot commented Mar 16, 2025

Hey @axibo-reiner! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this issue together!

I found a similar discussion where a user experienced a 502 Bad Gateway error when using the Ollama API with Scrapegraph-ai, despite the Ollama service running locally and being tested successfully with curl. This issue remains unsolved [1].

For your specific issue with the smart_scraper_graph.run() method hanging when using a remote Ollama server, consider the following:

  1. Timeout Settings: Ensure that the timeout settings are configured appropriately. Scrapegraph-ai has a "graph timeout" and a "timeout" in loader_kwargs that could affect long-running requests [2][3].

  2. Network Configuration: Double-check any network configurations or firewall settings that might be affecting the communication between your local machine and the remote Ollama server. Although specific network configurations weren't found in the documentation, ensuring open communication ports and no firewall restrictions could help.

  3. Logging and Debugging: Utilize the logging functions available in Scrapegraph-ai to diagnose the issue. You can set the verbosity to debug level using set_verbosity_debug() to get more detailed logs, which might provide insights into where the process is hanging [4][5].
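For the network check in step 2, a plain TCP connect to the Ollama port from the client machine distinguishes a firewall block from an application-level stall. This is a minimal sketch, not part of scrapegraph-ai:

```python
import socket


def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If port_open("192.168.1.1", 11434) returns True while run() still hangs, the problem likely sits above the network layer.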

These steps might help in identifying the cause of the hang and resolving the issue.
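As a sketch of point 1, the reported graph_config could carry an explicit timeout so a stalled GenerateAnswer node fails with an error instead of hanging indefinitely. The top-level "timeout" key is an assumption here; whether and where your scrapegraph-ai version honors it should be checked against its docs.

```python
# Variant of the reported config with an assumed top-level graph timeout.
graph_config = {
    "llm": {
        "model": "ollama/llama3.2",
        "max_tokens": 8192,
        "base_url": "http://192.168.1.1:11434",
        "format": "json",
    },
    "verbose": True,
    "headless": False,
    "timeout": 120,  # assumed: seconds before the graph aborts a node
}
```

With debug-level logging also enabled (via set_verbosity_debug(), as suggested above), a timeout error would at least show which node is stalling.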


dosubot added the bug label Mar 16, 2025
@axibo-reiner (Author)

Output with verbose debug enabled (the run hangs after entering the GenerateAnswer node):

--- Executing Fetch Node ---
--- (Fetching HTML from: https://github.com/ScrapeGraphAI/Scrapegraph-ai/issues) ---
--- Executing ParseNode Node ---
--- Executing GenerateAnswer Node ---
