
Small messages are getting concatenated #2067

Closed
PositivPy opened this issue Jan 1, 2024 · 2 comments

Comments

@PositivPy

The following sends all the small message chunks in one go. I want them sent separately.

consumers.py

import openai
import asyncio
from channels.generic.websocket import AsyncWebsocketConsumer

class AIConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        print('Connected')
        await self.accept()

    async def disconnect(self, close_code):
        pass

    async def receive(self, text_data=None, bytes_data=None):
        # Set your OpenAI API key and custom base URL
        openai.api_key = "YOUR_OPENAI_API_KEY"
        custom_base_url = "http://70672e1fb64c2dc61c3f4821d103339f.serveo.net/v1"
        openai.api_base = custom_base_url

        # Example prompt to send to your self-hosted model
        prompt = text_data  # Received prompt from WebSocket

        # Make a streaming request using the custom URL
        response_stream = openai.Completion.create(
            model="text-davinci-003",  # Choose the engine you've hosted
            prompt=prompt,
            max_tokens=50,  # Adjust the token length of each response
            stream=True  # Enable streaming
        )

        # Stream and handle the responses
        for response in response_stream:
            answer = response.choices[0].text.strip()
            await self.send(text_data=answer)
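[Editor's note] One likely culprit in the snippet above (an assumption, not stated in the thread): `openai.Completion.create(..., stream=True)` returns a blocking, synchronous iterator, so the `for` loop never yields to the event loop, and the queued frames can end up flushed together. Pulling each blocking chunk in a worker thread keeps the loop free to send every chunk as its own frame. A minimal, self-contained sketch with a fake stream standing in for the API response:

```python
import asyncio

# Hypothetical stand-in for the blocking OpenAI response iterator.
def fake_stream():
    yield from ["Hello", " small", " chunks"]

async def relay(stream, send):
    loop = asyncio.get_running_loop()
    it = iter(stream)
    while True:
        # Fetch the next blocking chunk in a worker thread so the
        # event loop stays free to flush each frame immediately.
        chunk = await loop.run_in_executor(None, next, it, None)
        if chunk is None:
            break
        await send(chunk)

async def main():
    sent = []
    async def fake_send(text):  # stands in for self.send(text_data=...)
        sent.append(text)
    await relay(fake_stream(), fake_send)
    return sent

chunks = asyncio.run(main())
print(chunks)
```

In a consumer, `send` would be `self.send` and `stream` the OpenAI iterator; the `run_in_executor` call is the key change from the original loop.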
@carltongibson
Member

carltongibson commented Jan 1, 2024

Assuming it's working with curl using the --no-buffer flag, your web server is likely buffering the response. You'll need to look into that. (It's not something I can help you with here particularly)
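[Editor's note] If, for example, Nginx sits in front of Daphne, response buffering can be switched off for the streaming endpoint. A hedged config sketch; the location path and upstream address are assumptions, not from the thread:

```nginx
location /ws/ {
    proxy_pass http://127.0.0.1:8000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_buffering off;  # do not accumulate upstream chunks before forwarding
    proxy_cache off;
}
```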

@baravkareknath

Hi PositivPy,
Regarding your issue, have you considered using `langchain.text_splitter` to split the text with a `chunk_size`?

You can refer to the example below:

from langchain.text_splitter import CharacterTextSplitter

def split_text(text_data):
    splitter = CharacterTextSplitter(
        chunk_size=300,
        chunk_overlap=50
    )
    # split_text() takes a raw string; split_documents() expects Document objects
    chunks = splitter.split_text(text_data)
    return chunks

I hope it is useful for you. I tried it, and it works fine.

@carltongibson closed this as not planned on Oct 9, 2024