
objectsender able to send larger payloads #2520

Open
vwnd opened this issue Jul 18, 2024 · 2 comments

@vwnd (Contributor) commented Jul 18, 2024

It seems that the way ServerTransport batches objects and sends them to POST /objects/:streamId cannot handle a large number of objects: it issues so many POST requests to that endpoint that the server starts returning 500 errors.

@iainsproat mentioned it could be a firewall or some other mechanism.

Problem

I have described the problem here, and I have produced a codesandbox that reproduces the issue here.

Solution

The solution would be for the code in the sandbox mentioned above to work, meaning:

  • It works (the objects are sent).
  • It is fast. I am under the impression that, because of the 500 errors, the retry mechanism just makes the process hang before attempting again.
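For context on the second point: under sustained 500s, a retry loop without backoff just hammers the endpoint and appears to hang. A minimal exponential-backoff-with-jitter sketch (the `withBackoff` helper and its parameters are illustrative, not part of Speckle's actual API) would look like:

```typescript
// Retry an async operation with exponential backoff plus jitter, so a
// burst of 500 responses backs off instead of flooding the endpoint.
// All names here (withBackoff, maxRetries, baseDelayMs) are hypothetical.
async function withBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 5,
  baseDelayMs = 200
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // give up after maxRetries
      // Delay doubles each attempt; jitter avoids synchronized retries.
      const delayMs = baseDelayMs * 2 ** attempt + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

Usage would be along the lines of `await withBackoff(() => sendBatch(batch))`, where `sendBatch` stands in for whatever performs the POST.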

Technically, I would suggest enhancing the ServerTransport to compress the batch, sending application/zip instead of application/json, so that it can fit more objects per request, meaning fewer requests overall.
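As a rough illustration of why compression helps here, a sketch using Node's zlib (the batch contents are made up, and the server accepting gzipped bodies is an assumption, not current Speckle behavior):

```typescript
import { gzipSync, gunzipSync } from "zlib";

// Hypothetical batch of Speckle-like objects; repetitive JSON like this
// compresses very well, so one request can carry far more objects.
const batch = Array.from({ length: 1000 }, (_, i) => ({
  id: `obj-${i}`,
  speckle_type: "Base",
  value: i,
}));

const json = Buffer.from(JSON.stringify(batch));
const compressed = gzipSync(json);

console.log(`raw: ${json.length} bytes, gzipped: ${compressed.length} bytes`);

// The POST would then advertise the encoding (server support assumed):
// await fetch(`${serverUrl}/objects/${streamId}`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json", "Content-Encoding": "gzip" },
//   body: compressed,
// });
```

The trade-off is CPU time for compression on the client versus far fewer round trips and less bandwidth per object.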

Additional context

Please have a look at the following discussion: https://speckle.community/t/is-there-a-rate-limit-on-post-objects-projectid/12310

@vwnd vwnd added enhancement New feature or request question labels Jul 18, 2024
@vwnd vwnd changed the title objectsender able to take send payloads objectsender able to send larger payloads Jul 18, 2024

linear bot commented Jul 24, 2024

@jikatz commented Feb 19, 2025

Hi, I am a long-time reader and first-time poster here, so I apologize in advance if this is a very novice question. I have been doing a lot of work lately with the objectsender and objectloader modules. While I have not hit this exact issue, I do wonder about the complexity of moving large payloads of objects, or many related objects, as JSON via REST.

S3 has great support for presigned URLs (it seems MinIO does as well) and multipart uploads, and I assume DigitalOcean and Azure have similar capabilities. Is there currently a method within Speckle for requesting a presigned URL, uploading a large batch of JSON directly to that URL (perhaps as JSONL or gzipped JSON), and then triggering a worker to ingest that JSON in batches directly into the database (much the same way the IFC workers do now)? If not, and it is desired, I would be more than happy to work on that. Again, sorry for the noob question. Thanks! @iainsproat @vwnd

Development

No branches or pull requests

4 participants