
Algolia indexing not continuing when a page size is too large #250

Open
gerrit-de-heij opened this issue Mar 3, 2025 · 2 comments
gerrit-de-heij commented Mar 3, 2025

I get an error when a page is too large for Algolia to index (20 KB, for example). The index build process then stops and doesn't index the rest of the pages. I need it to skip the pages that are too large and index all the others. Is there a setting I can change so the build process skips the oversized pages and indexes the rest?


This item has been added to our backlog AB#49904

Contributor

acoumb commented Mar 10, 2025

Hi @gerrit-de-heij ,

Have you checked the logs for errors, specifically whether a property using a complex type is throwing an exception? If that is the case, you can add your own custom converter to address the matter.

Thanks,
Adrian
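The custom-converter idea above can be sketched generically. The actual Umbraco Algolia integration is .NET, so the following is only an illustrative Python sketch, not the package's API: a converter-style helper that trims the largest text fields of a record until its serialized size fits under an assumed per-record limit (`MAX_RECORD_BYTES` is a placeholder; check your Algolia plan for the real value).

```python
import json

# Assumed per-record size limit; Algolia's actual limit depends on your plan.
MAX_RECORD_BYTES = 10_000

def truncate_large_fields(record: dict, max_bytes: int = MAX_RECORD_BYTES) -> dict:
    """Return a copy of the record with its longest text fields trimmed
    until the JSON-serialized size fits under max_bytes."""
    record = dict(record)  # shallow copy; the caller's record is untouched
    while len(json.dumps(record).encode("utf-8")) > max_bytes:
        # Find the longest string value and halve it.
        key = max(
            (k for k, v in record.items() if isinstance(v, str)),
            key=lambda k: len(record[k]),
            default=None,
        )
        if key is None or len(record[key]) == 0:
            break  # nothing left to trim; record may still be oversized
        record[key] = record[key][: len(record[key]) // 2]
    return record
```

A real converter for a BlockGrid property would do something similar per property: flatten or shorten the block content before it is added to the record, so one rich-text-heavy page cannot blow past the size limit.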

@gerrit-de-heij
Author

Hi @acoumb ,

I currently don't have time to check whether this solves the problem. But I think writing a custom converter for a BlockGrid still won't fix the issue, because a BlockGrid can still contain a lot of text. When it does, that page isn't indexed by Algolia, and neither are any of the pages after it, because of the error. Isn't there a way to continue indexing the rest of the pages when one page is too large?

Thanks in advance!
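The behavior asked for here, keep indexing and just drop oversized pages, can be sketched in a language-agnostic way (Python again, since the .NET package internals aren't shown in this thread): measure each record's serialized size before pushing a batch, set the too-large ones aside with a warning, and send only the rest. `MAX_RECORD_BYTES` is an assumed limit, not a documented constant.

```python
import json
import logging

logger = logging.getLogger("algolia-indexing")

# Assumed per-record size limit; check your Algolia plan for the real value.
MAX_RECORD_BYTES = 10_000

def partition_records(records):
    """Split records into (indexable, skipped) by JSON-serialized size,
    logging each skipped record instead of aborting the whole batch."""
    indexable, skipped = [], []
    for record in records:
        size = len(json.dumps(record).encode("utf-8"))
        if size > MAX_RECORD_BYTES:
            logger.warning(
                "Skipping record %s: %d bytes exceeds limit",
                record.get("objectID"), size,
            )
            skipped.append(record)
        else:
            indexable.append(record)
    return indexable, skipped
```

The `indexable` list would then be sent to Algolia (e.g. via the official client's batch save call), so the build finishes even when some pages were skipped; the warnings tell you which pages need trimming.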
