[Crowdstrike Alert] adjust batch size to API limit #13862
Merged
The CrowdStrike alert integration uses two CrowdStrike API endpoints:
/alerts/queries/alerts/v2 → returns the composite IDs of all open alerts → limit of 10,000 (see https://www.falconpy.io/Service-Collections/Alerts.html#getqueriesalertsv2), which is also currently documented as the default batch size in the integration's description.
The composite_ids returned by the first call are then sent in a POST request to:
/alerts/entities/alerts/v2 → returns the alert details → limit of 1,000 (see https://www.falconpy.io/Service-Collections/Alerts.html#postentitiesalertsv2)
Because the first endpoint can return up to 10,000 IDs, the integration hits an HTTP 413 "request too large" error whenever more than 1,000 composite IDs are passed to the second endpoint.
Since pagination via want_more is already implemented for the GET request, the batch size should be capped at a maximum of 1,000 instead of 10,000, so the integration doesn't fail in environments with a lot of alerts.
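To illustrate the constraint, here is a minimal sketch (not the integration's actual code) of splitting the composite IDs from the first endpoint into batches that respect the 1,000-ID limit of the POST endpoint; `fetch_alert_details` is a hypothetical stand-in for the POST to /alerts/entities/alerts/v2:

```python
POST_BATCH_LIMIT = 1_000  # documented limit for POST /alerts/entities/alerts/v2


def chunk_ids(composite_ids, batch_size=POST_BATCH_LIMIT):
    """Yield successive batches of at most `batch_size` composite IDs."""
    for start in range(0, len(composite_ids), batch_size):
        yield composite_ids[start:start + batch_size]


def collect_alert_details(composite_ids, fetch_alert_details):
    """Fetch details for all IDs without exceeding the per-request limit."""
    details = []
    for batch in chunk_ids(composite_ids):
        # Each POST stays at or under 1,000 IDs, avoiding the HTTP 413 error.
        details.extend(fetch_alert_details(batch))
    return details
```

With the batch size capped at 1,000, even a first call that returns the full 10,000 IDs is processed in ten requests instead of one oversized POST.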