Commits (50)
- cf01a75 feat(cdk): Add cursor age validation to StateDelegatingStream (devin-ai-integration[bot], Feb 2, 2026)
- 67bc5c8 chore: re-trigger CI (devin-ai-integration[bot], Feb 2, 2026)
- 45772f4 Merge branch 'main' into devin/1770066385-state-delegating-stream-cur… (agarctfi, Feb 3, 2026)
- 1edeedd Auto-fix lint and format issues (Feb 3, 2026)
- 61d8d5d Potential fix for pull request finding 'Unused import' (agarctfi, Feb 3, 2026)
- 21da112 Potential fix for pull request finding 'Unused import' (agarctfi, Feb 3, 2026)
- 0e33418 fix: Address Copilot review comments (devin-ai-integration[bot], Feb 3, 2026)
- 324344f fix: Correct ruff format for assert statement (devin-ai-integration[bot], Feb 3, 2026)
- da8a5a5 fix: Convert cursor_value to str for type safety (devin-ai-integration[bot], Feb 3, 2026)
- 37e046e fix: Format long line for ruff compliance (devin-ai-integration[bot], Feb 3, 2026)
- dceb70d Potential fix for pull request finding 'Unused import' (agarctfi, Feb 3, 2026)
- c14f963 refactor: Move incremental_sync check to _get_state_delegating_stream… (devin-ai-integration[bot], Feb 3, 2026)
- 86d5ea6 fix: Return True (full refresh) when cursor is invalid/unparseable (devin-ai-integration[bot], Feb 3, 2026)
- 567ca7a fix: Parse cursor from both full_refresh_stream and incremental_stream (devin-ai-integration[bot], Feb 3, 2026)
- be72c5c feat: Add support for per-partition state and IncrementingCountCursor… (devin-ai-integration[bot], Feb 4, 2026)
- 2b54cc5 feat: Add get_cursor_datetime_from_state method to cursor classes (devin-ai-integration[bot], Feb 5, 2026)
- f199583 feat: Add get_cursor_datetime_from_state to concurrent cursor classes (devin-ai-integration[bot], Feb 9, 2026)
- fbda39f fix: Fix MyPy type errors in ConcurrentCursor.get_cursor_datetime_fro… (devin-ai-integration[bot], Feb 9, 2026)
- a2d4b56 refactor: Wire factory to use cursor class get_cursor_datetime_from_s… (devin-ai-integration[bot], Feb 18, 2026)
- 1defe9e fix: Fix ruff format and mypy errors in model_to_component_factory (devin-ai-integration[bot], Feb 18, 2026)
- a017dff fix: Skip retention check for concurrent state format (devin-ai-integration[bot], Feb 18, 2026)
- d3e76d4 fix: Skip retention check for IncrementingCountCursor instead of rais… (devin-ai-integration[bot], Feb 18, 2026)
- d31c26b fix: Return False (skip) when no datetime-based cursors found for ret… (devin-ai-integration[bot], Feb 18, 2026)
- 653022b fix: Remove unused pytest import (devin-ai-integration[bot], Feb 18, 2026)
- 43dc47e fix: Raise ValueError for unparseable cursor datetime when api_retent… (devin-ai-integration[bot], Feb 18, 2026)
- 1531b39 refactor: Use stream cursor for retention period check, remove legacy… (devin-ai-integration[bot], Feb 18, 2026)
- b4c24c6 fix: Try both full_refresh and incremental cursors for state parsing (devin-ai-integration[bot], Feb 18, 2026)
- 67f9e60 fix: Remove per-partition state fallback, let cursor classes handle s… (devin-ai-integration[bot], Feb 18, 2026)
- 8608b5f fix: Re-add _get_state_delegating_stream_model and fix ruff format (devin-ai-integration[bot], Feb 18, 2026)
- 8faa0ae Revert "fix: Re-add _get_state_delegating_stream_model and fix ruff f… (devin-ai-integration[bot], Feb 18, 2026)
- ea7a757 fix: ruff format long lines in create_state_delegating_stream (devin-ai-integration[bot], Feb 18, 2026)
- 714c667 fix: Restore _get_state_delegating_stream_model and fix MyPy errors (devin-ai-integration[bot], Feb 18, 2026)
- 16a895e fix: Handle FinalStateCursor gracefully and detect final-state for re… (devin-ai-integration[bot], Feb 19, 2026)
- bddc671 refactor: Move FinalStateCursor handling to cursor classes, replace h… (devin-ai-integration[bot], Feb 19, 2026)
- 8828eea refactor: Clean NO_CURSOR_STATE_KEY from ConcurrentCursor, add tests … (devin-ai-integration[bot], Feb 19, 2026)
- 6b65b7a style: Fix ruff format issues in factory and test files (devin-ai-integration[bot], Feb 19, 2026)
- 17f857a fix: Raise error for incompatible cursor types with api_retention_period (devin-ai-integration[bot], Feb 19, 2026)
- 1163395 refactor: Simplify cursor age validation per brianjlai's review (devin-ai-integration[bot], Feb 19, 2026)
- acd7156 fix: Use Cursor type instead of Any for cursor parameter (devin-ai-integration[bot], Feb 19, 2026)
- 8afe8e1 fix: Clear state when falling back to full refresh due to stale cursor (devin-ai-integration[bot], Feb 20, 2026)
- 2a4f385 style: Fix ruff format issues in state clearing code (devin-ai-integration[bot], Feb 20, 2026)
- e4f71ff fix: Implement tolik0's FinalStateCursor feedback with NO_CURSOR_STAT… (devin-ai-integration[bot], Feb 23, 2026)
- 9340d3c fix: Update FinalStateCursor test to match new behavior per tolik0's … (devin-ai-integration[bot], Feb 23, 2026)
- e021f58 style: Fix ruff format issues in test file (devin-ai-integration[bot], Feb 23, 2026)
- 1dcc8ab refactor: Remove early return for NO_CURSOR_STATE_KEY per tolik0's re… (devin-ai-integration[bot], Feb 23, 2026)
- 6d95923 fix: Remove unused NO_CURSOR_STATE_KEY import (devin-ai-integration[bot], Feb 23, 2026)
- a3a2073 fix: Update FinalStateCursor test to match actual ConcurrentCursor be… (devin-ai-integration[bot], Feb 23, 2026)
- 020d2f5 fix: Skip state emission for streams not in configured catalog (devin-ai-integration[bot], Feb 25, 2026)
- 21bb2a9 refactor: Move catalog check to skip entire retention validation for … (devin-ai-integration[bot], Feb 25, 2026)
- 2a2459d style: Fix ruff format issue in create_state_delegating_stream (devin-ai-integration[bot], Feb 25, 2026)
16 changes: 16 additions & 0 deletions airbyte_cdk/sources/declarative/declarative_component_schema.yaml

@@ -3752,6 +3752,22 @@ definitions:
         title: Incremental Stream
         description: Component used to coordinate how records are extracted across stream slices and request pages when the state provided.
         "$ref": "#/definitions/DeclarativeStream"
+      api_retention_period:
+        title: API Retention Period
+        description: |
+          The data retention period of the incremental API (ISO8601 duration). If the cursor value is older than this retention period, the connector will automatically fall back to a full refresh to avoid data loss.
+          This is useful for APIs like Stripe Events API which only retain data for 30 days.
+          * **PT1H**: 1 hour
+          * **P1D**: 1 day
+          * **P1W**: 1 week
+          * **P1M**: 1 month
+          * **P1Y**: 1 year
+          * **P30D**: 30 days
+        type: string
+        examples:
+          - "P30D"
+          - "P90D"
+          - "P1Y"
       $parameters:
         type: object
         additionalProperties: true
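The new field sits alongside full_refresh_stream and incremental_stream in a StateDelegatingStream definition. As a usage sketch (the stream name, definition keys, and $ref targets below are illustrative, not part of this PR), a manifest might wire it up like this:

```yaml
# Hypothetical manifest fragment; stream and definition names are illustrative.
definitions:
  events_stream:
    type: StateDelegatingStream
    name: events
    full_refresh_stream:
      "$ref": "#/definitions/events_full_refresh"
    incremental_stream:
      "$ref": "#/definitions/events_incremental"
    # If the saved cursor is older than 30 days, fall back to full refresh.
    api_retention_period: "P30D"
```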
@@ -1,5 +1,3 @@
 # Copyright (c) 2025 Airbyte, Inc., all rights reserved.

 # generated by datamodel-codegen:
 #   filename:  declarative_component_schema.yaml

@@ -2885,6 +2883,12 @@ class StateDelegatingStream(BaseModel):
         description="Component used to coordinate how records are extracted across stream slices and request pages when the state provided.",
         title="Incremental Stream",
     )
+    api_retention_period: Optional[str] = Field(
+        None,
+        description="The data retention period of the incremental API (ISO8601 duration). If the cursor value is older than this retention period, the connector will automatically fall back to a full refresh to avoid data loss.\nThis is useful for APIs like Stripe Events API which only retain data for 30 days.\n * **PT1H**: 1 hour\n * **P1D**: 1 day\n * **P1W**: 1 week\n * **P1M**: 1 month\n * **P1Y**: 1 year\n * **P30D**: 30 days\n",
+        examples=["P30D", "P90D", "P1Y"],
+        title="API Retention Period",
+    )
     parameters: Optional[Dict[str, Any]] = Field(None, alias="$parameters")
104 changes: 100 additions & 4 deletions airbyte_cdk/sources/declarative/parsers/model_to_component_factory.py

@@ -78,6 +78,7 @@
     DynamicStreamCheckConfig,
 )
 from airbyte_cdk.sources.declarative.concurrency_level import ConcurrencyLevel
+from airbyte_cdk.sources.declarative.datetime.datetime_parser import DatetimeParser
 from airbyte_cdk.sources.declarative.datetime.min_max_datetime import MinMaxDatetime
 from airbyte_cdk.sources.declarative.decoders import (
     Decoder,

@@ -3568,11 +3569,106 @@ def create_state_delegating_stream(
     def _get_state_delegating_stream_model(
         self, has_parent_state: bool, model: StateDelegatingStreamModel
     ) -> DeclarativeStreamModel:
-        return (
-            model.incremental_stream
-            if self._connector_state_manager.get_stream_state(model.name, None) or has_parent_state
-            else model.full_refresh_stream
-        )
+        stream_state = self._connector_state_manager.get_stream_state(model.name, None)
+
+        if not stream_state and not has_parent_state:
+            return model.full_refresh_stream
+
+        if model.api_retention_period and stream_state:
+            incremental_sync = model.incremental_stream.incremental_sync
+            if incremental_sync and self._is_cursor_older_than_retention_period(
+                stream_state, incremental_sync, model.api_retention_period, model.name
+            ):
+                return model.full_refresh_stream
+
+        return model.incremental_stream
def _is_cursor_older_than_retention_period(
self,
stream_state: Mapping[str, Any],
incremental_sync: Any,
api_retention_period: str,
stream_name: str,
) -> bool:
"""Check if the cursor value in the state is older than the API's retention period.

If the cursor is too old, the incremental API may not have data going back that far,
so we should fall back to a full refresh to avoid data loss.

Returns True if the cursor is older than the retention period (should use full refresh).
Returns False if the cursor is within the retention period (safe to use incremental).
"""
cursor_field = getattr(incremental_sync, "cursor_field", None)
if not cursor_field:
return False

cursor_value = stream_state.get(cursor_field)
if not cursor_value:
return False

if not isinstance(cursor_value, (str, int)):
return False

cursor_value_str = str(cursor_value)

retention_duration = parse_duration(api_retention_period)
retention_cutoff = datetime.datetime.now(datetime.timezone.utc) - retention_duration

cursor_datetime = self._parse_cursor_datetime(
cursor_value_str, incremental_sync, stream_name
)
if cursor_datetime is None:
return False

if cursor_datetime < retention_cutoff:
[Inline review thread on this line]

Contributor: So for the case where we have a state delegating stream where the full refresh implementation has no cursor and therefore has a FinalStateCursor.get_cursor_datetime_from_state() that returns now(): this will presumably always evaluate to False, and we will use the incremental stream for the current run. Am I understanding this right?
And if that is the case, because of how the code is written, my worry is that since we only ever evaluate the FinalStateCursor and short-circuit before we check incremental, we will always use the incremental_stream even if it fell outside the API retention window.
Anatolii Yatsuk (@tolik0), I might not be understanding this flow right because I think it is written in a fairly overcomplicated way, but I just want to check my understanding against this condition.

Anatolii Yatsuk (@tolik0), Feb 19, 2026: My understanding is that FinalStateCursor emits the state only at the end of the sync. So, if we have a state from a full refresh, we switch to incremental. If the incremental state is outdated, we switch back to the FinalStateCursor to retrieve all records.

Contributor Author: Yes, your understanding is correct! The flow is:

  1. State from a completed full refresh (NO_CURSOR_STATE_KEY: True): use the incremental stream (the full refresh already completed successfully).
  2. Incremental state within the retention period: use the incremental stream (safe to continue from where we left off).
  3. Incremental state older than the retention period: fall back to full refresh (FinalStateCursor) to retrieve all records and avoid data loss.

The current implementation handles this via:

  - An early return of False when stream_state.get(NO_CURSOR_STATE_KEY) is true (case 1).
  - Checking incremental_stream.cursor.get_cursor_datetime_from_state() against the retention cutoff (cases 2 and 3).

Note: this comment is on an older version of the code. The latest version (per brianjlai's feedback) simplified the logic to only check the incremental cursor directly, since FinalStateCursor doesn't have a meaningful cursor datetime anyway.

[End review thread]

self._emit_warning_for_stale_cursor(
stream_name, cursor_value_str, api_retention_period, retention_cutoff
)
return True

return False

def _parse_cursor_datetime(
self,
cursor_value: str,
incremental_sync: Any,
stream_name: str,
) -> Optional[datetime.datetime]:
"""Parse the cursor value into a datetime object using the cursor's datetime formats."""
parser = DatetimeParser()

datetime_format = getattr(incremental_sync, "datetime_format", None)
cursor_datetime_formats = getattr(incremental_sync, "cursor_datetime_formats", None) or []

formats_to_try = cursor_datetime_formats + ([datetime_format] if datetime_format else [])

for fmt in formats_to_try:
try:
return parser.parse(cursor_value, fmt)
except (ValueError, TypeError):
continue

logging.warning(
f"Could not parse cursor value '{cursor_value}' for stream '{stream_name}' "
f"using formats {formats_to_try}. Skipping cursor age validation."
)
return None

def _emit_warning_for_stale_cursor(
self,
stream_name: str,
cursor_value: str,
api_retention_period: str,
retention_cutoff: datetime.datetime,
) -> None:
"""Emit a warning message when the cursor is older than the API's retention period."""
warning_message = (
f"Stream '{stream_name}' has a cursor value '{cursor_value}' that is older than "
f"the API's retention period of {api_retention_period} (cutoff: {retention_cutoff.isoformat()}). "
f"Falling back to full refresh to avoid data loss. "
f"This may happen if a previous sync failed mid-way and the state was checkpointed."
)
logging.warning(warning_message)

def _create_async_job_status_mapping(
self, model: AsyncJobStatusMapModel, config: Config, **kwargs: Any
152 changes: 152 additions & 0 deletions unit_tests/sources/declarative/test_state_delegating_stream.py

@@ -2,7 +2,9 @@
 # Copyright (c) 2025 Airbyte, Inc., all rights reserved.
 #

+import copy
 import json
+import logging
 from unittest.mock import MagicMock

 import freezegun

@@ -253,3 +255,153 @@ def test_incremental_retriever():
{"id": 4, "name": "item_4", "updated_at": "2024-02-01"},
]
assert expected_incremental == incremental_records


def _create_manifest_with_retention_period(api_retention_period: str) -> dict:
"""Create a manifest with api_retention_period set on the StateDelegatingStream."""
manifest = copy.deepcopy(_MANIFEST)
manifest["definitions"]["TestStream"]["api_retention_period"] = api_retention_period
return manifest


@freezegun.freeze_time("2024-07-15")
def test_cursor_age_validation_falls_back_to_full_refresh_when_cursor_too_old():
"""Test that when cursor is older than retention period, full refresh is used."""
manifest = _create_manifest_with_retention_period("P7D")

with HttpMocker() as http_mocker:
http_mocker.get(
HttpRequest(url="https://api.test.com/items"),
HttpResponse(
body=json.dumps(
[
{"id": 1, "name": "item_1", "updated_at": "2024-07-13"},
{"id": 2, "name": "item_2", "updated_at": "2024-07-14"},
]
)
),
)

state = [
AirbyteStateMessage(
type=AirbyteStateType.STREAM,
stream=AirbyteStreamState(
stream_descriptor=StreamDescriptor(name="TestStream", namespace=None),
stream_state=AirbyteStateBlob(updated_at="2024-07-01"),
),
)
]
source = ConcurrentDeclarativeSource(
source_config=manifest, config=_CONFIG, catalog=None, state=state
)
configured_catalog = create_configured_catalog(source, _CONFIG)

records = get_records(source, _CONFIG, configured_catalog, state)
expected = [
{"id": 1, "name": "item_1", "updated_at": "2024-07-13"},
{"id": 2, "name": "item_2", "updated_at": "2024-07-14"},
]
assert expected == records


@freezegun.freeze_time("2024-07-15")
def test_cursor_age_validation_uses_incremental_when_cursor_within_retention():
"""Test that when cursor is within retention period, incremental sync is used."""
manifest = _create_manifest_with_retention_period("P30D")

with HttpMocker() as http_mocker:
http_mocker.get(
HttpRequest(
url="https://api.test.com/items_with_filtration?start=2024-07-13&end=2024-07-15"
),
HttpResponse(
body=json.dumps(
[
{"id": 3, "name": "item_3", "updated_at": "2024-07-14"},
]
)
),
)

state = [
AirbyteStateMessage(
type=AirbyteStateType.STREAM,
stream=AirbyteStreamState(
stream_descriptor=StreamDescriptor(name="TestStream", namespace=None),
stream_state=AirbyteStateBlob(updated_at="2024-07-13"),
),
)
]
source = ConcurrentDeclarativeSource(
source_config=manifest, config=_CONFIG, catalog=None, state=state
)
configured_catalog = create_configured_catalog(source, _CONFIG)

records = get_records(source, _CONFIG, configured_catalog, state)
expected = [
{"id": 3, "name": "item_3", "updated_at": "2024-07-14"},
]
assert expected == records


@freezegun.freeze_time("2024-07-15")
def test_cursor_age_validation_with_1_day_retention_falls_back():
"""Test cursor age validation with P1D retention period falls back to full refresh."""
manifest = _create_manifest_with_retention_period("P1D")

with HttpMocker() as http_mocker:
http_mocker.get(
HttpRequest(url="https://api.test.com/items"),
HttpResponse(body=json.dumps([{"id": 1, "updated_at": "2024-07-14"}])),
)

state = [
AirbyteStateMessage(
type=AirbyteStateType.STREAM,
stream=AirbyteStreamState(
stream_descriptor=StreamDescriptor(name="TestStream", namespace=None),
stream_state=AirbyteStateBlob(updated_at="2024-07-13"),
),
)
]
source = ConcurrentDeclarativeSource(
source_config=manifest, config=_CONFIG, catalog=None, state=state
)
configured_catalog = create_configured_catalog(source, _CONFIG)

records = get_records(source, _CONFIG, configured_catalog, state)
assert len(records) == 1


@freezegun.freeze_time("2024-07-15")
def test_cursor_age_validation_emits_warning_when_falling_back(caplog):
"""Test that a warning is emitted when cursor is older than retention period."""
manifest = _create_manifest_with_retention_period("P7D")

with HttpMocker() as http_mocker:
http_mocker.get(
HttpRequest(url="https://api.test.com/items"),
HttpResponse(body=json.dumps([{"id": 1, "updated_at": "2024-07-14"}])),
)

state = [
AirbyteStateMessage(
type=AirbyteStateType.STREAM,
stream=AirbyteStreamState(
stream_descriptor=StreamDescriptor(name="TestStream", namespace=None),
stream_state=AirbyteStateBlob(updated_at="2024-07-01"),
),
)
]

with caplog.at_level(logging.WARNING):
source = ConcurrentDeclarativeSource(
source_config=manifest, config=_CONFIG, catalog=None, state=state
)
configured_catalog = create_configured_catalog(source, _CONFIG)
get_records(source, _CONFIG, configured_catalog, state)

warning_messages = [r.message for r in caplog.records if r.levelno == logging.WARNING]
assert any(
"TestStream" in msg and "older than" in msg and "P7D" in msg for msg in warning_messages
), f"Expected warning about stale cursor not found. Warnings: {warning_messages}"
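The tests above exercise the same selection rule the factory applies. As a compact, hedged restatement (the function name and string return values are illustrative, not the CDK API):

```python
# Illustrative sketch of the stream-selection rule; not the CDK's actual code.
def choose_stream(stream_state: dict, has_parent_state: bool, cursor_is_stale: bool) -> str:
    if not stream_state and not has_parent_state:
        return "full_refresh_stream"  # first sync: no state at all
    if cursor_is_stale:
        return "full_refresh_stream"  # cursor older than retention: re-sync to avoid data loss
    return "incremental_stream"       # state present and within the retention window

print(choose_stream({}, False, False))                            # full_refresh_stream
print(choose_stream({"updated_at": "2024-07-13"}, False, False))  # incremental_stream
print(choose_stream({"updated_at": "2024-07-01"}, False, True))   # full_refresh_stream
```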