
Conversation

@levonpetrosyan93 (Contributor)

Fixes issue #1708 by addressing the actual reason why such a transaction was created.


coderabbitai bot commented Nov 16, 2025

Walkthrough

Replaced a per-block Spark spend limit check with a per-transaction spend limit check in CreateSparkSpendTransaction; tests adjusted to assert that oversized SparkSpend creations throw runtime errors instead of constructing blocks to reject them.

Changes

Cohort / File(s) | Summary

Spark spend enforcement (implementation) | src/spark/sparkwallet.cpp
CreateSparkSpendTransaction now uses consensusParams.GetMaxValueSparkSpendPerTransaction(nHeight) instead of GetMaxValueSparkSpendPerBlock(nHeight) to validate the spend value; the error path and message are unchanged.

Spark spend tests | src/test/spark_mintspend_test.cpp
Tests updated to expect GenerateSparkSpend to throw std::runtime_error for oversized spends (e.g., 1100 COIN, 3100 COIN); the related block construction/processing and mempool checks for those cases were removed. Other spend-size checks remain.
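A minimal sketch of the changed check, assuming simplified stand-ins for the consensus getters and the wallet's error path (the names follow the summary above, but the limit values and surrounding code are illustrative, not the real Firo implementation):

```cpp
#include <cstdint>
#include <stdexcept>

using CAmount = int64_t;
static const CAmount COIN = 100000000;

// Simplified stand-in for the consensus parameters; in the real code both
// getters are height-dependent and the returned limits are consensus values.
struct ConsensusParams {
    CAmount GetMaxValueSparkSpendPerTransaction(int /*nHeight*/) const {
        return 1000 * COIN; // illustrative per-transaction limit
    }
    CAmount GetMaxValueSparkSpendPerBlock(int /*nHeight*/) const {
        return 2000 * COIN; // illustrative; no longer consulted by this check
    }
};

// Sketch of the fixed validation in CreateSparkSpendTransaction: the spend
// value is now compared against the per-transaction limit rather than the
// per-block one, and the error message stays the same.
void CheckSparkSpendLimit(const ConsensusParams& params, int nHeight, CAmount spendValue) {
    if (spendValue > params.GetMaxValueSparkSpendPerTransaction(nHeight))
        throw std::runtime_error("Spend to transparent address limit exceeded.");
}
```

Because the limit is enforced at transaction-creation time, an oversized spend can never reach block construction at all, which is what let the tests drop their block-processing paths.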

Sequence Diagram(s)

sequenceDiagram
    autonumber
    participant Wallet as Wallet/CreateSparkSpendTransaction
    participant Consensus as ConsensusParams
    participant Result as Validation

    rect rgb(240,248,255)
      Wallet->>Consensus: GetMaxValueSparkSpendPerTransaction(nHeight)
      Consensus-->>Wallet: maxAllowed
      Wallet->>Result: if spendValue > maxAllowed
    end

    alt exceed limit
      Result-->>Wallet: throw "Spend to transparent address limit exceeded."
    else within limit
      Result-->>Wallet: proceed to build transaction
    end

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20–30 minutes

  • Inspect CreateSparkSpendTransaction for any other uses of per-block limits or related assumptions.
  • Verify unit tests' new exception expectations accurately cover consensus behavior and do not hide other failure modes.
  • Confirm no callers rely on the previous per-block semantics (wallet, mempool, or RPC flows).

Possibly related PRs

Suggested reviewers

  • psolstice

Poem

🐰 I hopped through code with tiny paws,
Checked each limit, read the laws,
Per-transaction now the rule to keep,
No more per-block counting sheep,
A little hop, a safer leap. 🥕

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)
Check name | Status | Explanation / Resolution

Description check | ⚠️ Warning | The pull request description is incomplete and lacks required information. It mentions fixing issue #1708 but does not follow the repository template structure. Resolution: add a 'PR intention' section explaining what the PR does (the fix from a per-block to a per-transaction spend limit) and optionally a 'Code changes brief' section with architectural details about the change.

Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, which is below the required threshold of 80.00%. You can run @coderabbitai generate docstrings to improve coverage.

✅ Passed checks (1 passed)

Title check | ✅ Passed | The title 'Spark to transparent spend limit fixed' clearly and concisely summarizes the main change: fixing the spend limit logic for Spark-to-transparent transactions.

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0df0586 and 108288d.

📒 Files selected for processing (1)
  • src/test/spark_mintspend_test.cpp (2 hunks)
🧰 Additional context used
🧠 Learnings (1)
📓 Common learnings
Learnt from: levonpetrosyan93
Repo: firoorg/firo PR: 1493
File: src/spark/sparkwallet.cpp:745-746
Timestamp: 2024-11-10T23:59:00.992Z
Learning: When validating memo fields in Spark transactions, use `get_memo_bytes()` from the Spark parameters to get the maximum allowed memo size.
🧬 Code graph analysis (1)
src/test/spark_mintspend_test.cpp (1)
src/test/fixtures.cpp (2)
  • GenerateSparkSpend (390-413)
  • GenerateSparkSpend (390-393)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (16)
  • GitHub Check: build-guix-x86_64-apple-darwin
  • GitHub Check: build-guix-arm64-apple-darwin
  • GitHub Check: build-guix-aarch64-linux-gnu
  • GitHub Check: build-guix-x86_64-w64-mingw32
  • GitHub Check: build-guix-x86_64-linux-gnu
  • GitHub Check: build-mac-cmake
  • GitHub Check: build-windows-cmake
  • GitHub Check: build-linux-cmake
  • GitHub Check: build-guix-x86_64-apple-darwin
  • GitHub Check: build-guix-arm64-apple-darwin
  • GitHub Check: build-guix-x86_64-w64-mingw32
  • GitHub Check: build-guix-aarch64-linux-gnu
  • GitHub Check: build-guix-x86_64-linux-gnu
  • GitHub Check: build-windows-cmake
  • GitHub Check: build-mac-cmake
  • GitHub Check: build-linux-cmake
🔇 Additional comments (2)
src/test/spark_mintspend_test.cpp (2)

188-190: Correctly tests updated per-transaction limit after block 1500.

The test appropriately validates that the higher per-transaction limit (3100 COIN) is enforced after the block height threshold, using the same pattern as the earlier limit test. The logic is sound and consistent.


176-178: Verification complete: per-transaction limit enforcement correctly tested.

The call chain GenerateSparkSpend → SpendAndStoreSpark → CreateSparkSpendTransaction has been verified. Line 1319 in src/spark/sparkwallet.cpp confirms that CreateSparkSpendTransaction throws std::runtime_error with the message "Spend to transparent address limit exceeded." when the per-transaction limit is exceeded. The exception propagates without being caught, and the test correctly validates this behavior with BOOST_CHECK_THROW, followed by mempool verification.
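The expectation the updated tests encode can be illustrated stand-alone. This is plain C++ with a try/catch in place of BOOST_CHECK_THROW; GenerateSparkSpendSketch, the explicit limit argument, and the 1000/3000 COIN limits are simplified stand-ins, not the real fixture API or the actual consensus values:

```cpp
#include <cstdint>
#include <stdexcept>

using CAmount = int64_t;
static const CAmount COIN = 100000000;

// Hypothetical stand-in for the fixture's GenerateSparkSpend: throws the same
// std::runtime_error that CreateSparkSpendTransaction raises when the
// per-transaction limit is exceeded. The limit is passed explicitly here;
// in the real wallet it comes from the height-dependent consensus params.
void GenerateSparkSpendSketch(CAmount value, CAmount perTxLimit) {
    if (value > perTxLimit)
        throw std::runtime_error("Spend to transparent address limit exceeded.");
}

// Plain-C++ equivalent of BOOST_CHECK_THROW(expr, std::runtime_error):
// returns true iff the call throws std::runtime_error.
bool ThrowsRuntimeError(CAmount value, CAmount perTxLimit) {
    try {
        GenerateSparkSpendSketch(value, perTxLimit);
    } catch (const std::runtime_error&) {
        return true;
    }
    return false;
}
```

In this sketch, a 1100 COIN spend against a 1000 COIN limit throws while a spend exactly at the limit does not, mirroring the exception expectations the tests now assert instead of building blocks to reject.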



@aleflm aleflm left a comment


LGTM. Thanks

@reubenyap reubenyap merged commit 3d8940f into master Dec 3, 2025
21 checks passed
@reubenyap reubenyap deleted the limit_fixed branch December 3, 2025 10:38
aleflm pushed a commit to navidR/firo that referenced this pull request Dec 15, 2025
* Spark to transparent spend limit fixed

* Unittest fixed
