unified sendtoaddress rpc #1700
Conversation
Walkthrough
Enhances wallet RPCs: adds ValidateFeeSubtractionAmount(), extends sendtoaddress to accept simple and JSON-object multi-address formats, and introduces sendtransparent, sendspark, and sendsparkmany. Adds comprehensive RPC tests covering transparent, Spark, and BIP47/RAP flows with per-address fee subtraction and metadata handling.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant RPC as sendtoaddress / sendtransparent
    participant Validator as Fee & Balance Validator
    participant Spark as Spark Resolver / Handler
    participant TxBuilder as Transaction Builder
    participant Wallet
    User->>RPC: Request (simple or JSON multi-dest)
    alt single simple format
        RPC->>RPC: Parse single address+flags
    else JSON multi-address
        RPC->>RPC: Parse destinations array (amount, subtractFee, memo, comment, comment_to)
    end
    RPC->>Validator: Validate per-address fee subtraction (ValidateFeeSubtractionAmount)
    Validator-->>RPC: ok / error
    alt Spark / BIP47 address present
        RPC->>Spark: Resolve name / handle RAP/BIP47 / ensure unlock
        Spark-->>RPC: mint/spend decisions
    end
    RPC->>Validator: Check combined balances across types
    Validator-->>RPC: ok / insufficient funds
    RPC->>TxBuilder: Build transaction (apply per-output fee subtraction)
    TxBuilder->>Wallet: Sign and commit
    Wallet-->>RPC: txid
    RPC-->>User: txid and aggregated metadata
```
```mermaid
sequenceDiagram
    participant User
    participant sendspark as sendspark / sendsparkmany
    participant SparkWallet as Spark Wallet
    participant BalanceCheck as Balance Checker
    participant TxBuilder as Tx Builder
    User->>sendspark: destinations + per-dest fee flags
    sendspark->>SparkWallet: Ensure unlocked / prepare mint inputs
    sendspark->>BalanceCheck: Verify Spark balance covers total
    BalanceCheck-->>sendspark: ok / fail
    loop per destination
        sendspark->>TxBuilder: add output (apply fee subtraction if enabled)
    end
    TxBuilder->>SparkWallet: Build/sign Spark mint/spend tx
    SparkWallet-->>sendspark: txid
    sendspark-->>User: txid and per-output info
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
Possibly related PRs
Suggested reviewers
Poem
Pre-merge checks and finishing touches
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Force-pushed 3424c1b to 9a0adca (Compare)
@coderabbitai review
✅ Actions performed: Review triggered.
Actionable comments posted: 24
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (6)
src/leveldb/include/leveldb/env.h (1)
327-386: EnvWrapper doesn’t forward Shutdown(); calls may silently no-op
New virtual Env::Shutdown() isn’t forwarded by EnvWrapper. If any code uses an EnvWrapper, Shutdown() will hit the base no-op and skip the underlying env’s shutdown logic.
Apply this diff to forward Shutdown():
```diff
 class LEVELDB_EXPORT EnvWrapper : public Env {
  public:
   // Initialize an EnvWrapper that delegates all calls to *t.
   explicit EnvWrapper(Env* t) : target_(t) {}
   virtual ~EnvWrapper();
@@
   void SleepForMicroseconds(int micros) override {
     target_->SleepForMicroseconds(micros);
   }
+
+  void Shutdown() override {
+    target_->Shutdown();
+  }
```
src/qt/bitcoin.cpp (1)
475-502: Fix comment/implementation mismatch and remove redundant window display call.
Line 497 contains a documentation bug: the comment states "Delay shutdown signal by 500 milliseconds" but the code implements a 1000ms delay via QTimer::singleShot(1000, ...).
Additionally, the showCloseWindow() call at line 487 is redundant. ShutdownWindow::showShutdownWindow() (line 480) already calls show() on the window before returning it, so the subsequent call to showCloseWindow() (which simply calls shutdownWindow->show() again) has no effect and should be removed.
Recommended changes:
- Update the line 497 comment to: // Delay shutdown signal by 1000 milliseconds
- Remove the showCloseWindow(); call at line 487
307-365: Shortcuts are constructed incorrectly: "Alt+49" instead of "Alt+1", and macOS should prefer Cmd+number.
QString("Alt+%1").arg(key++) with key initialized to Qt::Key_1 yields "Alt+49/50…", which Qt won't parse as digit shortcuts. Build them from Qt key enums instead; also prefer Meta (Cmd) on macOS.
```diff
- size_t key = Qt::Key_1;
+ int key = Qt::Key_1;
@@
- overviewAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
+#ifdef Q_OS_MAC
+ overviewAction->setShortcut(QKeySequence(QKeyCombination(Qt::MetaModifier, static_cast<Qt::Key>(key++))));
+#else
+ overviewAction->setShortcut(QKeySequence(Qt::ALT | static_cast<Qt::Key>(key++)));
+#endif
@@
- sendCoinsAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
+#ifdef Q_OS_MAC
+ sendCoinsAction->setShortcut(QKeySequence(QKeyCombination(Qt::MetaModifier, static_cast<Qt::Key>(key++))));
+#else
+ sendCoinsAction->setShortcut(QKeySequence(Qt::ALT | static_cast<Qt::Key>(key++)));
+#endif
@@
- receiveCoinsAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
+#ifdef Q_OS_MAC
+ receiveCoinsAction->setShortcut(QKeySequence(QKeyCombination(Qt::MetaModifier, static_cast<Qt::Key>(key++))));
+#else
+ receiveCoinsAction->setShortcut(QKeySequence(Qt::ALT | static_cast<Qt::Key>(key++)));
+#endif
@@
- historyAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
+#ifdef Q_OS_MAC
+ historyAction->setShortcut(QKeySequence(QKeyCombination(Qt::MetaModifier, static_cast<Qt::Key>(key++))));
+#else
+ historyAction->setShortcut(QKeySequence(Qt::ALT | static_cast<Qt::Key>(key++)));
+#endif
@@
- lelantusAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
+#ifdef Q_OS_MAC
+ lelantusAction->setShortcut(QKeySequence(QKeyCombination(Qt::MetaModifier, static_cast<Qt::Key>(key++))));
+#else
+ lelantusAction->setShortcut(QKeySequence(Qt::ALT | static_cast<Qt::Key>(key++)));
+#endif
@@
-#ifdef Q_OS_MAC
- masternodeAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
-#else
- masternodeAction->setShortcut(QKeySequence(QString("Alt+%1").arg(key++)));
-#endif
+#ifdef Q_OS_MAC
+ masternodeAction->setShortcut(QKeySequence(QKeyCombination(Qt::MetaModifier, static_cast<Qt::Key>(key++))));
+#else
+ masternodeAction->setShortcut(QKeySequence(Qt::ALT | static_cast<Qt::Key>(key++)));
+#endif
```
README.md (2)
110-114: Debian/Ubuntu GUI deps reference Qt5; align to Qt6 (or clarify when using depends)
If GUI uses Qt6, switch to Qt6 packages or note that depends builds Qt itself and system Qt is unnecessary.
```diff
-sudo apt-get install python git curl build-essential cmake pkg-config
-# Also needed for GUI wallet only:
-sudo apt-get install qttools5-dev qttools5-dev-tools libxcb-xkb-dev bison
+sudo apt-get install python git curl build-essential cmake pkg-config
+# If building GUI against system Qt (not via depends), install Qt6:
+# sudo apt-get install qt6-base-dev qt6-tools-dev qt6-tools-dev-tools libxcb-xkb-dev bison
```
242-247: Prefix example: avoid installing to /usr/bin as prefix
The prefix should be a directory root like /usr, not /usr/bin.
```diff
-./FiroCore-0.14.14-Linux.sh --prefix=/usr/bin --exclude-subdir
+./FiroCore-0.14.14-Linux.sh --prefix=/usr --exclude-subdir
```
src/wallet/rpcwallet.cpp (1)
74-79: Misleading error message in EnsureSparkWalletIsAvailable.
Message mentions "lelantus mint/joinsplit" instead of Spark.
```diff
- throw JSONRPCError(RPC_WALLET_ERROR, "lelantus mint/joinsplit is not allowed for legacy wallet");
+ throw JSONRPCError(RPC_WALLET_ERROR, "Spark is not allowed for legacy wallet");
```
🧹 Nitpick comments (34)
src/init.cpp (1)
45-45: Optional: dedupe repeated includes
netfulfilledman.h and flat-database.h appear twice; remove duplicates to speed up builds slightly and reduce noise.
src/leveldb/util/env_windows.cc (2)
750-769: Prevent new work from being queued after Shutdown()
Schedule() still accepts tasks post-shutdown, which can keep the thread alive indefinitely if producers enqueue late. Gate Schedule() after shutdown.
```diff
 void WindowsEnv::Schedule(
     void (*background_work_function)(void* background_work_arg),
     void* background_work_arg) {
   background_work_mutex_.Lock();
+  // Drop new work after shutdown is requested.
+  if (shutdown_requested_) {
+    background_work_mutex_.Unlock();
+    return;
+  }
+
   // Start the background thread, if we haven't done so already.
   if (!started_background_thread_) {
     started_background_thread_ = true;
     std::thread background_thread(WindowsEnv::BackgroundThreadEntryPoint, this);
     background_thread.detach();
   }
@@
   background_work_queue_.emplace(background_work_function, background_work_arg);
   background_work_mutex_.Unlock();
 }
```
771-792: Document/adjust drain vs abort behavior
The worker exits only when the queue is empty. If you want "abort immediately," add a check before popping to clear the queue or return early. If "drain then exit" is desired, add a brief comment here to codify the intent.
src/leveldb/util/env_posix.cc (2)
800-819: Prevent enqueue after Shutdown()
Mirror the Windows gating so no new background tasks are accepted after shutdown is requested.
```diff
 void PosixEnv::Schedule(
     void (*background_work_function)(void* background_work_arg),
     void* background_work_arg) {
   background_work_mutex_.Lock();
+  // Drop new work after shutdown is requested.
+  if (shutdown_requested_) {
+    background_work_mutex_.Unlock();
+    return;
+  }
+
   // Start the background thread, if we haven't done so already.
   if (!started_background_thread_) {
     started_background_thread_ = true;
     std::thread background_thread(PosixEnv::BackgroundThreadEntryPoint, this);
     background_thread.detach();
   }
@@
   background_work_queue_.emplace(background_work_function, background_work_arg);
   background_work_mutex_.Unlock();
 }
```
821-842: Clarify drain semantics
Like Windows, this drains queued items then exits. If that’s intentional, add a comment; else, bail out even when the queue isn’t empty.
contrib/guix/libexec/build.sh (2)
368-378: Darwin env: avoid flip-flopping include/CC vars
You export include paths globally (Lines 80-84) then unset them here for Darwin; also unset CC/CXX/CPP. Prefer not exporting those for Darwin in the first place to reduce surprises; keep toolchain.cmake in control.
Supporting change outside this hunk (gate early exports to non‑Darwin):
```diff
-export C_INCLUDE_PATH="${NATIVE_GCC}/include"
-export CPLUS_INCLUDE_PATH="${NATIVE_GCC}/include/c++:${NATIVE_GCC}/include"
-export OBJC_INCLUDE_PATH="${NATIVE_GCC}/include"
-export OBJCPLUS_INCLUDE_PATH="${NATIVE_GCC}/include/c++:${NATIVE_GCC}/include"
+if [[ "$HOST" != *darwin* ]]; then
+  export C_INCLUDE_PATH="${NATIVE_GCC}/include"
+  export CPLUS_INCLUDE_PATH="${NATIVE_GCC}/include/c++:${NATIVE_GCC}/include"
+  export OBJC_INCLUDE_PATH="${NATIVE_GCC}/include"
+  export OBJCPLUS_INCLUDE_PATH="${NATIVE_GCC}/include/c++:${NATIVE_GCC}/include"
+fi
```
Please also confirm the toolchain file picks the intended compilers on Darwin once CC/CXX are unset.
383-383: Deduplicate linker flags passed to CMake
You supply LDFLAGS via env and also via -DCMAKE_{EXE,SHARED}_LINKER_FLAGS, which can cause duplicated/ordering issues. Pick one. Suggest removing the env LDFLAGS here.
```diff
-env PKG_CONFIG_PATH="${BASEPREFIX}/${HOST}/lib/pkgconfig:$PKG_CONFIG_PATH" CFLAGS="${HOST_CFLAGS}" CXXFLAGS="${HOST_CXXFLAGS}" OBJCXXFLAGS="${HOST_OBJCXXFLAGS}" LDFLAGS="${HOST_LDFLAGS}" \
+env PKG_CONFIG_PATH="${BASEPREFIX}/${HOST}/lib/pkgconfig:$PKG_CONFIG_PATH" CFLAGS="${HOST_CFLAGS}" CXXFLAGS="${HOST_CXXFLAGS}" OBJCXXFLAGS="${HOST_OBJCXXFLAGS}" \
```
src/qt/guiutil.cpp (1)
310-319: Make suffix extraction regex simpler and stricter
Avoid relying on inverted greediness; match the first *.<ext> directly and capture <ext>.
```diff
- QRegularExpression filter_re(QStringLiteral(".* \\(\\*\\.(.*)[ \\)]"), QRegularExpression::InvertedGreedinessOption);
+ static const QRegularExpression filter_re(
+     QStringLiteral(R"(\*\.\s*([^\s\)]+))"),
+     QRegularExpression::UseUnicodePropertiesOption);
```
depends/patches/qt/qtbase_avoid_native_float16.patch (1)
12-37: Scope the float16 workaround to where it’s needed
Unconditionally disabling native float16 can regress performance/ABI on non-Linux targets. Consider scoping to Linux (or a build macro) and retaining upstream detection elsewhere. At minimum, document clearly in the patch header when to drop it.
Dockerfile (1)
22-25: Consider using ctest instead of make test.
The build sequence looks correct. However, line 24 uses make test to run tests. Consider using the more portable CMake-native approach:
```diff
- cd build && make test && \
+ ctest --test-dir build --output-on-failure && \
```
This is more idiomatic and doesn't assume a Make generator.
Note: The pipeline failure reported at line 19 is in the depends build step, not in these CMake commands.
src/qt/bitcoin.cpp (1)
504-506: Consider adding a null check.
For defensive coding, consider checking shutdownWindow before dereferencing:
```diff
 void BitcoinApplication::showCloseWindow(){
+    if (!shutdownWindow) return;
     shutdownWindow->show();
 }
```
This prevents potential crashes if showCloseWindow() is called before shutdownWindow is initialized or after it's reset.
depends/patches/qt/qttools_skip_dependencies.patch (1)
3-4: Typo in patch header.“installatiion” → “installation” for clarity in patch notes.
depends/Makefile (1)
275-276: Install flow switched to toolchain.cmake.
Looks good and aligns with CMake-first builds. Consider updating any docs/scripts that mentioned config.site.
depends/packages/native_qt.mk (2)
42-91: Configure options look lean; consider explicitly disabling tests/examples.
To keep builds minimal and deterministic, keep -no-feature-testlib (already present) and ensure tests/examples are fully off: -no-feature-example-compat (if applicable) plus CMake -DQT_BUILD_TESTS=OFF -DQT_BUILD_EXAMPLES=OFF.
```diff
 $(package)_cmake_opts := -DCMAKE_EXE_LINKER_FLAGS="$$(build_LDFLAGS)"
+$(package)_cmake_opts += -DQT_BUILD_TESTS=OFF -DQT_BUILD_EXAMPLES=OFF
```
48-59: Static tools portability.
-static is great for hermeticity but can cause portability issues on some hosts (e.g., glibc static). If not strictly required, consider dropping -static or gating by host OS.
Also applies to: 72-89
src/qt/CMakeLists.txt (2)
156-159: SOURCE path should match the path added to the target.
Use the same absolute path you add in target_sources so the property applies reliably.
```diff
-set_property(SOURCE macnotificationhandler.mm
+set_property(SOURCE ${CMAKE_CURRENT_SOURCE_DIR}/macnotificationhandler.mm
   # Ignore warnings "'NSUserNotificationCenter' is deprecated: first deprecated in macOS 11.0".
   APPEND PROPERTY COMPILE_OPTIONS -Wno-deprecated-declarations
 )
```
166-188: Minor: duplicate Qt6::Network links.
Qt6::Network is linked both PUBLIC (firoqt) and PRIVATE (wallet ON). One is enough; keep the PUBLIC link.
```diff
 target_link_libraries(firoqt PUBLIC
   leveldb
   secp256k1
   univalue
   secp256k1pp
   Qt6::Widgets
-  Qt6::Network
+  Qt6::Network
@@
 target_link_libraries(firoqt PRIVATE
   firo_wallet
-  Qt6::Network
 )
```
Also applies to: 230-235
depends/patches/native_qt/qtbase_avoid_native_float16.patch (1)
16-35: Patch verification complete; consider adding removal condition documentation.
The patch is correctly applied to Qt 6.7.3 and verified safe:
- Qt 6.7.3 introduced the problematic _Float16 type causing ABI issues on platforms without SSE2
- Patch structure matches Qt 6.7.3's qtypes.h layout (line 258, namespace QtPrivate)
- No local codebase dependencies on QFLOAT16_IS_NATIVE or NativeFloat16Type found
- Approach mirrors Qt's internal fallback and matches downstream distributions (e.g., Gentoo)
The suggested documentation note about dropping this patch once minimum libgcc ≥ 12.x is worthwhile; consider adding it to depends/README.md or a comment in depends/packages/qt_details.mk to guide future maintainers.
src/qt/notificator.cpp (2)
185-207: Consider adding "image-data" hint for broader compatibility
Some desktop notification servers prefer the "image-data" hint. Keeping "icon_data" is fine; consider setting both keys to the same variant for maximum interoperability.
```diff
- hints["icon_data"] = FreedesktopImage::toVariant(tmpicon.pixmap(FREEDESKTOP_NOTIFICATION_ICON_SIZE).toImage());
+ auto v = FreedesktopImage::toVariant(tmpicon.pixmap(FREEDESKTOP_NOTIFICATION_ICON_SIZE).toImage());
+ hints["icon_data"] = v;
+ hints["image-data"] = v; // optional compatibility
```
35-36: Tiny modernizations: defaulted ctor and nullptr
No behavior change; improves readability and modern C++ usage.
```diff
- FreedesktopImage() {}
+ FreedesktopImage() = default;
@@
- ,interface(0)
+ ,interface(nullptr)
```
Also applies to: 69-70
depends/packages/libxcb_util_cursor.mk (1)
8-31: Configure flags look fine; optional: disable silent rules for easier CI logs
Not required, but helps debugging.
```diff
 $(package)_config_opts = --enable-static --disable-shared
-$(package)_config_opts += --disable-dependency-tracking --enable-option-checking
+$(package)_config_opts += --disable-dependency-tracking --enable-option-checking --disable-silent-rules
```
README.md (2)
166-172: Prefer ctest for running tests with CMake
Use ctest for proper reporting and parallelization.
```diff
-cmake -B build --toolchain depends/$(depends/config.guess)/toolchain.cmake -DBUILD_TESTS=ON
-cmake --build build -j$(nproc)
-cd build && make test
+cmake -B build --toolchain depends/$(depends/config.guess)/toolchain.cmake -DBUILD_TESTS=ON
+cmake --build build -j$(nproc)
+cd build && ctest --output-on-failure
```
196-205: Simplify cross-compilation headers
There are two adjacent headers ("Supported Cross-Compilation Targets" and "## Cross-Compilation"). Merge to one to avoid duplication.
```diff
-### Supported Cross-Compilation Targets
-
-| Host Target | Platform |
-|--------------------------|---------------------------|
-## Cross-Compilation
+## Cross-Compilation
```
src/qt/automintmodel.cpp (1)
95-97: Use size_t for vout index to match container size type
Prevents signed/width mismatches.
```diff
- for (uint32_t i = 0; i != it->second.tx->vout.size(); ++i) {
+ for (size_t i = 0; i != it->second.tx->vout.size(); ++i) {
```
depends/funcs.mk (2)
143-143: Use $(build_TAR) wrapper for extraction for portability.
Hard-coding tar drops the abstraction used elsewhere and can break on environments without GNU tar.
```diff
-$(1)_extract_cmds ?= mkdir -p $$($(1)_extract_dir) && echo "$$($(1)_sha256_hash)  $$($(1)_source)" > $$($(1)_extract_dir)/.$$($(1)_file_name).hash && $(build_SHA256SUM) -c $$($(1)_extract_dir)/.$$($(1)_file_name).hash && tar --no-same-owner --strip-components=1 -xf $$($(1)_source)
+$(1)_extract_cmds ?= mkdir -p $$($(1)_extract_dir) && echo "$$($(1)_sha256_hash)  $$($(1)_source)" > $$($(1)_extract_dir)/.$$($(1)_file_name).hash && $(build_SHA256SUM) -c $$($(1)_extract_dir)/.$$($(1)_file_name).hash && $(build_TAR) --no-same-owner --strip-components=1 -xf $$($(1)_source)
```
45-74: Large commented blocks left in the build macros.
Dead/commented code (local-dir fetch, alt properties, logging hook, includes) adds noise and risks drift.
Please delete or gate under a feature flag (e.g., ifdef USE_LOCAL_SOURCES) to keep funcs.mk lean.
Also applies to: 110-125, 149-158, 370-372, 285-288, 107-109
src/wallet/rpcwallet.cpp (4)
4612-4624: Spark address detection in sendsparkmany is brittle.
Prefix checks ("st", "@") miss valid Spark prefixes like "sr1…". Use decode like elsewhere.
```diff
- bool isSparkAddress = false;
- if (name_.length() >= 2 && name_.substr(0, 2) == "st") {
-     isSparkAddress = true;
- } else if (!name_.empty() && name_[0] == '@') {
-     isSparkAddress = true; // Spark name
- }
+ bool isSparkAddress = false;
+ {
+     const spark::Params* sp = spark::Params::get_default();
+     spark::Address tmp(sp);
+     try { tmp.decode(name_[0]=='@' ? name_.substr(1) : name_); isSparkAddress = true; } catch (...) {}
+ }
```
This aligns with sendtoaddress detection and avoids false negatives.
524-568: Help: parameter naming inconsistency ("comment-to" vs "comment_to").
Docs mix hyphen and underscore. Align with JSON key ("comment_to") and mapValue["to"] for consistency.
Please update the simple-format help to use “comment_to”.
941-947: Debug logs include full destination addresses and memos.
These logs can leak sensitive data in node logs.
- Gate with a debug category or verbosity flag.
- Avoid logging full addresses; use shortened/hashed forms.
Example:
```diff
-LogPrintf("Recipient %zu: %s for amount %s\n", i, outputs[i].address.encode(...), ...)
+LogPrint("spark", "Recipient %zu: *** for amount %s\n", i, FormatMoney(outputs[i].v));
```
Also applies to: 941-947, 1041-1056, 4612-4644, 667-671, 701-706, 950-955
786-790: "subtractFee" handling UX: default to false instead of erroring when omitted.
Currently, some paths throw if subtractFee is absent. Consider defaulting to false for both transparent and Spark recipients.
```diff
- if (amountObj.exists("subtractFee"))
-     fSubtractFeeFromAmount = amountObj["subtractFee"].get_bool();
- else
-     throw JSONRPCError(..., "Invalid parameters, no subtractFee: ");
+ fSubtractFeeFromAmount = amountObj.exists("subtractFee") ? amountObj["subtractFee"].get_bool() : false;
```
Apply similarly for Spark recipient parsing.
Also applies to: 1016-1020, 4421-4426
depends/packages/qt.mk (4)
52-59: Duplicate/contradictory flags; simplify to avoid confusion.
Remove duplicates and rely on global flags unless OS-specific overrides are required.
```diff
-$(package)_config_opts = -no-egl
+$(package)_config_opts =
@@
-$(package)_config_opts += -no-egl
@@
-$(package)_config_opts += -no-zstd
@@
-$(package)_config_opts += -no-opengl
@@
-$(package)_config_opts += -no-feature-printsupport
@@
-$(package)_config_opts_darwin += -no-opengl
@@
-$(package)_config_opts_linux += -fontconfig
-$(package)_config_opts_linux += -no-opengl
-$(package)_config_opts_linux += -no-feature-vulkan
-$(package)_config_opts_linux += -dbus-runtime
+$(package)_config_opts_linux += -no-feature-vulkan
```
Notes:
- Keep one of -no-egl, one of -no-zstd.
- Keep global -no-opengl; drop OS-specific duplicates unless you need per‑OS divergence.
- In linux opts, remove the repeated -fontconfig and -dbus-runtime.
- printsupport is already globally disabled; drop the darwin duplicate.
Also applies to: 64-65, 72-73, 120-122, 139-141, 166-167, 170-171, 177-180
12-22: Patches declared vs applied diverge; unify to avoid drift.
You declare $(package)_patches but hardcode patch -p1 calls (and apply qttools_skip_dependencies only on native). Consider driving preprocess from $(package)_patches with conditional logic to keep the list single-sourced.
```diff
-define $(package)_preprocess_cmds
-  patch -p1 -i $($(package)_patch_dir)/dont_hardcode_pwd.patch && \
-  patch -p1 -i $($(package)_patch_dir)/qtbase-moc-ignore-gcc-macro.patch && \
-  patch -p1 -i $($(package)_patch_dir)/qtbase_avoid_native_float16.patch && \
-  patch -p1 -i $($(package)_patch_dir)/qtbase_avoid_qmain.patch && \
-  patch -p1 -i $($(package)_patch_dir)/qtbase_platformsupport.patch && \
-  patch -p1 -i $($(package)_patch_dir)/qtbase_plugins_cocoa.patch && \
-  patch -p1 -i $($(package)_patch_dir)/qtbase_skip_tools.patch && \
-  patch -p1 -i $($(package)_patch_dir)/rcc_hardcode_timestamp.patch
-endef
-ifeq ($(host),$(build))
-  $(package)_preprocess_cmds += && patch -p1 -i $($(package)_patch_dir)/qttools_skip_dependencies.patch
-endif
+define $(package)_preprocess_cmds
+  set -e; \
+  for p in $($(package)_patches); do \
+    case "$$p" in \
+      qttools_skip_dependencies.patch) [ "$(host)" = "$(build)" ] || continue ;; \
+    esac; \
+    patch -p1 -i "$($(package)_patch_dir)/$$p"; \
+  done
+endef
```
Also applies to: 329-342
240-247: Guix block mutates "_darwin" env for non-darwin hosts.
Likely a noop (since _darwin env isn’t used), but confusing. Either target $(package)_config_env here or add a short comment.
196-205: Redundant xcb enables; keep one mechanism.
You enable xcb via both CMake (-DQT_FEATURE_xcb=ON) and configure (-feature-xcb). One is enough; prefer configure’s -feature-xcb for symmetry with other flags.
```diff
-$(package)_cmake_opts += -DQT_FEATURE_xcb=ON
-endif
+endif
@@
 $(package)_config_opts_linux += -feature-xcb
```
(Keep only the configure flag or only the CMake option; remove the other.)
Also applies to: 236-238, 171-175, 181-182
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
-
.github/workflows/ci-master.ymlis excluded by!**/*.yml
📒 Files selected for processing (101)
-
CMakeLists.txt(1 hunks) -
Dockerfile(2 hunks) -
Makefile.am(0 hunks) -
README.md(7 hunks) -
autogen.sh(0 hunks) -
build-aux/m4/ax_boost_base.m4(0 hunks) -
build-aux/m4/ax_boost_chrono.m4(0 hunks) -
build-aux/m4/ax_boost_filesystem.m4(0 hunks) -
build-aux/m4/ax_boost_program_options.m4(0 hunks) -
build-aux/m4/ax_boost_system.m4(0 hunks) -
build-aux/m4/ax_boost_thread.m4(0 hunks) -
build-aux/m4/ax_boost_unit_test_framework.m4(0 hunks) -
build-aux/m4/ax_check_compile_flag.m4(0 hunks) -
build-aux/m4/ax_check_link_flag.m4(0 hunks) -
build-aux/m4/ax_check_preproc_flag.m4(0 hunks) -
build-aux/m4/ax_cxx_compile_stdcxx.m4(0 hunks) -
build-aux/m4/ax_gcc_func_attribute.m4(0 hunks) -
build-aux/m4/ax_pthread.m4(0 hunks) -
build-aux/m4/ax_subdirs_configure.m4(0 hunks) -
build-aux/m4/bitcoin_find_bdb48.m4(0 hunks) -
build-aux/m4/bitcoin_qt.m4(0 hunks) -
build-aux/m4/bitcoin_subdir_to_include.m4(0 hunks) -
build-aux/m4/l_atomic.m4(0 hunks) -
cmake/module/FindQt.cmake(0 hunks) -
configure.ac(0 hunks) -
contrib/debian/changelog(1 hunks) -
contrib/debian/rules(1 hunks) -
contrib/guix/libexec/build.sh(1 hunks) -
contrib/rpm/README.md(1 hunks) -
contrib/rpm/bitcoin.spec(3 hunks) -
depends/Makefile(3 hunks) -
depends/config.site.in(1 hunks) -
depends/funcs.mk(7 hunks) -
depends/hosts/darwin.mk(1 hunks) -
depends/packages/libxcb_util_cursor.mk(1 hunks) -
depends/packages/native_qt.mk(1 hunks) -
depends/packages/packages.mk(1 hunks) -
depends/packages/qt.mk(2 hunks) -
depends/packages/qt_details.mk(1 hunks) -
depends/packages/zeromq.mk(1 hunks) -
depends/patches/native_qt/dont_hardcode_pwd.patch(1 hunks) -
depends/patches/native_qt/qtbase-moc-ignore-gcc-macro.patch(1 hunks) -
depends/patches/native_qt/qtbase_avoid_native_float16.patch(1 hunks) -
depends/patches/native_qt/qtbase_skip_tools.patch(1 hunks) -
depends/patches/native_qt/qttools_skip_dependencies.patch(1 hunks) -
depends/patches/native_qt/rcc_hardcode_timestamp.patch(1 hunks) -
depends/patches/qt/dont_hardcode_pwd.patch(1 hunks) -
depends/patches/qt/memory_resource.patch(0 hunks) -
depends/patches/qt/qtbase-moc-ignore-gcc-macro.patch(1 hunks) -
depends/patches/qt/qtbase_avoid_native_float16.patch(1 hunks) -
depends/patches/qt/qtbase_avoid_qmain.patch(1 hunks) -
depends/patches/qt/qtbase_platformsupport.patch(1 hunks) -
depends/patches/qt/qtbase_plugins_cocoa.patch(1 hunks) -
depends/patches/qt/qtbase_skip_tools.patch(1 hunks) -
depends/patches/qt/qttools_skip_dependencies.patch(1 hunks) -
depends/patches/qt/rcc_hardcode_timestamp.patch(1 hunks) -
depends/toolchain.cmake.in(3 hunks) -
doc/build-unix.md(5 hunks) -
doc/man/Makefile.am(0 hunks) -
libbitcoinconsensus.pc.in(0 hunks) -
qa/rpc-tests/wallet.py(1 hunks) -
src/Makefile.am(0 hunks) -
src/crypto/MerkleTreeProof/blake2/blake2.h(1 hunks) -
src/crypto/MerkleTreeProof/blake2/blake2b.c(1 hunks) -
src/init.cpp(2 hunks) -
src/leveldb/include/leveldb/env.h(1 hunks) -
src/leveldb/util/env.cc(1 hunks) -
src/leveldb/util/env_posix.cc(3 hunks) -
src/leveldb/util/env_windows.cc(3 hunks) -
src/qt/CMakeLists.txt(8 hunks) -
src/qt/automintmodel.cpp(2 hunks) -
src/qt/bitcoin.cpp(4 hunks) -
src/qt/bitcoinamountfield.cpp(1 hunks) -
src/qt/bitcoingui.cpp(8 hunks) -
src/qt/clientmodel.cpp(3 hunks) -
src/qt/coincontroldialog.cpp(1 hunks) -
src/qt/guiutil.cpp(5 hunks) -
src/qt/notificator.cpp(3 hunks) -
src/qt/recentrequeststablemodel.cpp(1 hunks) -
src/qt/recentrequeststablemodel.h(1 hunks) -
src/qt/rpcconsole.cpp(3 hunks) -
src/qt/sendcoinsentry.cpp(2 hunks) -
src/qt/splashscreen.cpp(1 hunks) -
src/qt/test/CMakeLists.txt(1 hunks) -
src/qt/transactionfilterproxy.cpp(1 hunks) -
src/qt/transactiontablemodel.cpp(2 hunks) -
src/qt/utilitydialog.cpp(2 hunks) -
src/qt/winshutdownmonitor.cpp(1 hunks) -
src/qt/winshutdownmonitor.h(1 hunks) -
src/rpc/client.cpp(1 hunks) -
src/secp256k1/Makefile.am(0 hunks) -
src/secp256k1/build-aux/m4/ax_jni_include_dir.m4(0 hunks) -
src/secp256k1/build-aux/m4/ax_prog_cc_for_build.m4(0 hunks) -
src/secp256k1/build-aux/m4/bitcoin_secp.m4(0 hunks) -
src/secp256k1/build-aux/m4/m4_ax_cxx_compile_stdcxx.m4(0 hunks) -
src/secp256k1/configure.ac(0 hunks) -
src/univalue/Makefile.am(0 hunks) -
src/univalue/autogen.sh(0 hunks) -
src/univalue/build-aux/m4/.gitignore(0 hunks) -
src/univalue/configure.ac(0 hunks) -
src/wallet/rpcwallet.cpp(9 hunks)
💤 Files with no reviewable changes (36)
- cmake/module/FindQt.cmake
- build-aux/m4/ax_boost_thread.m4
- src/univalue/configure.ac
- libbitcoinconsensus.pc.in
- src/secp256k1/build-aux/m4/bitcoin_secp.m4
- build-aux/m4/ax_boost_system.m4
- build-aux/m4/ax_pthread.m4
- src/Makefile.am
- build-aux/m4/ax_check_preproc_flag.m4
- build-aux/m4/ax_boost_unit_test_framework.m4
- build-aux/m4/l_atomic.m4
- build-aux/m4/ax_check_link_flag.m4
- Makefile.am
- build-aux/m4/bitcoin_subdir_to_include.m4
- build-aux/m4/ax_gcc_func_attribute.m4
- src/secp256k1/build-aux/m4/ax_jni_include_dir.m4
- depends/patches/qt/memory_resource.patch
- build-aux/m4/ax_boost_filesystem.m4
- build-aux/m4/ax_subdirs_configure.m4
- src/univalue/build-aux/m4/.gitignore
- build-aux/m4/ax_boost_chrono.m4
- doc/man/Makefile.am
- autogen.sh
- build-aux/m4/ax_cxx_compile_stdcxx.m4
- build-aux/m4/bitcoin_qt.m4
- src/secp256k1/configure.ac
- src/univalue/autogen.sh
- src/univalue/Makefile.am
- configure.ac
- src/secp256k1/build-aux/m4/m4_ax_cxx_compile_stdcxx.m4
- build-aux/m4/ax_boost_program_options.m4
- src/secp256k1/build-aux/m4/ax_prog_cc_for_build.m4
- build-aux/m4/bitcoin_find_bdb48.m4
- build-aux/m4/ax_boost_base.m4
- src/secp256k1/Makefile.am
- build-aux/m4/ax_check_compile_flag.m4
🧰 Additional context used
🪛 GitHub Actions: build Dockerfile
Dockerfile
[error] 19-19: Failed to build Docker image: the command 'cd depends && NO_QT=true make HOST=$(uname -m)-linux-gnu -j$(nproc)' exited with code 2.
🪛 Ruff (0.14.0)
qa/rpc-tests/wallet.py
278-278: assert_equal may be undefined, or defined from star imports
(F405)
278-278: Decimal may be undefined, or defined from star imports
(F405)
292-292: assert_equal may be undefined, or defined from star imports
(F405)
292-292: Decimal may be undefined, or defined from star imports
(F405)
293-293: assert_equal may be undefined, or defined from star imports
(F405)
294-294: assert_equal may be undefined, or defined from star imports
(F405)
308-308: Decimal may be undefined, or defined from star imports
(F405)
318-318: assert_equal may be undefined, or defined from star imports
(F405)
318-318: Decimal may be undefined, or defined from star imports
(F405)
329-329: assert_equal may be undefined, or defined from star imports
(F405)
329-329: Decimal may be undefined, or defined from star imports
(F405)
337-337: assert_equal may be undefined, or defined from star imports
(F405)
337-337: Decimal may be undefined, or defined from star imports
(F405)
350-350: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
351-351: JSONRPCException may be undefined, or defined from star imports
(F405)
362-362: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
363-363: JSONRPCException may be undefined, or defined from star imports
(F405)
371-371: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
372-372: JSONRPCException may be undefined, or defined from star imports
(F405)
380-380: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
381-381: JSONRPCException may be undefined, or defined from star imports
(F405)
389-389: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
390-390: JSONRPCException may be undefined, or defined from star imports
(F405)
398-398: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
399-399: JSONRPCException may be undefined, or defined from star imports
(F405)
412-412: Do not assert False (python -O removes these calls), raise AssertionError()
Replace assert False
(B011)
413-413: JSONRPCException may be undefined, or defined from star imports
(F405)
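The two recurring Ruff findings above (F405 from star imports, B011 from `assert False`) are both avoidable with explicit imports and an explicit raise — a minimal sketch with illustrative names, not the actual test code:

```python
# F405 goes away when names come from explicit imports rather than
# `from test_framework.util import *`.
from decimal import Decimal


def check_balance(balance):
    # B011: `assert False, msg` disappears under `python -O`;
    # raising AssertionError directly keeps the check unconditional.
    if balance < Decimal("0"):
        raise AssertionError(f"negative balance: {balance}")
    return balance


try:
    check_balance(Decimal("-1"))
    raised = False
except AssertionError:
    raised = True
```

The same pattern applies to the flagged test lines: replace `assert False` with `raise AssertionError(...)` and list the framework helpers (`assert_equal`, `Decimal`, `JSONRPCException`) in an explicit import.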
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (5)
- GitHub Check: build-guix-arm64-apple-darwin
- GitHub Check: build-guix-aarch64-linux-gnu
- GitHub Check: build-guix-x86_64-w64-mingw32
- GitHub Check: build-guix-x86_64-apple-darwin
- GitHub Check: build-guix-x86_64-linux-gnu
contrib/debian/changelog
Outdated
  * Backport updated bitcoin-qt.desktop from upstream master
  * Add zmq dependency
- * Switch to Qt5 (breaks precise, but that was already broken by C++11)
+ * Switch to Qt6 (breaks precise, but that was already broken by C++11)
Do not modify historical changelog entries.
Editing a changelog entry from 2016 to say "Qt6" is incorrect and breaks the historical record. Debian changelog entries should be immutable. Instead, add a new entry at the top documenting the Qt5→Qt6 migration with the current date and version.
🤖 Prompt for AI Agents
contrib/debian/changelog around line 48: the review points out that a historical
2016 changelog entry was incorrectly edited to say "Switch to Qt6"; changelog
history must be immutable. Revert that line to its original historical content
and instead create a new changelog entry at the top of the file with the current
date/version describing the Qt5→Qt6 migration, including a clear version header,
urgency, and your changelog message and maintainer signature so the change is
recorded properly without altering past entries.
contrib/rpm/bitcoin.spec
Outdated
- When building the Qt package, default to Qt6 but allow building
- against Qt4
Do not modify historical changelog entries.
Similar to the Debian changelog, this RPM changelog entry appears to be historical and should not be edited. Add a new changelog entry at the top to document the Qt6 migration instead of altering past records.
🤖 Prompt for AI Agents
In contrib/rpm/bitcoin.spec around lines 432 to 433, do not alter the existing
historical changelog entry; instead add a new changelog entry at the top of the
%changelog documenting the Qt6 migration (date, your name/email, brief
description that Qt package now defaults to Qt6 with option to build against
Qt4, and any relevant RPM tag/version), leaving the original lines unchanged.
Ensure the new entry is the first entry in the %changelog block and follows the
surrounding changelog formatting used in the file.
$(eval $(1)_dependencies += $($(1)_$(host_arch)_$(host_os)_dependencies) $($(1)_$(host_os)_dependencies))
$(eval $(1)_all_dependencies:=$(call int_get_all_dependencies,$(1),$($($(1)_type)_native_toolchain) $($(1)_dependencies)))
# $(eval $(1)_all_dependencies:=$(call int_get_all_dependencies,$(1),$($(1)_dependencies)))
# $(eval $(1)_all_dependencies:=$(call int_get_all_dependencies,$(1),$($($(1)_type)_native_toolchain) $($(1)_dependencies)))
$(foreach dep,$($(1)_all_dependencies),$(eval $(1)_build_id_deps+=$(dep)-$($(dep)_version)-$($(dep)_recipe_hash)))
$(eval $(1)_build_id_long:=$(1)-$($(1)_version)-$($(1)_recipe_hash)-$(release_type) $($(1)_build_id_deps) $($($(1)_type)_id_string))
# $(eval $(1)_build_id_long:=$(1)-$($(1)_version)-$($(1)_recipe_hash)-$(release_type) $($(1)_build_id_deps) $($($(1)_type)_id))
$(eval $(1)_build_id:=$(shell echo -n "$($(1)_build_id_long)" | $(build_SHA256SUM) | cut -c-$(HASH_LENGTH)))
Build IDs no longer include dependencies ($(1)_all_dependencies never computed).
You commented out the lines that fill $(1)_all_dependencies, but still iterate over it. Result: cache keys ignore dependent packages and won’t invalidate when deps change.
Apply this to restore transitive dep aggregation:
define int_get_build_id
-$(eval $(1)_dependencies += $($(1)_$(host_arch)_$(host_os)_dependencies) $($(1)_$(host_os)_dependencies))
-# $(eval $(1)_all_dependencies:=$(call int_get_all_dependencies,$(1),$($(1)_dependencies)))
-# $(eval $(1)_all_dependencies:=$(call int_get_all_dependencies,$(1),$($($(1)_type)_native_toolchain) $($(1)_dependencies)))
+$(eval $(1)_dependencies += $($(1)_$(host_arch)_$(host_os)_dependencies) $($(1)_$(host_os)_dependencies))
+$(eval $(1)_all_dependencies:=$(call int_get_all_dependencies,$(1),$($($(1)_type)_native_toolchain) $($(1)_dependencies)))
$(foreach dep,$($(1)_all_dependencies),$(eval $(1)_build_id_deps+=$(dep)-$($(dep)_version)-$($(dep)_recipe_hash)))🤖 Prompt for AI Agents
In depends/funcs.mk around lines 83-89, the variable $(1)_all_dependencies is
never computed because the lines that populate it were commented out, so the
subsequent foreach over $(1)_all_dependencies produces empty dependency lists
and build IDs ignore transitive deps; restore and uncomment the appropriate
computation of $(1)_all_dependencies (prefer the variant that passes the
package's native toolchain plus its direct dependencies: call
int_get_all_dependencies with $($($(1)_type)_native_toolchain)
$($(1)_dependencies)), ensure this line appears before the foreach that builds
$(1)_build_id_deps, and remove or keep only one correct active definition so
transitive dependencies are included in the build ID calculation.
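The cache-key failure mode described above can be sketched in Python — a toy model of the depends build-id scheme, not the real system (package names and hashes below are made up). When transitive dependencies are folded into the id, bumping a dependency changes the hash; skip that aggregation step and the id goes stale:

```python
import hashlib

# Hypothetical recipe table: package -> (version, recipe_hash, direct deps).
recipes = {
    "qt": ("6.2", "aaa", ["zlib"]),
    "zlib": ("1.3", "bbb", []),
}

def all_dependencies(pkg):
    # Depth-first transitive closure, mirroring int_get_all_dependencies.
    seen, order = set(), []
    def walk(p):
        for dep in recipes[p][2]:
            if dep not in seen:
                seen.add(dep)
                walk(dep)
                order.append(dep)
    walk(pkg)
    return order

def build_id(pkg, hash_length=12):
    version, recipe_hash, _ = recipes[pkg]
    parts = [f"{pkg}-{version}-{recipe_hash}"]
    # If this aggregation step is skipped (as in the commented-out lines),
    # the id no longer changes when a dependency's version or hash changes.
    for dep in all_dependencies(pkg):
        dv, dh, _ = recipes[dep]
        parts.append(f"{dep}-{dv}-{dh}")
    return hashlib.sha256(" ".join(parts).encode()).hexdigest()[:hash_length]

id_before = build_id("qt")
recipes["zlib"] = ("1.4", "ddd", [])  # bump a dependency
id_after = build_id("qt")
```

With the transitive walk in place, `id_before != id_after`, which is exactly the invalidation behaviour the review asks to restore.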
depends/funcs.mk
Outdated
cmake -G "Unix Makefiles" \
    -DCMAKE_INSTALL_PREFIX:PATH="$$($($(1)_type)_prefix)" \
    -DCMAKE_AR=`which $$($(1)_ar)` \
    -DCMAKE_NM=`which $$($(1)_nm)` \
    -DCMAKE_RANLIB=`which $$($(1)_ranlib)` \
    -DCMAKE_INSTALL_LIBDIR=lib/ \
    -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
    -DCMAKE_VERBOSE_MAKEFILE:BOOL=$(V) \
    $$($(1)_config_opts)
CMake tool vars: avoid backticked which; guard when unset; fix verbose flag.
Using which $$($(1)_ar) fails if var is empty or already a path; also -DCMAKE_VERBOSE_MAKEFILE:BOOL=$(V) passes an empty value.
Apply:
- cmake -G "Unix Makefiles" \
- -DCMAKE_INSTALL_PREFIX:PATH="$$($($(1)_type)_prefix)" \
- -DCMAKE_AR=`which $$($(1)_ar)` \
- -DCMAKE_NM=`which $$($(1)_nm)` \
- -DCMAKE_RANLIB=`which $$($(1)_ranlib)` \
- -DCMAKE_INSTALL_LIBDIR=lib/ \
- -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
- -DCMAKE_VERBOSE_MAKEFILE:BOOL=$(V) \
- $$($(1)_config_opts)
+ cmake -G "Unix Makefiles" \
+ -DCMAKE_INSTALL_PREFIX:PATH="$$($($(1)_type)_prefix)" \
+ -DCMAKE_INSTALL_LIBDIR=lib \
+ -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
+ -DCMAKE_VERBOSE_MAKEFILE:BOOL=$(if $(strip $(V)),ON,OFF) \
+ $$($(1)_config_opts)
+
+# Only set tool paths if defined
+ifneq ($($(1)_ar),)
+$(1)_cmake += -DCMAKE_AR="$$($(1)_ar)"
+endif
+ifneq ($($(1)_nm),)
+$(1)_cmake += -DCMAKE_NM="$$($(1)_nm)"
+endif
+ifneq ($($(1)_ranlib),)
+$(1)_cmake += -DCMAKE_RANLIB="$$($(1)_ranlib)"
+endifAlso applies to: 265-267
🤖 Prompt for AI Agents
In depends/funcs.mk around lines 263 to 271 (also apply same change to 265-267):
the recipe uses backticked `which $$($(1)_ar)`/`which $$($(1)_nm)`/`which
$$($(1)_ranlib)` which fails if the variable is empty or already a full path,
and it passes an empty value for CMAKE_VERBOSE_MAKEFILE when V is not set;
replace the backticks with guarded resolution that uses the variable value if it
already contains a path or falls back to a safe which call, e.g. compute a
helper like AR_PATH := $(if $(filter %/%, $($(1)_ar)),$($(1)_ar),$(or $(shell
which $($(1)_ar) 2>/dev/null),)) and use -DCMAKE_AR="$(AR_PATH)" (do similarly
for NM and RANLIB), and change the verbose flag to
-DCMAKE_VERBOSE_MAKEFILE:BOOL=$(if $(V),ON,OFF) so it is explicitly ON or OFF.
depends/hosts/darwin.mk
Outdated
darwin_CXXFLAGS += -mlinker-version=$(LLD_VERSION) -nostdinc++ -isystem$(OSX_SDK)/usr/include/c++/v1 -fuse-ld=lld
darwin_LDFLAGS += -Wl,-syslibroot,$(OSX_SDK) -Wl,-platform_version,macos,$(OSX_MIN_VERSION),$(OSX_SDK_VERSION) -Wl,-no_adhoc_codesign -fuse-ld=lld
🧩 Analysis chain
Avoid duplicating platform_version linker flag in cross-compilation path.
Line 65 establishes darwin_LDFLAGS with -Wl,-platform_version,macos,$(OSX_MIN_VERSION),$(OSX_SDK_VERSION). Line 70 appends the same flag again via +=, resulting in duplicate platform_version specifications when cross-compiling for Darwin from non-Darwin hosts. This duplication is likely unintentional.
Refactor to eliminate the duplication:
-darwin_LDFLAGS += -Wl,-syslibroot,$(OSX_SDK) -Wl,-platform_version,macos,$(OSX_MIN_VERSION),$(OSX_SDK_VERSION) -Wl,-no_adhoc_codesign -fuse-ld=lld
+darwin_LDFLAGS += -Wl,-syslibroot,$(OSX_SDK) -Wl,-no_adhoc_codesign -fuse-ld=lld

Also, consider documenting the new flags being added to CXXFLAGS and LDFLAGS (lines 27–52) for maintainability. The flag explanations currently don't cover -nostdinc++, -isystem, or -syslibroot.
Run this to confirm the flags are not duplicated when cross-compiling:
🏁 Script executed:
#!/bin/bash
# Extract final LDFLAGS when cross-compiling (simulating build_os != darwin)
make_output=$(make -n -f depends/hosts/darwin.mk | grep -E "darwin_LDFLAGS|darwin_CXXFLAGS" 2>/dev/null || echo "Unable to extract; verify manually via: grep -n 'darwin_LDFLAGS\|darwin_CXXFLAGS' depends/hosts/darwin.mk")
echo "$make_output"

Length of output: 148
🏁 Script executed:
#!/bin/bash
# Extract lines around 65 and 70 to verify the duplication claim
echo "=== Lines 60-75 of depends/hosts/darwin.mk ==="
sed -n '60,75p' depends/hosts/darwin.mk | cat -n

Length of output: 961
Avoid duplicating platform_version linker flag in cross-compilation path.
Line 65 establishes darwin_LDFLAGS=-Wl,-platform_version,macos,$(OSX_MIN_VERSION),$(OSX_SDK_VERSION). Line 70 appends this exact flag again via += when cross-compiling for Darwin from non-Darwin hosts, resulting in duplicate platform_version specifications.
Refactor to eliminate the duplication:
-darwin_LDFLAGS += -Wl,-syslibroot,$(OSX_SDK) -Wl,-platform_version,macos,$(OSX_MIN_VERSION),$(OSX_SDK_VERSION) -Wl,-no_adhoc_codesign -fuse-ld=lld
+darwin_LDFLAGS += -Wl,-syslibroot,$(OSX_SDK) -Wl,-no_adhoc_codesign -fuse-ld=lld

Also consider documenting the new flags being added to CXXFLAGS and LDFLAGS (lines 27–52) for maintainability. The flag explanations currently don't cover -nostdinc++, -isystem, or -syslibroot.
src/qt/automintmodel.cpp
Outdated
 std::vector<std::pair<CAmount, std::vector<COutput>>> valueAndUTXOs;
 pwalletMain->AvailableCoinsForLMint(valueAndUTXOs, &coinControl);

-for (auto const &valueAndUTXO : valueAndUTXOs) {
-    credit += valueAndUTXO.first;
+for (const auto& p : valueAndUTXOs) {
+    credit += p.first;
 }
Use the instance wallet, not pwalletMain
This path uses pwalletMain while the rest of the method uses the passed-in wallet. In multi-wallet or null-global scenarios this can be wrong or crash.
- std::vector<std::pair<CAmount, std::vector<COutput>>> valueAndUTXOs;
- pwalletMain->AvailableCoinsForLMint(valueAndUTXOs, &coinControl);
+ std::vector<std::pair<CAmount, std::vector<COutput>>> valueAndUTXOs;
+ wallet->AvailableCoinsForLMint(valueAndUTXOs, &coinControl);
🤖 Prompt for AI Agents
In src/qt/automintmodel.cpp around lines 104 to 109, the code calls
pwalletMain->AvailableCoinsForLMint(...) while the rest of the function uses the
passed-in wallet instance; replace the global pwalletMain with the provided
wallet object (e.g. wallet->AvailableCoinsForLMint(...)) so the
multi-wallet/null-global case uses the correct instance, and ensure you
null-check the wallet pointer before calling its method if the function can be
invoked with a null wallet.
src/qt/guiutil.cpp
Outdated
return dateTimeStr(QDateTime::fromSecsSinceEpoch((qint32)nTime));
}
Remove 32‑bit cast to avoid Y2038 truncation
QDateTime::fromSecsSinceEpoch accepts 64‑bit; casting to qint32 truncates future timestamps.
- return dateTimeStr(QDateTime::fromSecsSinceEpoch((qint32)nTime));
+ return dateTimeStr(QDateTime::fromSecsSinceEpoch(nTime));
🤖 Prompt for AI Agents
In src/qt/guiutil.cpp around lines 100 to 101, remove the 32-bit cast that
truncates timestamps: stop casting nTime to qint32 and pass a 64-bit value to
QDateTime::fromSecsSinceEpoch (use qint64 or the variable's existing 64-bit
type, or static_cast<qint64>(nTime) if needed) so future timestamps are
preserved and Y2038 truncation is avoided.
 int FreedesktopImage::metaType()
 {
-    return qDBusRegisterMetaType<FreedesktopImage>();
+    return qDBusRegisterMetaType<FreedesktopImage>().id();
 }
Fix metaType(): qDBusRegisterMetaType returns int; .id() causes a compile error
Return the Qt metatype id without calling .id() on an int. Also ensure DBus marshalling is registered once.
Apply:
int FreedesktopImage::metaType()
{
- return qDBusRegisterMetaType<FreedesktopImage>().id();
+ // Register D-Bus marshalling and return the Qt metatype id
+ qDBusRegisterMetaType<FreedesktopImage>();
+#if QT_VERSION >= QT_VERSION_CHECK(6, 0, 0)
+ return QMetaType::fromType<FreedesktopImage>().id();
+#else
+ return qMetaTypeId<FreedesktopImage>();
+#endif
}
🤖 Prompt for AI Agents
In src/qt/notificator.cpp around lines 149-152, change metaType() so it does not
call .id() on the int returned by qDBusRegisterMetaType; instead register once
and return that int. Implement it as a function-local static int (e.g. static
int id = qDBusRegisterMetaType<FreedesktopImage>();) and return id to ensure
DBus marshalling is registered only once and avoid the compile error.
src/qt/splashscreen.cpp
Outdated
 #include <QApplication>
 #include <QCloseEvent>
-#include <QDesktopWidget>
+#include <QScreen>
🧩 Analysis chain
Verify if the QScreen include is needed.
The QScreen header is included as part of the Qt6 migration, but there's no visible usage of QScreen in the active code. The commented-out code at line 180 would need QScreen for QApplication::desktop()->screenGeometry() replacement, but since that code is inactive, this include may be unnecessary.
🏁 Script executed:
#!/bin/bash
# Check if QScreen is used anywhere in the active code
rg -n "QScreen" src/qt/splashscreen.cpp -A2 -B2

Length of output: 127
I need to search more comprehensively to determine if QScreen is actually used in the file.
---
🏁 Script executed:
```shell
#!/bin/bash
# Search for all QScreen usage in the entire file (not just context around includes)
echo "=== All QScreen references ==="
rg "QScreen" src/qt/splashscreen.cpp
echo ""
echo "=== Code around line 180 (commented-out code) ==="
sed -n '175,185p' src/qt/splashscreen.cpp
Length of output: 492
Remove the unused QScreen include at line 24.
The search confirms QScreen is only present as the include statement and has zero usage in the active code. The commented-out code at line 180 does not use QScreen—it still references the legacy Qt5 API (QApplication::desktop()->screenGeometry()). Since neither the active code nor the inactive code relies on it, the include should be removed.
🤖 Prompt for AI Agents
In src/qt/splashscreen.cpp around line 24, the #include <QScreen> is unused and
should be removed; delete that include line so only necessary headers remain
(confirm no active code or commented-out sections rely on QScreen) and run a
build/compile to ensure no missing dependencies.
d8dcc6e to
311f8dc
Compare
Actionable comments posted: 11
♻️ Duplicate comments (1)
src/wallet/rpcwallet.cpp (1)
912-918: Remove Spark-balance check in sendtoaddress (JSON mode). Transparent funds are used to mint for Spark recipients. This wrongly requires Spark balance and can fail even when transparent balance is sufficient. Delete this block and rely on the transparent-balance checks and the mint path that follows.
Apply this diff:
-    // Spark checking
-    if (!sparkRecipients.empty()) {
-        EnsureSparkWalletIsAvailable();
-        CAmount sparkBalance = pwallet->sparkWallet->getAvailableBalance();
-        if (sparkBalance < totalSparkAmount) {
-            throw JSONRPCError(RPC_WALLET_INSUFFICIENT_FUNDS, "Insufficient Spark funds");
-        }
-    }
+    // No Spark-balance check here: Spark recipients are funded by transparent mints.
🧹 Nitpick comments (26)
src/wallet/rpcwallet.cpp (4)
813-815: Avoid pre-validating fee subtraction with a transparent-sized fee estimate for Spark outputs. ValidateFeeSubtractionAmount is called before address-type resolution; the 230-byte estimate is misleading for Spark. Remove the early call and validate only for transparent/RAP outputs.
Apply these diffs:
Remove the generic pre-check:
-    // Validate that amount is reasonable when fee is being subtracted
-    ValidateFeeSubtractionAmount(nAmount, subtractFee, strAddress);

Validate when adding transparent recipients:
     CScript scriptPubKey = GetScriptForDestination(address.Get());
-    CRecipient recipient = {scriptPubKey, nAmount, subtractFee, {}, {}};
+    // Transparent-only fee subtraction sanity check
+    ValidateFeeSubtractionAmount(nAmount, subtractFee, resolvedAddress);
+    CRecipient recipient = {scriptPubKey, nAmount, subtractFee, {}, {}};

Also applies to: 895-899
688-690: Reduce PII in logs (addresses, memos, amounts).LogPrintf includes full addresses/memos. Prefer redaction or debug-only logs to avoid leaking sensitive info.
Also applies to: 705-706, 711-712, 721-723, 724-728, 1041-1044, 1079-1081
4744-4749: Brittle Spark detection in sendsparkmany.Using string prefixes (“st”, “@”) to infer Spark is fragile. Reuse spark::Address decode (as done elsewhere) to decide if the memo should be attached.
Example change:
-    bool isSparkAddress = false;
-    if (name_.length() >= 2 && name_.substr(0, 2) == "st") {
-        isSparkAddress = true;
-    } else if (!name_.empty() && name_[0] == '@') {
-        isSparkAddress = true; // Spark name
-    }
+    bool isSparkAddress = false;
+    try {
+        spark::Address tmp(spark::Params::get_default());
+        std::string resolved = name_;
+        if (!name_.empty() && name_[0] == '@') {
+            // keep consistent with earlier resolution
+            resolved = name_;
+        }
+        tmp.decode(resolved);
+        isSparkAddress = true;
+    } catch (const std::exception&) {
+        isSparkAddress = false;
+    }
1934-1944: Avoid mutating wtx.mapValue for comment detection in sendmany converter.Accessing wtx.mapValue["comment"] inserts an empty key. Use a local std::string to carry the comment instead of probing the map.
qa/rpc-tests/spendspark.py (3)
32-39: Preserve exception context when asserting activation failure (ruff B904).Use exception chaining:
-        except JSONRPCException as e:
-            if e.error.get('code') == -4:
-                raise AssertionError("Spark is not activated even after generating blocks.")
-            else:
-                raise
+        except JSONRPCException as e:
+            if e.error.get('code') == -4:
+                raise AssertionError("Spark is not activated even after generating blocks.") from e
+            raise
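The `from e` chaining recommended by B904 can be seen end-to-end in a self-contained sketch (`JSONRPCException` is stubbed here; in the tests it comes from test_framework.util):

```python
# Minimal sketch: re-raising with `from e` keeps the original JSON-RPC
# error visible as __cause__ instead of discarding it.
class JSONRPCException(Exception):  # stand-in for the framework class
    def __init__(self, error):
        super().__init__(error.get("message", ""))
        self.error = error


def activate():
    raise JSONRPCException({"code": -4, "message": "Spark not activated"})


try:
    try:
        activate()
    except JSONRPCException as e:
        if e.error.get("code") == -4:
            raise AssertionError(
                "Spark is not activated even after generating blocks.") from e
        raise
except AssertionError as err:
    # The chained cause carries the original RPC error details.
    cause_code = err.__cause__.error["code"]
```

Without `from e`, a traceback reader sees only the AssertionError; with it, the `-4` RPC error survives as the cause.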
388-389: Make Decimal comparison tolerant.threshold=0 enforces exact equality and can be brittle. Prefer a small tolerance.
-    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=0e-20):
+    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=Decimal('1e-8')):
3-3: Remove unused import or use it.enable_mocktime is imported but never used.
-from test_framework.util import JSONRPCException, enable_mocktime
+from test_framework.util import JSONRPCException

Or call enable_mocktime() in setUp if needed for determinism.
qa/rpc-tests/sendsparkmany.py (4)
35-39: Chain the exception when asserting activation failure.

-        except JSONRPCException as e:
-            if e.error.get('code') == -4:
-                raise AssertionError("Spark is not activated even after generating blocks.")
-            else:
-                raise
+        except JSONRPCException as e:
+            if e.error.get('code') == -4:
+                raise AssertionError("Spark is not activated even after generating blocks.") from e
+            raise
6-6: Use CapWords for class name (PEP 8).Rename sendsparkmanyTest → SendSparkManyTest for consistency with other tests.
-class sendsparkmanyTest(BitcoinTestFramework):
+class SendSparkManyTest(BitcoinTestFramework):

Also update the main guard accordingly.
70-86: Test names mention “transparent” but exercise Spark addresses.Method names like test_sendsparkmany_1transparent_* send to Spark addresses (per docstrings and calls). Rename to 1spark, 2spark, etc., or adjust docstrings to match intent.
Also applies to: 123-139, 168-190, 191-214
388-389: Loosen float-equality tolerance.Use a small Decimal epsilon instead of 0.
-    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=0e-20):
+    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=Decimal('1e-8')):

qa/rpc-tests/sendspark.py (2)
35-39: Chain exceptions for activation failure.

-        except JSONRPCException as e:
-            if e.error.get('code') == -4:
-                raise AssertionError("Spark is not activated even after generating blocks.")
-            else:
-                raise
+        except JSONRPCException as e:
+            if e.error.get('code') == -4:
+                raise AssertionError("Spark is not activated even after generating blocks.") from e
+            raise
388-389: Make comparisons tolerant.

-    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=0e-20):
+    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=Decimal('1e-8')):

qa/rpc-tests/sendtoaddress.py (2)
324-341: “All parameters” for Spark but no memo provided.If memos are supported for Spark via sendtoaddress, add a memo field to truly cover “all” parameters; otherwise update the docstring.
507-511: Use Decimal epsilon for equality.

-    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=0e-20):
+    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=Decimal('1e-8')):

qa/rpc-tests/sendtransparent.py (5)
29-37: Preserve original exception context.Chain the AssertionError to the caught JSONRPCException.
-        except JSONRPCException as e:
-            if e.error.get('code') == -4:
-                raise AssertionError("Spark is not activated even after generating blocks.")
-            else:
-                raise
+        except JSONRPCException as e:
+            if e.error.get('code') == -4:
+                raise AssertionError("Spark is not activated even after generating blocks.") from e
+            raise
111-114: Make multi‑recipient checks order‑independent.details order isn’t guaranteed. Compare by address to avoid flakiness.
-        for i, detail in enumerate(details):
-            self.assert_equal_float(-detail['amount'], expected_amounts[i])
+        expected_by_addr = {address1: expected_amounts[0], address2: expected_amounts[1]}
+        assert len(details) == len(expected_by_addr)
+        for d in details:
+            self.assert_equal_float(-d['amount'], expected_by_addr[d['address']])
115-208: Dedent bodies for readability (no functional change).Several test bodies are indented one level too deep; align with surrounding tests.
266-268: Use a non‑zero epsilon.threshold=0 makes Decimal comparisons brittle.
-    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=0e-20):
+    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=Decimal('1e-8')):
         assert abs(actual - expected) <= threshold, f"Values {actual} and {expected} differ by more than {threshold}"
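As a standalone check, the epsilon version of the helper passes for sub-satoshi rounding noise and still fails for real discrepancies (the values here are illustrative):

```python
from decimal import Decimal


def assert_equal_float(actual, expected, threshold=Decimal("1e-8")):
    # threshold=0 would demand exact Decimal equality; a small epsilon
    # absorbs rounding in fee arithmetic without hiding real errors.
    assert abs(actual - expected) <= threshold, (
        f"Values {actual} and {expected} differ by more than {threshold}"
    )


# One satoshi-scale discrepancy: within the 1e-8 epsilon, so it passes.
assert_equal_float(Decimal("1.00000000"), Decimal("1.00000001"))

# A real discrepancy still fails loudly.
try:
    assert_equal_float(Decimal("1.0"), Decimal("1.1"))
    failed = False
except AssertionError:
    failed = True
```

Using `Decimal("1e-8")` rather than a float keeps the comparison exact in decimal arithmetic.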
12-37: DRY: share activate_spark across tests.activate_spark duplicates across files; move to a common helper to reduce drift.
qa/rpc-tests/sendmany.py (6)
29-37: Preserve original exception context.Chain AssertionError to JSONRPCException.
-        except JSONRPCException as e:
-            if e.error.get('code') == -4:
-                raise AssertionError("Spark is not activated even after generating blocks.")
-            else:
-                raise
+        except JSONRPCException as e:
+            if e.error.get('code') == -4:
+                raise AssertionError("Spark is not activated even after generating blocks.") from e
+            raise
115-118: Make multi‑recipient checks order‑independent (transparent).Avoid depending on details order.
-        for i, detail in enumerate(details):
-            self.assert_equal_float(-detail['amount'], expected_amounts[i])
+        expected_by_addr = {address1: expected_amounts[0], address2: expected_amounts[1]}
+        assert len(details) == len(expected_by_addr)
+        for d in details:
+            self.assert_equal_float(-d['amount'], expected_by_addr[d['address']])
233-236: Make multi‑recipient checks order‑independent (spark).Same issue for spark recipients.
-        for i, detail in enumerate(details):
-            self.assert_equal_float(-detail['amount'], expected_amounts[i])
+        expected_by_addr = {spark_address1: expected_amounts[0], spark_address2: expected_amounts[1]}
+        assert len(details) == len(expected_by_addr)
+        for d in details:
+            self.assert_equal_float(-d['amount'], expected_by_addr[d['address']])
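The keyed comparison suggested for both the transparent and spark cases can be exercised in isolation (addresses and amounts below are made up):

```python
from decimal import Decimal

# gettransaction 'details' may come back in any order.
details = [
    {"address": "addr2", "amount": Decimal("-2")},
    {"address": "addr1", "amount": Decimal("-1")},
]
expected_by_addr = {"addr1": Decimal("1"), "addr2": Decimal("2")}

assert len(details) == len(expected_by_addr)
for d in details:
    # Keyed by address, so RPC output ordering cannot cause flakes.
    assert -d["amount"] == expected_by_addr[d["address"]]

# Reversing the order changes nothing, unlike an index-based zip.
for d in reversed(details):
    assert -d["amount"] == expected_by_addr[d["address"]]
```

An `enumerate`-based comparison would pass or fail depending on the order the node happens to emit, which is exactly the flakiness the review flags.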
317-319: Use a non‑zero epsilon.Make Decimal comparisons tolerant.
-    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=0e-20):
+    def assert_equal_float(self, actual: Decimal, expected: Decimal, threshold=Decimal('1e-8')):
         assert abs(actual - expected) <= threshold, f"Values {actual} and {expected} differ by more than {threshold}"
212-223: Reduce duplication (1spark_all duplicates 1spark_with_fee_subtraction).These tests exercise the same path; consider parametrizing or removing one to speed up CI.
12-37: DRY: share activate_spark across test modules.Extract a shared helper to avoid divergence.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
.github/workflows/ci-master.ymlis excluded by!**/*.yml
📒 Files selected for processing (45)
- Dockerfile (1 hunks)
- Makefile.am (0 hunks)
- README.md (7 hunks)
- autogen.sh (0 hunks)
- build-aux/m4/ax_boost_base.m4 (0 hunks)
- build-aux/m4/ax_boost_chrono.m4 (0 hunks)
- build-aux/m4/ax_boost_filesystem.m4 (0 hunks)
- build-aux/m4/ax_boost_program_options.m4 (0 hunks)
- build-aux/m4/ax_boost_system.m4 (0 hunks)
- build-aux/m4/ax_boost_thread.m4 (0 hunks)
- build-aux/m4/ax_boost_unit_test_framework.m4 (0 hunks)
- build-aux/m4/ax_check_compile_flag.m4 (0 hunks)
- build-aux/m4/ax_check_link_flag.m4 (0 hunks)
- build-aux/m4/ax_check_preproc_flag.m4 (0 hunks)
- build-aux/m4/ax_cxx_compile_stdcxx.m4 (0 hunks)
- build-aux/m4/ax_gcc_func_attribute.m4 (0 hunks)
- build-aux/m4/ax_pthread.m4 (0 hunks)
- build-aux/m4/ax_subdirs_configure.m4 (0 hunks)
- build-aux/m4/bitcoin_find_bdb48.m4 (0 hunks)
- build-aux/m4/bitcoin_qt.m4 (0 hunks)
- build-aux/m4/bitcoin_subdir_to_include.m4 (0 hunks)
- build-aux/m4/l_atomic.m4 (0 hunks)
- configure.ac (0 hunks)
- doc/build-unix.md (5 hunks)
- doc/man/Makefile.am (0 hunks)
- libbitcoinconsensus.pc.in (0 hunks)
- qa/pull-tester/rpc-tests.py (1 hunks)
- qa/rpc-tests/sendmany.py (1 hunks)
- qa/rpc-tests/sendspark.py (1 hunks)
- qa/rpc-tests/sendsparkmany.py (1 hunks)
- qa/rpc-tests/sendtoaddress.py (1 hunks)
- qa/rpc-tests/sendtransparent.py (1 hunks)
- qa/rpc-tests/spendspark.py (1 hunks)
- src/Makefile.am (0 hunks)
- src/secp256k1/Makefile.am (0 hunks)
- src/secp256k1/build-aux/m4/ax_jni_include_dir.m4 (0 hunks)
- src/secp256k1/build-aux/m4/ax_prog_cc_for_build.m4 (0 hunks)
- src/secp256k1/build-aux/m4/bitcoin_secp.m4 (0 hunks)
- src/secp256k1/build-aux/m4/m4_ax_cxx_compile_stdcxx.m4 (0 hunks)
- src/secp256k1/configure.ac (0 hunks)
- src/univalue/Makefile.am (0 hunks)
- src/univalue/autogen.sh (0 hunks)
- src/univalue/build-aux/m4/.gitignore (0 hunks)
- src/univalue/configure.ac (0 hunks)
- src/wallet/rpcwallet.cpp (13 hunks)
💤 Files with no reviewable changes (34)
- src/univalue/configure.ac
- build-aux/m4/ax_boost_chrono.m4
- build-aux/m4/ax_cxx_compile_stdcxx.m4
- build-aux/m4/ax_subdirs_configure.m4
- build-aux/m4/ax_check_link_flag.m4
- doc/man/Makefile.am
- build-aux/m4/bitcoin_find_bdb48.m4
- autogen.sh
- Makefile.am
- src/secp256k1/build-aux/m4/ax_prog_cc_for_build.m4
- build-aux/m4/ax_boost_base.m4
- src/secp256k1/build-aux/m4/m4_ax_cxx_compile_stdcxx.m4
- build-aux/m4/l_atomic.m4
- src/secp256k1/build-aux/m4/bitcoin_secp.m4
- build-aux/m4/ax_boost_filesystem.m4
- build-aux/m4/ax_boost_thread.m4
- build-aux/m4/ax_pthread.m4
- src/univalue/build-aux/m4/.gitignore
- build-aux/m4/ax_boost_program_options.m4
- build-aux/m4/ax_check_preproc_flag.m4
- build-aux/m4/ax_check_compile_flag.m4
- libbitcoinconsensus.pc.in
- src/univalue/Makefile.am
- src/secp256k1/Makefile.am
- build-aux/m4/ax_gcc_func_attribute.m4
- src/univalue/autogen.sh
- build-aux/m4/ax_boost_system.m4
- src/secp256k1/build-aux/m4/ax_jni_include_dir.m4
- build-aux/m4/bitcoin_qt.m4
- src/Makefile.am
- configure.ac
- build-aux/m4/bitcoin_subdir_to_include.m4
- build-aux/m4/ax_boost_unit_test_framework.m4
- src/secp256k1/configure.ac
🚧 Files skipped from review as they are similar to previous changes (3)
- Dockerfile
- doc/build-unix.md
- README.md
🧰 Additional context used
🧬 Code graph analysis (7)
qa/rpc-tests/spendspark.py (2)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)
- qa/rpc-tests/test_framework/util.py (1): enable_mocktime (56-61)

qa/rpc-tests/sendtoaddress.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/sendmany.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/sendspark.py (1)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)

src/wallet/rpcwallet.cpp (6)
- src/wallet/wallet.cpp (4): GetMinimumFee (5185-5189), GetMinimumFee (5185-5185), GetMinimumFee (5191-5209), GetMinimumFee (5191-5191)
- src/validation.cpp (1): mempool (123-123)
- src/rpc/server.cpp (4): AmountFromValue (123-133), AmountFromValue (123-123), ValueFromAmount (135-143), ValueFromAmount (135-135)
- src/sparkname.h (1): GetInstance (185-185)
- src/spark/state.cpp (6): GetNetworkType (93-102), GetNetworkType (93-93), IsSparkAllowed (82-86), IsSparkAllowed (82-82), IsSparkAllowed (88-91), IsSparkAllowed (88-88)
- src/script/standard.cpp (2): GetScriptForDestination (369-375), GetScriptForDestination (369-369)

qa/rpc-tests/sendtransparent.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/sendsparkmany.py (1)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)
🪛 Ruff (0.14.1)
qa/rpc-tests/spendspark.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendtoaddress.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendmany.py
34-34: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
34-34: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendspark.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendtransparent.py
34-34: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
34-34: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendsparkmany.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (9)
- GitHub Check: build-guix-arm64-apple-darwin
- GitHub Check: build-guix-x86_64-apple-darwin
- GitHub Check: build-guix-x86_64-w64-mingw32
- GitHub Check: build-mac-cmake
- GitHub Check: build-guix-aarch64-linux-gnu
- GitHub Check: build-guix-x86_64-linux-gnu
- GitHub Check: build-linux-cmake
- GitHub Check: build-windows-cmake
- GitHub Check: build
🔇 Additional comments (4)
src/wallet/rpcwallet.cpp (1)
683-686: Review comment is incorrect. The enum CoinType does not have TRANSPARENT_ONLY or ONLY_TRANSPARENT values. The available values are: ALL_COINS, ONLY_DENOMINATED, ONLY_NOT1000IFMN, ONLY_NONDENOMINATED_NOT1000IFMN, ONLY_1000, ONLY_PRIVATESEND_COLLATERAL, ONLY_MINTS, and WITH_MINTS.
ALL_COINS is semantically correct for Spark minting because it excludes all mint types (Zerocoin, Sigma, Lelantus, Spark), leaving only transparent funds, exactly what the PR requires. The code at both locations uses the appropriate value. Likely an incorrect or invalid review comment.
qa/rpc-tests/sendtoaddress.py (1)
182-184: Confirm expected order of concatenated “to” (comment_to) values. The assertion expects reversed order. Is this intentional given current wallet behavior? If not, swap order or update wallet logic/tests.
qa/pull-tester/rpc-tests.py (1)
193-200: Added Spark/Unified send RPC tests to default suite — good coverage expansion. Please ensure CI cache accommodates extra blocks generated by new tests; if runtime balloons, consider gating heavy Spark tests under -extended.
qa/rpc-tests/sendtransparent.py (1)
133-136: Confirm concatenated comment behavior matches node. Tests assume tx['comment'] and tx['to'] aggregate per‑output fields joined by '; '. Verify the RPC implements this; otherwise gate or adapt tests.
Also applies to: 156-158, 206-207, 249-250
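If the wallet does aggregate per-output metadata as these tests assume, the expectation reduces to a join like the following (a sketch of the assumption under test, not the node's actual code; names are hypothetical):

```python
def aggregate_metadata(details, key, sep='; '):
    """Join one per-output metadata field (e.g. 'comment' or 'to')
    across outputs, skipping empty values, in output order."""
    return sep.join(d[key] for d in details if d.get(key))
```

For example, `aggregate_metadata([{'to': 'alice'}, {'to': 'bob'}], 'to')` yields `'alice; bob'`; a reversed-order assertion would instead expect `'bob; alice'`.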
```python
def activate_spark(self):
    HARDCODED_ACTIVATION_HEIGHT = 401
    current_height = self.nodes[0].getblockcount()
    if current_height < HARDCODED_ACTIVATION_HEIGHT:
        self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - current_height)
        self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - current_height} blocks to activate Spark.")

    blockchain_info = self.nodes[0].getblockchaininfo()
    activation_height = HARDCODED_ACTIVATION_HEIGHT
    if 'upgrades' in blockchain_info and 'spark' in blockchain_info['upgrades']:
        activation_height = blockchain_info['upgrades']['spark']['activationheight']
        self.log.info(f"Dynamic Spark activation height: {activation_height}")

    if current_height < activation_height:
        self.nodes[0].generate(activation_height - current_height)
        self.log.info(f"Generated {activation_height - current_height} blocks to meet dynamic activation height.")
```
🛠️ Refactor suggestion | 🟠 Major
Fix activation height overshoot (use fresh height each time).
Same stale height issue as in sendtransparent.py.
```diff
-    def activate_spark(self):
-        HARDCODED_ACTIVATION_HEIGHT = 401
-        current_height = self.nodes[0].getblockcount()
-        if current_height < HARDCODED_ACTIVATION_HEIGHT:
-            self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - current_height)
-            self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - current_height} blocks to activate Spark.")
+    def activate_spark(self):
+        HARDCODED_ACTIVATION_HEIGHT = 401
+        height = self.nodes[0].getblockcount()
+        if height < HARDCODED_ACTIVATION_HEIGHT:
+            self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - height)
+            self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - height} blocks to activate Spark.")
+        height = self.nodes[0].getblockcount()
@@
-        if current_height < activation_height:
-            self.nodes[0].generate(activation_height - current_height)
-            self.log.info(f"Generated {activation_height - current_height} blocks to meet dynamic activation height.")
+        if height < activation_height:
+            self.nodes[0].generate(activation_height - height)
+            self.log.info(f"Generated {activation_height - height} blocks to meet dynamic activation height.")
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def activate_spark(self):
    HARDCODED_ACTIVATION_HEIGHT = 401
    height = self.nodes[0].getblockcount()
    if height < HARDCODED_ACTIVATION_HEIGHT:
        self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - height)
        self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - height} blocks to activate Spark.")
    height = self.nodes[0].getblockcount()
    blockchain_info = self.nodes[0].getblockchaininfo()
    activation_height = HARDCODED_ACTIVATION_HEIGHT
    if 'upgrades' in blockchain_info and 'spark' in blockchain_info['upgrades']:
        activation_height = blockchain_info['upgrades']['spark']['activationheight']
        self.log.info(f"Dynamic Spark activation height: {activation_height}")
    if height < activation_height:
        self.nodes[0].generate(activation_height - height)
        self.log.info(f"Generated {activation_height - height} blocks to meet dynamic activation height.")
```
🤖 Prompt for AI Agents
In qa/rpc-tests/sendmany.py around lines 12 to 28, the code uses the initial
current_height for both the hardcoded activation generation and the later
comparison against activation_height, which can cause overshooting; recalculate
current_height after any generate() call (i.e., fetch getblockcount() again
before computing and generating blocks for the dynamic activation_height) so the
second generate() uses the fresh height and only generates the exact number of
blocks needed.
```python
def activate_spark(self):
    HARDCODED_ACTIVATION_HEIGHT = 401
    current_height = self.nodes[0].getblockcount()
    if current_height < HARDCODED_ACTIVATION_HEIGHT:
        self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - current_height)
        self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - current_height} blocks to activate Spark.")

    blockchain_info = self.nodes[0].getblockchaininfo()
    activation_height = HARDCODED_ACTIVATION_HEIGHT
    if 'upgrades' in blockchain_info and 'spark' in blockchain_info['upgrades']:
        activation_height = blockchain_info['upgrades']['spark']['activationheight']
        self.log.info(f"Dynamic Spark activation height: {activation_height}")

    if current_height < activation_height:
        self.nodes[0].generate(activation_height - current_height)
        self.log.info(f"Generated {activation_height - current_height} blocks to meet dynamic activation height.")
```
🛠️ Refactor suggestion | 🟠 Major
Fix activation height overshoot (use fresh height each time).
current_height is not refreshed after the first generate; the second generate can overshoot. Recompute height before comparing to activation_height.
Apply:
```diff
 def activate_spark(self):
     HARDCODED_ACTIVATION_HEIGHT = 401
-    current_height = self.nodes[0].getblockcount()
-    if current_height < HARDCODED_ACTIVATION_HEIGHT:
-        self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - current_height)
-        self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - current_height} blocks to activate Spark.")
+    height = self.nodes[0].getblockcount()
+    if height < HARDCODED_ACTIVATION_HEIGHT:
+        self.nodes[0].generate(HARDCODED_ACTIVATION_HEIGHT - height)
+        self.log.info(f"Generated {HARDCODED_ACTIVATION_HEIGHT - height} blocks to activate Spark.")
+    height = self.nodes[0].getblockcount()
@@
-    if current_height < activation_height:
-        self.nodes[0].generate(activation_height - current_height)
-        self.log.info(f"Generated {activation_height - current_height} blocks to meet dynamic activation height.")
+    if height < activation_height:
+        self.nodes[0].generate(activation_height - height)
+        self.log.info(f"Generated {activation_height - height} blocks to meet dynamic activation height.")
```

🤖 Prompt for AI Agents
In qa/rpc-tests/sendtransparent.py around lines 12 to 28, current_height is read
once and reused after the first generate, which can cause the second generate to
overshoot; refresh current_height by calling self.nodes[0].getblockcount() again
before comparing to activation_height and computing how many blocks to generate
so the second generate uses the up-to-date height; update the log messages
accordingly.
```cpp
CValidationState state;
CReserveKey reservekey(pwallet);
CWalletTx& wtx = wtxAndFee[0].first;

LogPrintf("Committing Spark mint transaction with %zu inputs and %zu outputs\n",
    wtx.tx->vin.size(), wtx.tx->vout.size());

if (!pwallet->CommitTransaction(wtx, reservekey, g_connman.get(), state)) {
    LogPrintf("Failed to commit transaction: %s\n", state.GetRejectReason());
    throw JSONRPCError(RPC_WALLET_ERROR,
        std::string("Error committing transaction: ") + state.GetRejectReason());
}

return wtx.GetHash().GetHex();
} catch (const std::exception &e) {
```
Simple Spark send commits only the first tx; risk of silently dropping additional txs.
If CreateSparkMintTransactions returns multiple txs, commit all; return the first txid to preserve the simple API.
Apply this diff:
- // Commit the transaction
- CValidationState state;
- CReserveKey reservekey(pwallet);
- CWalletTx& wtx = wtxAndFee[0].first;
-
- LogPrintf("Committing Spark mint transaction with %zu inputs and %zu outputs\n",
- wtx.tx->vin.size(), wtx.tx->vout.size());
-
- if (!pwallet->CommitTransaction(wtx, reservekey, g_connman.get(), state)) {
- LogPrintf("Failed to commit transaction: %s\n", state.GetRejectReason());
- throw JSONRPCError(RPC_WALLET_ERROR,
- std::string("Error committing transaction: ") + state.GetRejectReason());
- }
-
- return wtx.GetHash().GetHex();
+ // Commit all transactions, return the first txid
+ std::string firstTxid;
+ for (auto& pair : wtxAndFee) {
+ CWalletTx& tw = pair.first;
+ CValidationState state;
+ CReserveKey reservekey(pwallet);
+ LogPrintf("Committing Spark mint transaction with %zu inputs and %zu outputs\n",
+ tw.tx->vin.size(), tw.tx->vout.size());
+ if (!pwallet->CommitTransaction(tw, reservekey, g_connman.get(), state)) {
+ LogPrintf("Failed to commit transaction: %s\n", state.GetRejectReason());
+ throw JSONRPCError(RPC_WALLET_ERROR,
+ std::string("Error committing transaction: ") + state.GetRejectReason());
+ }
+ if (firstTxid.empty()) firstTxid = tw.GetHash().GetHex();
+ }
+    return firstTxid;

```cpp
// Validate that amount is reasonable when fee is being subtracted
ValidateFeeSubtractionAmount(nAmount, subtractFee, name_);
LogPrintf("rpcWallet.mintSpark() nAmount = %d \n", nAmount);
```
Fix log label typo in spendspark.
Log says mintSpark; should be spendspark.
- LogPrintf("rpcWallet.mintSpark() nAmount = %d \n", nAmount);
+ LogPrintf("rpcWallet.spendspark() nAmount = %d \n", nAmount);🤖 Prompt for AI Agents
In src/wallet/rpcwallet.cpp around lines 4526 to 4529, the LogPrintf call prints
"rpcWallet.mintSpark()" but this is a typo — it should read
"rpcWallet.spendSpark()" to match the spendspark operation; update the log label
accordingly (keep the existing amount variable and formatting as-is) so the log
accurately reflects the spendspark action.
311f8dc to 0c948f1 (Compare)
Actionable comments posted: 4
♻️ Duplicate comments (12)
qa/rpc-tests/sendtransparent.py (1)

12-30: Fix Spark activation overshoot
current_height is captured once and reused after you already generated blocks, so the second branch can request hundreds of extra blocks. Recompute height (or derive a single target = max(hardcoded, dynamic) and generate against the fresh height) before deciding whether further generation is needed.

qa/rpc-tests/sendmany.py (2)

12-30: Fix Spark activation overshoot
Same stale-height problem here: the second generate call uses the pre-generation current_height, causing unnecessary extra blocks. Refresh the height (or compute a single target height) before comparing.

296-315: Normalize txids and sum fees per transaction
sendmany still returns a single txid string for the generated transaction. Iterating that string yields characters rather than txids, and details_*[0]['fee'] will KeyError because the fee sits at tx['fee']. Please normalize to a list if needed and accumulate fees from each tx['fee']. Suggested fix:

```diff
-        txs = []
-        for txid in txids:
-            txs.append(self.nodes[0].gettransaction(txid))
+        if isinstance(txids, str):
+            txids = [txids]
+        txs = [self.nodes[0].gettransaction(txid) for txid in txids]
         details_transparent = []
         details_spark = []
+        total_fee = Decimal('0')
         for tx in txs:
+            total_fee += Decimal(tx['fee'])
             details_transparent.extend([detail for detail in tx['details'] if detail['category'] == 'send'])
             details_spark.extend([detail for detail in tx['details'] if detail['category'] == 'mint'])
-        total_amount_spark = sum([detail['amount'] for detail in details_spark]) + details_spark[0]['fee']
-        total_amount_transparent = sum([detail['amount'] for detail in details_transparent]) + details_transparent[0]['fee']
-        total_amount = total_amount_spark + total_amount_transparent
+        total_amount = sum(detail['amount'] for detail in details_spark + details_transparent) + total_fee
         self.assert_equal_float(-total_amount, sum(expected_amounts))
```

qa/rpc-tests/sendtoaddress.py (2)

12-30: Fix Spark activation overshoot
Same issue here: current_height is reused after generating blocks, so the follow-up generate can overshoot significantly. Refresh the height (or compute a single target) before acting.

458-492: Normalize txids and sum fees per transaction
When sendtoaddress splits across destination types it still returns a string txid per transaction. Iterating the string and trying to read detail['fee'] will break — fees live at tx['fee']. Please normalize and accumulate the fees at the transaction level, similar to:

```diff
-        txs = []
-        for txid in txids:
-            txs.append(self.nodes[0].gettransaction(txid))
+        if isinstance(txids, str):
+            txids = [txids]
+        txs = [self.nodes[0].gettransaction(txid) for txid in txids]
         details_transparent = []
         details_spark = []
+        total_fee = Decimal('0')
         for tx in txs:
+            total_fee += Decimal(tx['fee'])
             details_transparent.extend([detail for detail in tx['details'] if detail['category'] == 'send'])
             details_spark.extend([detail for detail in tx['details'] if detail['category'] == 'mint'])
-        total_amount_spark = sum([detail['amount'] for detail in details_spark]) + details_spark[0]['fee']
-        total_amount_transparent = sum([detail['amount'] for detail in details_transparent]) + details_transparent[0]['fee']
-        total_amount = total_amount_spark + total_amount_transparent
+        total_amount = sum(detail['amount'] for detail in details_spark + details_transparent) + total_fee
         self.assert_equal_float(-total_amount, total_expected_amounts)
```

qa/rpc-tests/sendspark.py (1)

12-30: Fix Spark activation overshoot
The stale current_height reuse persists here as well. After the initial generate you need to re-read the height (or pre-compute a single target). Otherwise the second branch over-generates.

qa/rpc-tests/spendspark.py (1)

12-30: Fix Spark activation overshoot
Please refresh current_height (or compute a single target height) before the second generate — reusing the old value causes unnecessary extra block production.

qa/rpc-tests/sendsparkmany.py (1)

12-30: Fix Spark activation overshoot
Same stale-height issue: reusing current_height after you already generated blocks makes the fallback branch overshoot. Update the height (or compute a single target) before generating again.

src/wallet/rpcwallet.cpp (4)
912-918: Remove Spark-balance check in sendtoaddress JSON path (uses transparent funds).
Spark recipients here mint from transparent balance; checking pwallet->sparkWallet balance wrongly rejects valid sends.

```diff
-    // Spark checking
-    if (!sparkRecipients.empty()) {
-        EnsureSparkWalletIsAvailable();
-        CAmount sparkBalance = pwallet->sparkWallet->getAvailableBalance();
-        if (sparkBalance < totalSparkAmount) {
-            throw JSONRPCError(RPC_WALLET_INSUFFICIENT_FUNDS, "Insufficient Spark funds");
-        }
-    }
+    // Spark recipients mint from transparent funds; no Spark-balance check here.
```

716-731: Commit all Spark mint txs (simple format) instead of only the first.
Multiple wtxs can be produced; commit each and return the first txid.

```diff
-    // Commit the transaction
-    CValidationState state;
-    CReserveKey reservekey(pwallet);
-    CWalletTx& wtx = wtxAndFee[0].first;
-    LogPrintf("Committing Spark mint transaction with %zu inputs and %zu outputs\n",
-        wtx.tx->vin.size(), wtx.tx->vout.size());
-    if (!pwallet->CommitTransaction(wtx, reservekey, g_connman.get(), state)) {
-        LogPrintf("Failed to commit transaction: %s\n", state.GetRejectReason());
-        throw JSONRPCError(RPC_WALLET_ERROR,
-            std::string("Error committing transaction: ") + state.GetRejectReason());
-    }
-    return wtx.GetHash().GetHex();
+    // Commit all transactions; return the first txid for compatibility
+    std::string firstTxid;
+    for (auto& pf : wtxAndFee) {
+        CWalletTx& tw = pf.first;
+        CValidationState state;
+        CReserveKey reservekey(pwallet);
+        LogPrintf("Committing Spark mint transaction with %zu inputs and %zu outputs\n",
+            tw.tx->vin.size(), tw.tx->vout.size());
+        if (!pwallet->CommitTransaction(tw, reservekey, g_connman.get(), state)) {
+            LogPrintf("Failed to commit transaction: %s\n", state.GetRejectReason());
+            throw JSONRPCError(RPC_WALLET_ERROR,
+                std::string("Error committing transaction: ") + state.GetRejectReason());
+        }
+        if (firstTxid.empty()) firstTxid = tw.GetHash().GetHex();
+    }
+    return firstTxid;
```

1007-1011: Fee undercount when multiple dummy mint txs returned. Sum all fees.
Only using dummyWtxAndFee[0].second underestimates total; distribute based on the sum.

```diff
-    CAmount totalEstimatedFee = dummyWtxAndFee[0].second;
+    CAmount totalEstimatedFee = 0;
+    for (const auto& p : dummyWtxAndFee) {
+        totalEstimatedFee += p.second;
+    }
```

4526-4529: Incorrect log label in spendspark.
Log says mintSpark; should be spendspark.

```diff
-    LogPrintf("rpcWallet.mintSpark() nAmount = %d \n", nAmount);
+    LogPrintf("rpcWallet.spendspark() nAmount = %d \n", nAmount);
```
🧹 Nitpick comments (3)
src/wallet/rpcwallet.cpp (3)
1199-1201: Return a plain string for single txid.
Constructing UniValue from string directly is inconsistent; return the string (RPC layer wraps appropriately).

```diff
-    return UniValue(txids[0]);
+    return txids[0];
```

4523-4527: Minor: duplicate amount validation.
nAmount <= 0 is checked twice (Lines 4510 and 4523). Remove one.

```diff
-    if (nAmount <= 0)
-        throw JSONRPCError(RPC_TYPE_ERROR, "Invalid amount for send");
```

4743-4753: Detect Spark addresses by decoding, not by "st" prefix.
Use spark::Address.decode (and Spark name '@') to decide when to attach memo.

```diff
-    // Check if this is a Spark address (starts with 'st' or '@')
-    bool isSparkAddress = false;
-    if (name_.length() >= 2 && name_.substr(0, 2) == "st") {
-        isSparkAddress = true;
-    } else if (!name_.empty() && name_[0] == '@') {
-        isSparkAddress = true; // Spark name
-    }
+    bool isSparkAddress = false;
+    if (!name_.empty() && name_[0] == '@') {
+        isSparkAddress = true; // Spark name
+    } else {
+        try {
+            spark::Address tmp(spark::Params::get_default());
+            (void)tmp.decode(name_);
+            isSparkAddress = true;
+        } catch (const std::exception&) {}
+    }
```
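In Python terms, the decode-based classification suggested above amounts to the following (sketch; `decode` stands in for spark::Address::decode and is assumed to raise on invalid input):

```python
def is_spark_destination(addr, decode):
    """Classify a destination by decoding it, not by sniffing a prefix."""
    if addr.startswith('@'):
        return True  # Spark name
    try:
        decode(addr)  # raises if not a valid Spark address
        return True
    except Exception:
        return False
```

The design point: decoding validates the checksum and network byte, so a transparent address that happens to start with "st" can no longer be misclassified.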
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
.github/workflows/ci-master.yml is excluded by !**/*.yml
📒 Files selected for processing (9)
- README.md (1 hunks)
- qa/pull-tester/rpc-tests.py (1 hunks)
- qa/rpc-tests/sendmany.py (1 hunks)
- qa/rpc-tests/sendspark.py (1 hunks)
- qa/rpc-tests/sendsparkmany.py (1 hunks)
- qa/rpc-tests/sendtoaddress.py (1 hunks)
- qa/rpc-tests/sendtransparent.py (1 hunks)
- qa/rpc-tests/spendspark.py (1 hunks)
- src/wallet/rpcwallet.cpp (13 hunks)
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-03-31T04:27:35.223Z
Learnt from: aleflm
Repo: firoorg/firo PR: 1504
File: CMakeLists.txt:457-484
Timestamp: 2025-03-31T04:27:35.223Z
Learning: Pkg-config should not be used in the Firo CMake build system as it's not available by default on Windows, and the project aims to support Windows natively in the future.
Applied to files:
README.md
🧬 Code graph analysis (7)
qa/rpc-tests/sendtoaddress.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/spendspark.py (2)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)
- qa/rpc-tests/test_framework/util.py (1): enable_mocktime (56-61)

src/wallet/rpcwallet.cpp (7)
- src/wallet/wallet.cpp (4): GetMinimumFee (5185-5189), GetMinimumFee (5185-5185), GetMinimumFee (5191-5209), GetMinimumFee (5191-5191)
- src/validation.cpp (1): mempool (123-123)
- src/wallet/rpcdump.cpp (1): EnsureWalletIsAvailable (35-35)
- src/rpc/server.cpp (4): AmountFromValue (123-133), AmountFromValue (123-123), ValueFromAmount (135-143), ValueFromAmount (135-135)
- src/sparkname.h (1): GetInstance (185-185)
- src/spark/state.cpp (6): GetNetworkType (93-102), GetNetworkType (93-93), IsSparkAllowed (82-86), IsSparkAllowed (82-82), IsSparkAllowed (88-91), IsSparkAllowed (88-88)
- src/script/standard.cpp (2): GetScriptForDestination (369-375), GetScriptForDestination (369-369)

qa/rpc-tests/sendtransparent.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/sendsparkmany.py (1)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)

qa/rpc-tests/sendspark.py (1)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)

qa/rpc-tests/sendmany.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)
🪛 LanguageTool
README.md
[duplication] ~255-~255: Possible typo: you repeated a word.
Context: ...can generate installation packages with cmake --build build --target package. Once you run this co...
(ENGLISH_WORD_REPEAT_RULE)
🪛 Ruff (0.14.4)
qa/rpc-tests/sendtoaddress.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/spendspark.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendtransparent.py
34-34: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
34-34: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendsparkmany.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendspark.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendmany.py
34-34: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
34-34: Avoid specifying long messages outside the exception class
(TRY003)
🔇 Additional comments (3)
README.md (1)
255-260: Updated packaging blurb looks good
Thanks for clarifying the CMake packaging flow; the instructions read cleanly.

qa/pull-tester/rpc-tests.py (1)
195-200: New test entries acknowledged
Happy to see the new Spark/transparent coverage wired into the harness.

src/wallet/rpcwallet.cpp (1)
684-686: Rewritten Review Comment: No action required — CoinType::ALL_COINS is the correct choice.
The review comment assumes a transparent-only CoinType exists. It does not. The CoinType enum contains: ALL_COINS, ONLY_DENOMINATED, ONLY_NOT1000IFMN, ONLY_NONDENOMINATED_NOT1000IFMN, ONLY_1000, ONLY_PRIVATESEND_COLLATERAL, ONLY_MINTS, and WITH_MINTS.
ALL_COINS already filters to transparent-only coins by design—it explicitly excludes all mint types (Zerocoin, Sigma, Lelantus, LelantusJ, Spark, SparkS). Additionally, line 685 is part of Spark mint transaction creation (transparent → Spark), not a regular sendtoaddress call, making ALL_COINS the semantically correct choice.
Likely an incorrect or invalid review comment.
```cpp
static void ValidateFeeSubtractionAmount(CAmount nAmount, bool fSubtractFeeFromAmount, const std::string& strAddress)
{
    if (fSubtractFeeFromAmount) {
        // Estimate typical transaction size for a simple send (1 input, 2 outputs)
        // Typical sizes: 1 input ~150 bytes, 2 outputs ~68 bytes, overhead ~10 bytes
        unsigned int estimatedTxSize = 230; // Conservative estimate for simple transaction

        // Get estimated fee using wallet's fee calculation with default confirmation target
        CAmount estimatedFee = CWallet::GetMinimumFee(estimatedTxSize, 6, mempool); // 6 blocks = ~1 hour

        if (nAmount <= estimatedFee) {
            throw JSONRPCError(RPC_TYPE_ERROR, strprintf("Amount %s is too small when subtracting fee from address %s. Estimated fee is %s, so amount must be greater than the fee.", FormatMoney(nAmount), strAddress, FormatMoney(estimatedFee)));
        }
    }
}
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Fix: non-static call to CWallet::GetMinimumFee (compile error) and make precheck wallet-aware.
ValidateFeeSubtractionAmount calls a non-static member as if static. Pass pwallet and use pwallet->GetMinimumFee. Apply:
```diff
-static void ValidateFeeSubtractionAmount(CAmount nAmount, bool fSubtractFeeFromAmount, const std::string& strAddress)
+static void ValidateFeeSubtractionAmount(CWallet* pwallet, CAmount nAmount, bool fSubtractFeeFromAmount, const std::string& strAddress)
 {
     if (fSubtractFeeFromAmount) {
         // Estimate typical transaction size for a simple send (1 input, 2 outputs)
         unsigned int estimatedTxSize = 230; // Conservative estimate for simple transaction
-        // Get estimated fee using wallet's fee calculation with default confirmation target
-        CAmount estimatedFee = CWallet::GetMinimumFee(estimatedTxSize, 6, mempool); // 6 blocks = ~1 hour
+        if (!pwallet) return;
+        // Use wallet's fee calculation with default confirmation target
+        CAmount estimatedFee = pwallet->GetMinimumFee(estimatedTxSize, 6, mempool); // 6 blocks ≈ 1 hour
         if (nAmount <= estimatedFee) {
             throw JSONRPCError(RPC_TYPE_ERROR, strprintf("Amount %s is too small when subtracting fee from address %s. Estimated fee is %s, so amount must be greater than the fee.", FormatMoney(nAmount), strAddress, FormatMoney(estimatedFee)));
         }
     }
 }
```

And update call sites:

```diff
-ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, strAddress);
+ValidateFeeSubtractionAmount(pwallet, nAmount, fSubtractFeeFromAmount, strAddress);
-ValidateFeeSubtractionAmount(nAmount, subtractFee, strAddress);
+ValidateFeeSubtractionAmount(pwallet, nAmount, subtractFee, strAddress);
-ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_);
+ValidateFeeSubtractionAmount(pwallet, nAmount, fSubtractFeeFromAmount, name_);
-ValidateFeeSubtractionAmount(nAmount, subtractFee, name_);
+ValidateFeeSubtractionAmount(pwallet, nAmount, subtractFee, name_);
```

Also applies to: 623-625, 813-815, 1933-1935, 4526-4528, 4562-4564, 4732-4734
🤖 Prompt for AI Agents
In src/wallet/rpcwallet.cpp around lines 518 to 533,
ValidateFeeSubtractionAmount currently calls CWallet::GetMinimumFee as if it
were static and is not wallet-aware; change the function to accept a CWallet*
pwallet parameter and replace the call with pwallet->GetMinimumFee(...),
ensuring you pass the wallet instance and any required mempool or flags; update
all call sites listed (around lines 623-625, 813-815, 1933-1935, 4526-4528,
4562-4564, 4732-4734) to pass the appropriate pwallet pointer from their
context; ensure compilation by including necessary headers/forward declarations
and keep behavior identical otherwise.
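The precheck's intent is easy to state outside C++. A minimal Python sketch follows (hypothetical names; the 230-byte size and satoshi arithmetic mirror the helper's logic, not the wallet API):

```python
# Hypothetical sketch of the fee-subtraction precheck; not the C++ wallet API.
ESTIMATED_TX_SIZE = 230  # bytes: ~1 input (150) + 2 outputs (68) + overhead (~10)

def validate_fee_subtraction(amount_sat, subtract_fee, address, feerate_sat_per_byte):
    """Raise if an output is too small to have the fee subtracted from it."""
    if not subtract_fee:
        return  # nothing to validate when the flag is off
    estimated_fee = ESTIMATED_TX_SIZE * feerate_sat_per_byte
    if amount_sat <= estimated_fee:
        raise ValueError(
            f"Amount {amount_sat} is too small when subtracting fee from {address}; "
            f"estimated fee is {estimated_fee}"
        )

# 100_000 sat comfortably exceeds the 23_000 sat estimate at 100 sat/byte
validate_fee_subtraction(100_000, True, "TY2V...", feerate_sat_per_byte=100)
```

The same shape applies whether the fee estimate comes from a static constant or a wallet instance; only the source of the fee rate changes.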
```cpp
// Validate that amount is reasonable when fee is being subtracted
ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_);

// Create JSON object for this address in sendtoaddress format
UniValue addressParams(UniValue::VOBJ);
addressParams.push_back(Pair("amount", ValueFromAmount(nAmount)));
addressParams.push_back(Pair("subtractFee", fSubtractFeeFromAmount));

// Add comment only to the first address to avoid duplication
if (isFirstAddress && !wtx.mapValue["comment"].empty()) {
    addressParams.push_back(Pair("comment", wtx.mapValue["comment"]));
    isFirstAddress = false;
}
```
🛠️ Refactor suggestion | 🟠 Major
sendmany → sendtoaddress conversion: update helper call after signature change.
After making ValidateFeeSubtractionAmount wallet-aware, pass pwallet.
```diff
-    ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_);
+    ValidateFeeSubtractionAmount(pwallet, nAmount, fSubtractFeeFromAmount, name_);
```
🤖 Prompt for AI Agents
In src/wallet/rpcwallet.cpp around lines 1933 to 1945, the call to
ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_) needs to be
updated to the new signature that requires the wallet pointer; change the
invocation to pass pwallet as the first argument (e.g.
ValidateFeeSubtractionAmount(pwallet, nAmount, fSubtractFeeFromAmount, name_))
so it matches the wallet-aware function definition.
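For intuition, the sendmany-to-sendtoaddress conversion this hunk implements can be modeled in plain Python (illustrative only; the real code builds UniValue objects):

```python
# Illustrative model of the conversion: each {address: amount} entry becomes a
# parameter object, subtractFee comes from a separate address list, and the
# transaction-level comment attaches only to the first destination.
def convert_sendmany(amounts, subtract_fee_addrs=(), comment=""):
    converted = {}
    first = True
    for addr, amount in amounts.items():
        if amount <= 0:
            raise ValueError("Invalid amount for send")
        params = {"amount": amount, "subtractFee": addr in subtract_fee_addrs}
        if first and comment:
            params["comment"] = comment
            first = False
        converted[addr] = params
    return converted

out = convert_sendmany({"A": 0.05, "B": 0.06}, ["B"], comment="rent")
```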
```cpp
// Validate that amount is reasonable when fee is being subtracted
ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_);
```
🛠️ Refactor suggestion | 🟠 Major
spendspark: update helper call after signature change.
```diff
-    ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_);
+    ValidateFeeSubtractionAmount(pwallet, nAmount, fSubtractFeeFromAmount, name_);
```
🤖 Prompt for AI Agents
In src/wallet/rpcwallet.cpp around lines 4562-4564, the call to
ValidateFeeSubtractionAmount must be updated to match the helper's new signature
— replace the existing call ValidateFeeSubtractionAmount(nAmount,
fSubtractFeeFromAmount, name_) with the exact parameters and order declared in
the updated helper (and add any required context like wallet pointer, fee rate,
or updated types), and import or adjust any headers/namespaces if needed;
rebuild and fix any resulting compile errors by matching types and argument
order to the new function declaration.
- Add 3 new functions (sendsparkmany, sendtransparent and sendspark).
- Update the function to use just transparent balance or private balance.
Force-pushed 0c948f1 to b49a450
Actionable comments posted: 0
♻️ Duplicate comments (12)
qa/rpc-tests/sendtransparent.py (1)
12-28: Fix activation height overshoot (use fresh height each time). The stale `current_height` causes potential overshooting when generating blocks for the dynamic activation height.

qa/rpc-tests/sendspark.py (1)

12-30: Fix activation height logic to avoid double-generating blocks. The stale `current_height` is reused after the first `generate()`, causing potential overshooting.

qa/rpc-tests/sendsparkmany.py (1)

12-30: Fix activation height logic (duplicate generate due to stale `current_height`). The stale `current_height` causes redundant block generation.

qa/rpc-tests/spendspark.py (1)

12-30: Fix activation height logic (stale `current_height` doubles block generation). The stale `current_height` issue causes potential overshooting.

qa/rpc-tests/sendtoaddress.py (2)

12-41: Fix activation height logic and add sync. The stale `current_height` causes overshooting, and `sync_all()` is missing after activation.

482-493: Bug: fee taken from details[] instead of tx. The code incorrectly accesses `details_spark[0]['fee']` and `details_transparent[0]['fee']`. The fee is at the transaction root level (`tx['fee']`), not in the details array; this will cause a runtime KeyError.

qa/rpc-tests/sendmany.py (2)

12-28: Fix activation height overshoot (use fresh height each time). The stale `current_height` causes incorrect block generation for the dynamic activation height.

304-315: Fix mixed transparent+spark case: handle return type and sum fees at tx level. Two critical issues:
- Line 305 iterates over `txids` assuming it's a list, but `sendmany` may return a single txid string in some cases
- Lines 312-313 incorrectly access the fee from the details array instead of the transaction root level
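The recurring activation-height fix in the test comments above boils down to re-reading the node height on every pass instead of caching it. A Python sketch with a stand-in node object (generate/getblockcount mimic the test framework's RPC proxies):

```python
# Sketch of the "fresh height" fix: recompute the node height after every
# generate() call instead of reusing a stale current_height value.
def advance_to_activation(node, activation_height):
    while node.getblockcount() < activation_height:
        missing = activation_height - node.getblockcount()  # fresh each pass
        node.generate(missing)

class FakeNode:  # stand-in for the test framework's node proxy
    def __init__(self):
        self.height = 100

    def getblockcount(self):
        return self.height

    def generate(self, n):
        self.height += n

node = FakeNode()
advance_to_activation(node, 550)
```

Because the remaining distance is recomputed inside the loop, the node lands exactly on the activation height with no overshoot.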
src/wallet/rpcwallet.cpp (4)
517-531: Fix compile error in ValidateFeeSubtractionAmount.

`CWallet::GetMinimumFee` is an instance method, so this helper still fails to compile (or links against the wrong symbol) because it calls it as if it were static. Please pass the wallet pointer into `ValidateFeeSubtractionAmount`, guard for null, and use `pwallet->GetMinimumFee(...)`, updating every call site (`sendtoaddress`, `sendtransparent`, `spendspark`, `sendsparkmany`, etc.). Example fix:

```diff
-static void ValidateFeeSubtractionAmount(CAmount nAmount, bool fSubtractFeeFromAmount, const std::string& strAddress)
+static void ValidateFeeSubtractionAmount(CWallet* pwallet, CAmount nAmount, bool fSubtractFeeFromAmount, const std::string& strAddress)
 {
     if (fSubtractFeeFromAmount) {
         unsigned int estimatedTxSize = 230;
-        CAmount estimatedFee = CWallet::GetMinimumFee(estimatedTxSize, 6, mempool);
+        if (!pwallet) return;
+        CAmount estimatedFee = pwallet->GetMinimumFee(estimatedTxSize, 6, mempool);
         ...
```

Then adjust every caller to pass the local `pwallet`. Until this is done the build is broken.
717-728: Commit every Spark mint transaction before returning.

`CreateSparkMintTransactions` may split the mint across several transactions, but this branch only commits `wtxAndFee[0]` and silently drops the rest. The additional txs never reach the chain and their reserved keys stay locked. Iterate over `wtxAndFee`, commit each entry, and return the first txid.
912-917: Don't require Spark balance when minting via transparent funds.

In the JSON path Spark recipients are satisfied by minting from transparent coins, yet this check still insists on having matching Spark wallet balance. That wrongly throws "Insufficient Spark funds" whenever the Spark wallet is empty, blocking perfectly valid sends. Please drop this balance check (and leave the later transparent balance validation in place).
1008-1026: Sum fees across all dummy Spark mint transactions.

When fee subtraction is enabled you still use only `dummyWtxAndFee[0].second`. If `CreateSparkMintTransactions` returns multiple txs, the total fee is the sum of every `.second` entry. Under-counting here means fees aren't fully subtracted from the requested outputs. Accumulate the fees from the entire vector before computing `feePerOutput`/`remainingFee`.
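The last two points share one shape: when the mint builder returns a list of (transaction, fee) pairs, both the commit loop and the fee total must cover every pair. A hedged Python sketch (names are illustrative, not the wallet's API):

```python
# Illustrative: commit every (tx, fee) pair the mint builder returns, and
# total the fee across all of them, not just the first entry.
def commit_all(wtx_and_fee, commit):
    total_fee = 0
    txids = []
    for tx, fee in wtx_and_fee:  # every entry, not just wtx_and_fee[0]
        commit(tx)
        txids.append(tx["txid"])
        total_fee += fee
    return txids, total_fee

committed = []
txids, total_fee = commit_all(
    [({"txid": "aa"}, 1000), ({"txid": "bb"}, 1200)],
    committed.append,
)
```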
🧹 Nitpick comments (2)
qa/rpc-tests/sendsparkmany.py (1)
70-71: Consider clarifying test method naming.

The method name includes "transparent" but tests Spark addresses. While this might be intentional organizational naming, it's potentially confusing. Consider either renaming methods to remove "transparent" or adding documentation explaining the naming convention.
qa/rpc-tests/spendspark.py (1)
3-3: Remove unused import.
`enable_mocktime` is imported but never used in this test file. Apply this diff:

```diff
-from test_framework.util import JSONRPCException, enable_mocktime
+from test_framework.util import JSONRPCException
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
- qa/pull-tester/rpc-tests.py (1 hunks)
- qa/rpc-tests/sendmany.py (1 hunks)
- qa/rpc-tests/sendspark.py (1 hunks)
- qa/rpc-tests/sendsparkmany.py (1 hunks)
- qa/rpc-tests/sendtoaddress.py (1 hunks)
- qa/rpc-tests/sendtransparent.py (1 hunks)
- qa/rpc-tests/spendspark.py (1 hunks)
- src/wallet/rpcwallet.cpp (13 hunks)
🧰 Additional context used
🧬 Code graph analysis (7)
qa/rpc-tests/sendspark.py (1)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)

qa/rpc-tests/spendspark.py (2)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)
- qa/rpc-tests/test_framework/util.py (1): enable_mocktime (56-61)

qa/rpc-tests/sendtransparent.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/sendsparkmany.py (1)
- qa/rpc-tests/test_framework/test_framework.py (2): BitcoinTestFramework (50-259), sync_all (111-119)

src/wallet/rpcwallet.cpp (6)
- src/wallet/wallet.cpp (4): GetMinimumFee (5185-5189, 5185, 5191-5209, 5191)
- src/validation.cpp (1): mempool (123)
- src/rpc/server.cpp (4): AmountFromValue (123-133, 123), ValueFromAmount (135-143, 135)
- src/sparkname.h (1): GetInstance (185)
- src/spark/state.cpp (6): GetNetworkType (93-102, 93), IsSparkAllowed (82-86, 82, 88-91, 88)
- src/script/standard.cpp (2): GetScriptForDestination (369-375, 369)

qa/rpc-tests/sendtoaddress.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)

qa/rpc-tests/sendmany.py (1)
- qa/rpc-tests/test_framework/test_framework.py (1): BitcoinTestFramework (50-259)
🪛 Ruff (0.14.4)
qa/rpc-tests/sendspark.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/spendspark.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendtransparent.py
34-34: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
34-34: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendsparkmany.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendtoaddress.py
37-37: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
37-37: Avoid specifying long messages outside the exception class
(TRY003)
qa/rpc-tests/sendmany.py
34-34: Within an except clause, raise exceptions with raise ... from err or raise ... from None to distinguish them from errors in exception handling
(B904)
34-34: Avoid specifying long messages outside the exception class
(TRY003)
🔇 Additional comments (1)
qa/pull-tester/rpc-tests.py (1)
193-200: LGTM! The test script additions are correctly formatted and follow the established naming conventions. The organization with the blank line separator helps readability.
src/wallet/rpcwallet.cpp
Outdated
```cpp
// Validate that amount is reasonable when fee is being subtracted
ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, strAddress);

// 1. Handle BIP47/RAP Address, check if BIP47 payment code (starts with PM8T)
```
bip47 is deprecated, and is disabled by spark HF, please clean your code from it,
Thanks. Fixed.
src/wallet/rpcwallet.cpp
Outdated
```cpp
LogPrintf("Attempting to create Spark mint transaction to %s for amount %s\n",
    sparkAddress.encode(spark::GetNetworkType()), FormatMoney(nAmount));

bool success = pwallet->CreateSparkMintTransactions(
```
please use MintAndStoreSpark() function, which also commits txs into mempool
Thanks. All the occurrences fixed, except one. In that specific case, I don't want to commit to the mempool; I just want to see whether a transaction can be created or not. Other than that, all of them were changed to MintAndStoreSpark.
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/wallet/rpcwallet.cpp (1)
1803-1837: sendmany wrapper drops `fromaccount`/`minconf` behavior and partially contradicts help text.

The new implementation converts `sendmany`'s `amounts` map into the JSON shape expected by `sendtoaddress` and forwards the call, which is a nice unification. However:
- `fromaccount` (param 1) is only written to `wtx.strFromAccount` in a local `CWalletTx` that is never passed to `sendtoaddress`, so account-scoped balance semantics and history labelling are effectively ignored.
- `minconf` (param 3) is no longer used at all; the forwarded `sendtoaddress` call always uses the wallet's default confirmation target and balance view, while the help text still documents `minconf`.
- `ValidateFeeSubtractionAmount` now runs here and again inside `sendtoaddress`'s JSON path, which is harmless but redundant.

If the intention is to fully deprecate `fromaccount`/`minconf` for `sendmany`, I'd recommend updating the RPC help to state that these parameters are ignored. Otherwise, consider threading `minconf`/account constraints into the forwarded path (e.g., via a dedicated wrapper around coin selection) instead of silently dropping them.

Also applies to: 1860-1903
♻️ Duplicate comments (3)
src/wallet/rpcwallet.cpp (3)
681-714: Avoid double-committing MintAndStoreSpark outputs and dropping extra txs.

Here you call `MintAndStoreSpark(outputs, wtxAndFee, ...)` and then manually `CommitTransaction` only for `wtxAndFee[0]`. Other call sites (e.g. `mintspark`, `automintspark`, the JSON branch below) treat `MintAndStoreSpark` as the factory for all Spark mint transactions and never re-commit them individually. This pattern risks:
- double-committing the first transaction if `MintAndStoreSpark` already commits, and/or
- silently ignoring additional entries in `wtxAndFee` when Spark minting is split across multiple transactions.

If the intent is to rely on `MintAndStoreSpark` for committing, you can just return the first txid:

```diff
-    // Commit the transaction
-    CValidationState state;
-    CReserveKey reservekey(pwallet);
-    CWalletTx& wtx = wtxAndFee[0].first;
-    ...
-    return wtx.GetHash().GetHex();
+    // MintAndStoreSpark already creates and commits the transactions; just return the first txid
+    return wtxAndFee[0].first.GetHash().GetHex();
```

If instead `MintAndStoreSpark` only constructs transactions, switch back to committing all entries (like the JSON path) rather than just index 0.
sparkRecipientsare handled by minting Spark outputs from transparent UTXOs viaMintAndStoreSpark. The precheck:CAmount sparkBalance = pwallet->sparkWallet->getAvailableBalance(); if (sparkBalance < totalSparkAmount) { throw JSONRPCError(RPC_WALLET_INSUFFICIENT_FUNDS, "Insufficient Spark funds"); }will wrongly fail when the wallet has sufficient transparent balance but little/no Spark balance, even though no Spark coins are spent. This contradicts the intended behavior (sendtoaddress spends transparent funds only, minting Spark as needed).
At the same time, transparent balance is checked separately for transparent and Spark destinations, so it’s possible for both checks to pass individually while the combined
totalTransparentAmount + totalSparkAmountexceedsGetBalance(), leading to Spark mints being committed and the transparent tx subsequently failing.Suggested fix:
- Drop the Spark-balance check here; rely on
EnsureSparkWalletIsAvailable()/spark::IsSparkAllowed()plus the transparent-balance check in the Spark transaction block.- Make the transparent precheck cover both transparent and Spark recipients:
- // Transparent checking - if (!transparentRecipients.empty()) { - CAmount transparentBalance = pwallet->GetBalance(); - CAmount totalTransparentNeeded = totalTransparentAmount; - if (transparentBalance < totalTransparentNeeded) { - throw JSONRPCError(RPC_WALLET_INSUFFICIENT_FUNDS, "Insufficient transparent funds"); - } - } - // Spark checking - if (!sparkRecipients.empty()) { - EnsureSparkWalletIsAvailable(); - CAmount sparkBalance = pwallet->sparkWallet->getAvailableBalance(); - if (sparkBalance < totalSparkAmount) { - throw JSONRPCError(RPC_WALLET_INSUFFICIENT_FUNDS, "Insufficient Spark funds"); - } - } + // Transparent checking (Spark mints also consume transparent UTXOs) + if (!transparentRecipients.empty() || !sparkRecipients.empty()) { + CAmount transparentBalance = pwallet->GetBalance(); + CAmount totalNeeded = totalTransparentAmount + totalSparkAmount; + if (transparentBalance < totalNeeded) { + throw JSONRPCError(RPC_WALLET_INSUFFICIENT_FUNDS, "Insufficient transparent funds"); + } + }This restores the “transparent-funds-only” contract for sendtoaddress while preventing partial execution when mixing transparent and Spark destinations.
Also applies to: 913-922
949-1003: Spark fee subtraction only uses first dummy tx's fee; sum all entries or use estimatedTotalFee.

When estimating fees for per-output fee subtraction, you call `CreateSparkMintTransactions` with `dummyWtxAndFee` and `estimatedTotalFee`, but then derive `totalEstimatedFee` from only the first element:

```cpp
CAmount totalEstimatedFee = dummyWtxAndFee[0].second;
```

If Spark minting is split across multiple transactions, this undercounts the total fee, so the amounts you subtract from outputs won't match the actual combined fee paid by the final mint transactions.

Use the full estimated fee instead:

```diff
 if (!subtractFeeIndices.empty()) {
-    CAmount totalEstimatedFee = dummyWtxAndFee[0].second;
+    // Use the total estimated fee across all dummy transactions
+    CAmount totalEstimatedFee = 0;
+    for (const auto& p : dummyWtxAndFee) {
+        totalEstimatedFee += p.second;
+    }
     CAmount feePerOutput = totalEstimatedFee / static_cast<CAmount>(subtractFeeIndices.size());
     CAmount remainingFee = totalEstimatedFee % static_cast<CAmount>(subtractFeeIndices.size());
```

(Alternatively, if `estimatedTotalFee` is guaranteed to be the combined fee, you can just use that variable instead of recomputing.)
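The feePerOutput/remainingFee arithmetic generalizes as follows: integer division spreads the total fee evenly over the opted-in outputs, and the remainder is taken from the first one, so the subtracted amounts sum exactly to the fee. A Python sketch:

```python
# Evenly subtract a total fee from the outputs that opted in; the integer
# remainder is taken from the first opted-in output so nothing is lost.
def subtract_fee_evenly(amounts_sat, subtract_idx, total_fee):
    fee_per_output = total_fee // len(subtract_idx)
    remaining = total_fee % len(subtract_idx)
    out = list(amounts_sat)
    for i, idx in enumerate(subtract_idx):
        out[idx] -= fee_per_output + (remaining if i == 0 else 0)
    return out

result = subtract_fee_evenly([50_000, 30_000, 20_000], [0, 2], 1001)
```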
🧹 Nitpick comments (2)
src/wallet/rpcwallet.cpp (2)
4368-4473: spendspark now requires `subtractFee` per recipient; consider defaulting to false instead of erroring.

In the updated `spendspark` parsing you treat `subtractFee` as mandatory for both Spark and transparent recipients:

```cpp
bool subtractFee = false;
if (amountAndMemo.exists("subtractFee"))
    subtractFee = amountAndMemo["subtractFee"].get_bool();
else
    throw JSONRPCError(RPC_INVALID_PARAMETER, "Invalid parameters, no subtractFee: " + name_);
```

and similarly for `fSubtractFeeFromAmount` on transparent outputs.

Other RPCs (e.g. `sendtoaddress` JSON mode) treat `subtractFee` as optional with a default of `false`, which is friendlier and matches the idea of a "special" opt-in flag. Making it mandatory here may break existing callers that previously omitted it, or new callers that reasonably expect a default.

Unless there is a strong reason to force the flag, I'd suggest:
- keeping the local variable defaulted to `false`,
- not throwing when the field is missing, and
- letting `ValidateFeeSubtractionAmount` enforce safety only when `subtractFee == true`.

That keeps `spendspark` aligned with the other send RPCs while still validating fee-subtraction amounts.

Also applies to: 4501-4509
additionalDataparameter with the two documented call forms (plain register vs transfer).While you’re in this function, consider double-checking the line just above:
sparkNameData.sparkNameValidityBlocks = numberOfYears * 365*24*24;If this is meant to be
years * 365 days * 24 hours/day * blocks/hour, verify that24blocks/hour matches your actual target (Firo mainnet is typically 5‑minute blocks ⇒ 12 blocks/hour). If not, this might be overestimating validity by a factor of two and could warrant adjusting the multiplier or moving the constant to consensus parameters for clarity.
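The arithmetic is quick to verify, assuming 5-minute blocks (12 per hour):

```python
# Sanity-check the validity-block math flagged above (assuming 5-minute blocks).
blocks_per_year_5min = 365 * 24 * 12  # expected blocks per year
blocks_in_code = 365 * 24 * 24        # what numberOfYears * 365*24*24 implies per year
ratio = blocks_in_code / blocks_per_year_5min
print(blocks_per_year_5min, blocks_in_code, ratio)
```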
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/wallet/rpcwallet.cpp (13 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/wallet/rpcwallet.cpp (5)
- src/wallet/wallet.cpp (4): GetMinimumFee (5185-5189, 5185, 5191-5209, 5191)
- src/sparkname.h (1): GetInstance (185)
- src/sparkname.cpp (2): sparkAddress (248, 398)
- src/spark/state.cpp (6): GetNetworkType (93-102, 93), IsSparkAllowed (82-86, 82, 88-91, 88)
- src/script/standard.cpp (2): GetScriptForDestination (369-375, 369)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-guix-x86_64-apple-darwin
- GitHub Check: build-guix-x86_64-w64-mingw32
- GitHub Check: build-guix-arm64-apple-darwin
- GitHub Check: build-guix-aarch64-linux-gnu
- GitHub Check: build-guix-x86_64-linux-gnu
- GitHub Check: build-mac-cmake
- GitHub Check: build-windows-cmake
- GitHub Check: build-linux-cmake
🔇 Additional comments (1)
src/wallet/rpcwallet.cpp (1)
1076-1120: Transparent comment aggregation for multi-recipient sends looks goodThe new logic that collects all per-recipient transparent
comment/comment_tostrings, deduplicates them viastd::set, and joins them with"; "intowtx.mapValue["comment"]and["to"]addresses the previous “only first comment applied” issue without introducing duplication or ordering problems. This is a sensible, backwards-compatible way to surface multiple recipient notes in a single wallet tx entry.
```cpp
// Convert sendsparkmany format to spendspark format
UniValue convertedSendTo(UniValue::VOBJ);
std::vector<std::string> keys = sendTo.getKeys();
BOOST_FOREACH(const std::string& name_, keys)
{
    CAmount nAmount = AmountFromValue(sendTo[name_]);
    if (nAmount <= 0)
        throw JSONRPCError(RPC_TYPE_ERROR, "Invalid amount for send");

    // Check if this address should have fee subtracted
    bool fSubtractFeeFromAmount = false;
    for (unsigned int idx = 0; idx < subtractFeeFromAmount.size(); idx++) {
        const UniValue& addr = subtractFeeFromAmount[idx];
        if (addr.get_str() == name_)
            fSubtractFeeFromAmount = true;
    }

    // Validate that amount is reasonable when fee is being subtracted
    ValidateFeeSubtractionAmount(nAmount, fSubtractFeeFromAmount, name_);

    // Create JSON object for this address in spendspark format
    UniValue addressParams(UniValue::VOBJ);
    addressParams.push_back(Pair("amount", ValueFromAmount(nAmount)));
    addressParams.push_back(Pair("subtractFee", fSubtractFeeFromAmount));

    // Add global comment as memo for Spark addresses (similar to sendtoaddress behavior)
    if (!comment.empty()) {
        // Check if this is a Spark address (starts with 'st' or '@')
        bool isSparkAddress = false;
        if (name_.length() >= 2 && name_.substr(0, 2) == "st") {
            isSparkAddress = true;
        } else if (!name_.empty() && name_[0] == '@') {
            isSparkAddress = true; // Spark name
        }

        if (isSparkAddress) {
            addressParams.push_back(Pair("memo", comment));
            LogPrintf("sendsparkmany: Added comment as memo for Spark address %s: %s\n", name_, comment);
        }
    }

    convertedSendTo.push_back(Pair(name_, addressParams));
}
```
Spark-address detection in sendsparkmany is too narrow (st only), so some Spark addresses miss the memo mapping
In sendsparkmany, the decision to treat an address as Spark for the “global comment → memo” mapping is:
```cpp
bool isSparkAddress = false;
if (name_.length() >= 2 && name_.substr(0, 2) == "st") {
    isSparkAddress = true;
} else if (!name_.empty() && name_[0] == '@') {
    isSparkAddress = true; // Spark name
}
...
if (isSparkAddress) {
    addressParams.push_back(Pair("memo", comment));
}
```

However, Spark addresses in your own examples (e.g. sr1hk87...) don't start with "st", so they won't be recognized here and will not receive the comment-as-memo behavior, even though spendspark will later decode them correctly as Spark addresses.
To keep behavior consistent with sendtoaddress/spendspark, I’d recommend detecting Spark addresses here the same way:
- either try `spark::Address::decode` on `name_` (or the resolved Spark name) and treat success as Spark, or
- at least accept both `"sr"` and `"st"` prefixes if that matches your address formats.
@coderabbitai Please do not ask me for any more reviews in this PR.
@justanwar, understood! I won't ask you for any more reviews in this PR. Thank you for letting me know.
levonpetrosyan93
left a comment
Code looks okay; please test all RPC calls against their help strings before merging.
Depends on #1676
PR intention
Address Types (Destinations - WHERE you send TO):
- Transparent addresses (e.g. TH8Uv..., a4B2c...)
- Spark addresses (sr1x...)
- BIP47/RAP payment codes (PM8T...)
- Spark names (@name → resolves to Spark address)

Enhanced Functionality:
- sendtoaddress / sendmany / sendtransparent use the transparent balance; spendspark / sendsparkmany use the Spark balance
- RPCs covered by the unification: sendtoaddress, sendtransparent, sendmany, spendspark, sendspark, sendsparkmany, sendfrom, spendmany, sendtorapaddress, mint, mintlelantus, autoMintlelantus, mintspark, automintspark, lelantustospark
Example:
firo-cli -regtest sendtoaddress '{"TY2VzjuPgnhjx3G8NmSmUGr9orShiBqrDc":{"amount":0.05, "subtractFee": false, "comment":"addr1"}, "TDkSCRA8jAZ9kyeYC36cQTEeWTQiNa6PtF":{"amount":0.06, "subtractFee": false, "comment":"addr2"}, "st172la2wfu5npkcunze9ufx0n7d4p65xmv7ktqy0ethw4ynl838yr3esd3rmg933prnlga8uhn0p4gn8r9hm4wjj2yd0xzerrvp6zlmn5e9w2jdgns2nlmyd9vzue7p85c0kpxsxsnmerqk":{"amount":0.04, "memo":"spark1", "subtractFee": false}, "st18xeudagz5xnx8zqnprkzfug0we7q5twjfwl7dfsess93u74x49vqkkz5vf6h69eux0pd9qwfm84alwt9x04n3xjvgwtxc9kr7ws4rmkmm39lz7kgfehc9vvrr9m53ze98wgrxksvruh9r":{"amount":0.03, "memo":"spark2", "subtractFee": true}}'subtractFeeis set totrue, it's applied to the whole transaction fee, not just that specific output.Example:
firo-cli -regtest sendtoaddress '{"st172la2wfu5npkcunze9ufx0n7d4p65xmv7ktqy0ethw4ynl838yr3esd3rmg933prnlga8uhn0p4gn8r9hm4wjj2yd0xzerrvp6zlmn5e9w2jdgns2nlmyd9vzue7p85c0kpxsxsnmerqk":{"amount":0.2, "memo":"spark1", "subtractFee": false}, "st18xeudagz5xnx8zqnprkzfug0we7q5twjfwl7dfsess93u74x49vqkkz5vf6h69eux0pd9qwfm84alwt9x04n3xjvgwtxc9kr7ws4rmkmm39lz7kgfehc9vvrr9m53ze98wgrxksvruh9r":{"amount":0.1, "memo":"spark2", "subtractFee": true}}'
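The destination-object format shown in these examples can be validated generically. A Python sketch of the parsing rules (positive amount required, subtractFee optional defaulting to false, memo carried through for Spark destinations; illustrative, not the RPC parser):

```python
import json

# Validate the multi-destination JSON accepted by the unified sendtoaddress.
def parse_destinations(raw):
    parsed = {}
    for addr, p in json.loads(raw).items():
        if p.get("amount", 0) <= 0:
            raise ValueError(f"invalid amount for {addr}")
        parsed[addr] = {
            "amount": p["amount"],
            "subtractFee": p.get("subtractFee", False),
            "memo": p.get("memo"),
        }
    return parsed

dests = parse_destinations(
    '{"TY2Vz...": {"amount": 0.05},'
    ' "st172...": {"amount": 0.04, "memo": "spark1", "subtractFee": true}}'
)
```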