
feat: single file build #181

Merged
merged 24 commits on Mar 11, 2025

Conversation

zardoy
Owner

@zardoy zardoy commented Aug 18, 2024

Summary by CodeRabbit

  • New Features

    • Introduced a convenient single-file mode that provides a streamlined one-page version for simplified deployment.
    • Enhanced the panorama view with dynamic camera animations and a demo world for a richer, interactive experience.
    • Upgraded the sign editor with a modern text editing interface for improved usability.
    • Added a new workflow for building and bundling a single-file version of the project.
  • Refactor

    • Improved proxy configuration to enforce secure connections.
    • Optimized build and release processes to deliver multiple release artifacts for easier access.
    • Modified local storage handling to accommodate different environments based on the build mode.


codesandbox bot commented Aug 18, 2024

Review or Edit in CodeSandbox

Open the branch in Web Editor, VS Code, or VS Code Insiders

Open Preview

@zardoy zardoy changed the title from "One file build" to "feat: one file build" on Aug 18, 2024
@zardoy
Owner Author

zardoy commented Sep 8, 2024

/update

6 similar comments

@zardoy zardoy changed the title from "feat: one file build" to "feat: single file build" on Jan 25, 2025
@zardoy
Owner Author

zardoy commented Jan 25, 2025

/review

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 4 🔵🔵🔵🔵⚪
🧪 PR contains tests
🔒 No security concerns identified
⚡ Recommended focus areas for review

File Patching

The build patches a CSS file to remove TTF font URLs, which could break font loading. Validate that fonts still render correctly in single-file build mode.

const patchCssFile = 'node_modules/pixelarticons/fonts/pixelart-icons-font.css'
const text = fs.readFileSync(patchCssFile, 'utf8')
fs.writeFileSync(patchCssFile, text.replaceAll("url('pixelart-icons-font.ttf?t=1711815892278') format('truetype'),", ""), 'utf8')
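One fragility worth noting: the patch silently no-ops if the upstream CSS ever changes (for example when the `?t=` cache-busting timestamp is regenerated). A guard can fail the build loudly instead; the sketch below uses a hypothetical `stripTtfUrl` helper, not code from this PR:

```javascript
// Hypothetical helper: remove the .ttf source from the @font-face declaration,
// throwing if the expected snippet is no longer present in the CSS.
function stripTtfUrl(text) {
  const needle = "url('pixelart-icons-font.ttf?t=1711815892278') format('truetype'),"
  if (!text.includes(needle)) {
    throw new Error('pixelart-icons-font.css changed upstream; update the patch needle')
  }
  return text.replaceAll(needle, '')
}
```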
Data Processing

New processing logic transforms Minecraft recipe data, replacing numeric item/block ids with their names. Review the transformation carefully to ensure it handles all recipe formats correctly.

function processRecipes (current, prev, getData, version) {
  // can require the same multiple times per different versions
  if (current._proccessed) return
  const items = getData('items')
  const blocks = getData('blocks')
  const itemsIdsMap = Object.fromEntries(items.map((b) => [b.id, b.name]))
  const blocksIdsMap = Object.fromEntries(blocks.map((b) => [b.id, b.name]))
  for (const key of Object.keys(current)) {
    const mapId = (id) => {
      if (typeof id !== 'string' && typeof id !== 'number') throw new Error('Incorrect type')
      const mapped = itemsIdsMap[id] ?? blocksIdsMap[id]
      if (!mapped) {
        throw new Error(`No item/block name with id ${id}`)
      }
      return mapped
    }
    const processRecipe = (obj) => {
      const parseRecipeItem = (item) => {
        if (typeof item === 'number') return mapId(item)
        if (Array.isArray(item)) return [mapId(item[0]), ...item.slice(1)]
        if (!item) {
          return item
        }
        if ('id' in item) {
          item.id = mapId(item.id)
          return item
        }
        throw new Error('unhandled')
      }
      const maybeProccessShape = (shape) => {
        if (!shape) return
        for (const shapeRow of shape) {
          for (const [i, item] of shapeRow.entries()) {
            shapeRow[i] = parseRecipeItem(item)
          }
        }
      }
      if (obj.result) obj.result = parseRecipeItem(obj.result)
      maybeProccessShape(obj.inShape)
      maybeProccessShape(obj.outShape)
      if (obj.ingredients) {
        for (const [i, ingredient] of obj.ingredients.entries()) {
          obj.ingredients[i] = parseRecipeItem(ingredient)
        }
      }
    }
    try {
      const name = mapId(key)
      for (const [i, recipe] of current[key].entries()) {
        try {
          processRecipe(recipe)
        } catch (err) {
          console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
          delete current[key][i]
        }
      }
      current[name] = current[key]
    } catch (err) {
      console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
    }
    delete current[key]
  }
  current._proccessed = true
}
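To make the transformation concrete, here is a minimal self-contained sketch of the id→name mapping the function applies; the two-entry item and block tables are hypothetical stand-ins for `getData('items')` / `getData('blocks')`:

```javascript
// Hypothetical minimal data standing in for getData('items') / getData('blocks')
const items = [{ id: 1, name: 'stick' }]
const blocks = [{ id: 2, name: 'stone' }]
const itemsIdsMap = Object.fromEntries(items.map((b) => [b.id, b.name]))
const blocksIdsMap = Object.fromEntries(blocks.map((b) => [b.id, b.name]))

// Same lookup order as mapId above: items first, then blocks, else throw
function mapId(id) {
  if (typeof id !== 'string' && typeof id !== 'number') throw new Error('Incorrect type')
  const mapped = itemsIdsMap[id] ?? blocksIdsMap[id]
  if (!mapped) throw new Error(`No item/block name with id ${id}`)
  return mapped
}

// A recipe result object keeps its count/metadata; only the id is renamed
const result = { id: 1, count: 4, metadata: 0 }
result.id = mapId(result.id)
```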
Resource Loading

New demo world initialization code was added that loads and renders blocks. The block selection and rendering logic should be reviewed for correctness and performance.

const initDemoWorld = async () => {
  const version = '1.21.1'
  preloadAllMcData()
  console.time('load mc-data')
  await loadMinecraftData(version)
  console.timeEnd('load mc-data')
  if (miscUiState.gameLoaded) return
  console.time('load scene')
  const world = getSyncWorld(version)
  const PrismarineBlock = require('prismarine-block')
  const Block = PrismarineBlock(version)
  const fullBlocks = loadedData.blocksArray.filter(block => {
    // demo backdrop uses only full stained-glass blocks (wool/terracotta/leaves filters were tried and disabled)
    if (!block.name.includes('stained_glass')) return false
    const b = Block.fromStateId(block.defaultState, 0)
    if (b.shapes?.length !== 1) return false
    const shape = b.shapes[0]
    return shape[0] === 0 && shape[1] === 0 && shape[2] === 0 && shape[3] === 1 && shape[4] === 1 && shape[5] === 1
  })
  const Z = -15
  const sizeX = 100
  const sizeY = 100
  for (let x = -sizeX; x < sizeX; x++) {
    for (let y = -sizeY; y < sizeY; y++) {
      const block = fullBlocks[Math.floor(Math.random() * fullBlocks.length)]
      world.setBlockStateId(new Vec3(x, y, Z), block.defaultState)
    }
  }
  viewer.camera.updateProjectionMatrix()
  viewer.camera.position.set(0.5, sizeY / 2 + 0.5, 0.5)
  viewer.camera.rotation.set(0, 0, 0)
  const initPos = new Vec3(...viewer.camera.position.toArray())
  const worldView = new WorldDataEmitter(world, 2, initPos)
  // worldView.addWaitTime = 0
  await viewer.world.setVersion(version)
  viewer.connect(worldView)
  void worldView.init(initPos)
  await viewer.world.waitForChunksToRender()
  const abortController = new AbortController()
  // add small camera rotation to side on mouse move depending on absolute position of the cursor
  const { camera } = viewer
  const initX = camera.position.x
  const initY = camera.position.y
  let prevTwin: tweenJs.Tween<THREE.Vector3> | undefined
  document.body.addEventListener('pointermove', (e) => {
    if (e.pointerType !== 'mouse') return
    const pos = new THREE.Vector2(e.clientX, e.clientY)
    const SCALE = 0.2
    /* -0.5 - 0.5 */
    const xRel = pos.x / window.innerWidth - 0.5
    const yRel = -(pos.y / window.innerHeight - 0.5)
    prevTwin?.stop()
    const to = {
      x: initX + (xRel * SCALE),
      y: initY + (yRel * SCALE)
    }
    prevTwin = new tweenJs.Tween(camera.position).to(to, 0) // todo: duration should depend on the diff
    // prevTwin.easing(tweenJs.Easing.Exponential.InOut)
    prevTwin.start()
    camera.updateProjectionMatrix()
  }, {
    signal: abortController.signal
  })

  unloadPanoramaCallbacks.push(() => {
    abortController.abort()
  })

  console.timeEnd('load scene')
}
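The pointer-to-camera mapping in the listener above can be isolated as a pure function, which also makes the [-0.5, 0.5] normalization easy to verify; `cameraOffset` is a hypothetical helper name, not part of the PR:

```javascript
// Normalize the cursor to [-0.5, 0.5] per axis (y inverted, so moving the
// mouse up moves the camera up), then scale into a small world-space offset.
function cameraOffset(clientX, clientY, width, height, scale = 0.2) {
  const xRel = clientX / width - 0.5
  const yRel = -(clientY / height - 0.5)
  return { dx: xRel * scale, dy: yRel * scale }
}
```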


coderabbitai bot commented Mar 11, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

This pull request introduces a new GitHub Actions workflow for building a single-file version of the project and updates various configurations and scripts. Changes include adjustments to build scripts, environment-based conditional logic (using the SINGLE_FILE_BUILD variable), resource handling in HTML and worker initialization, improvements in data processing and recipe handling, and refinements within React components for editor integration and data loading.

Changes

Files/Path(s) Summary of Changes
.github/workflows/build-single-file.yml
.github/workflows/publish.yml
Added a new workflow to manually trigger a single-file build that checks out the repo, sets up Node.js and pnpm, installs dependencies, builds the file, moves it to minecraft.html, and uploads it as an artifact; publish workflow updated to include the single-file build step.
config.json
package.json
rsbuild.config.ts
renderer/rsbuildSharedConfig.ts
Updated proxy configuration (adding protocol in config.json); added new build scripts (build-single-file and prepare-project), modified release file attachments, and introduced conditional build settings based on SINGLE_FILE_BUILD.
index.html
renderer/viewer/lib/worldrendererCommon.ts
src/serviceWorker.ts
Removed favicon, meta, and manifest links from HTML; modified worker initialization to conditionally retrieve code from an HTML element when in single-file mode; updated service worker registration to consider single-file build conditions.
scripts/genLargeDataAliases.ts
scripts/makeOptimizedMcData.mjs
scripts/testOptimizedMcdata.ts
src/optimizeJson.ts
Added new functions for generating large data aliases and processing recipes; enhanced JSON data restoration with improved error handling and data mapping.
src/connect.ts
src/optionsStorage.ts
src/panorama.ts
src/devtools.ts
src/shims/minecraftData.ts
Renamed and reworked data loading functions (e.g. renaming to loadMinecraftData and using dynamic imports); adjusted local storage key based on environment; refined panorama and shimming logic.
src/react/MainMenuRenderApp.tsx
src/react/SignEditor.stories.tsx
src/react/SignEditor.tsx
src/react/SignEditorProvider.tsx
Updated version status logic and refactored SignEditor props to replace isWysiwyg with an explicitly provided ProseMirrorView; integrated asynchronous loading of the editor view in the provider.

Sequence Diagram(s)

sequenceDiagram
    participant U as User
    participant GH as GitHub Actions
    participant Repo as Repository
    participant Build as Build Environment
    participant Art as Artifact Storage

    U->>GH: Trigger workflow_dispatch
    GH->>Repo: Checkout repository
    GH->>Build: Setup Node.js (v22) & Install pnpm
    Build->>Repo: Install dependencies (pnpm install)
    Build->>Build: Execute build-single-file command<br>(build & move file to minecraft.html)
    Build->>Art: Upload minecraft.html as artifact
sequenceDiagram
    participant App as Application Code
    participant Env as Environment Variable (SINGLE_FILE_BUILD)
    participant HTML as HTML Element ("mesher-worker-code")
    participant Worker as Worker Instance

    App->>Env: Check if SINGLE_FILE_BUILD == true
    alt Single-file mode enabled
        App->>HTML: Retrieve worker code from element
        HTML->>App: Return inline worker script
        App->>App: Create Blob & generate URL
        App->>Worker: Instantiate Worker using Blob URL
    else Default mode
        App->>Worker: Instantiate Worker using predefined script path
    end
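The single-file branch of this flow can be sketched as follows; `inlineWorkerUrl` is a hypothetical helper name, and the `mesher-worker-code` element id comes from the walkthrough above:

```javascript
// Turn worker source text (e.g. the textContent of an embedded
// <script id="mesher-worker-code" type="text/plain"> element) into a
// same-origin URL that the Worker constructor accepts.
function inlineWorkerUrl(workerCode) {
  const blob = new Blob([workerCode], { type: 'text/javascript' })
  return URL.createObjectURL(blob)
}

// Browser usage (sketch):
// const code = document.getElementById('mesher-worker-code').textContent
// const worker = process.env.SINGLE_FILE_BUILD
//   ? new Worker(inlineWorkerUrl(code))
//   : new Worker('mesher.js') // assumed script path in the multi-file build
```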


Suggested labels

Review effort 4/5, Review effort 3/5

Poem

Hop along, my code-lined friend,
A workflow born with a bunny’s trend.
Single-file builds now hop on stage,
With blobs and scripts setting the gauge.
My whiskers twitch with joyful art –
A techy tale straight from my rabbit heart!
(_/), (•‿•), (づ。◕‿‿◕。)づ

Warning

There were issues while running some tools. Please review the errors and either fix the tool’s configuration or disable the tool if it’s a critical failure.

🔧 ESLint

If the error stems from missing dependencies, add them to the package.json file. For unrecoverable errors (e.g., due to private dependencies), disable the tool in the CodeRabbit configuration.

Scope: all 2 workspace projects
 ERR_PNPM_OPTIONAL_DEPS_REQUIRE_PROD_DEPS  Optional dependencies cannot be installed without production dependencies


📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between a8bca03 and 06b295a.

📒 Files selected for processing (1)
  • src/panorama.ts (3 hunks)

🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Generate unit testing code for this file.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai generate unit testing code for this file.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and generate unit testing code.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
    • @coderabbitai help me debug CodeRabbit configuration file.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR.
  • @coderabbitai resolve resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • Please see the configuration documentation for more information.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.

@zardoy
Owner Author

zardoy commented Mar 11, 2025

/deploy


Deployed to Vercel Preview: https://prismarine-q80bkayji-zaro.vercel.app
Playground
Storybook

@zardoy zardoy linked an issue Mar 11, 2025 that may be closed by this pull request

coderabbitai bot left a comment

Actionable comments posted: 5

🧹 Nitpick comments (19)
renderer/viewer/lib/worldrendererCommon.ts (1)

194-201: The worker initialization approach for single file builds looks good, but consider adding error handling

The implementation properly loads the worker script from an HTML element when in single file build mode, which is essential for embedding all code in a single file. However, there are non-null assertions that could be risky.

Consider adding error handling to gracefully handle missing elements:

 let worker: any
 if (process.env.SINGLE_FILE_BUILD) {
-  const workerCode = document.getElementById('mesher-worker-code')!.textContent!
+  const mesherElement = document.getElementById('mesher-worker-code')
+  if (!mesherElement || !mesherElement.textContent) {
+    console.error('Missing mesher-worker-code element or content')
+    throw new Error('Failed to initialize worker: missing mesher-worker-code element')
+  }
+  const workerCode = mesherElement.textContent
   const blob = new Blob([workerCode], { type: 'text/javascript' })
   worker = new Worker(window.URL.createObjectURL(blob))
 } else {
   worker = new Worker(src)
 }
scripts/genLargeDataAliases.ts (1)

30-51: Consider adding error handling to the decompression function

The decompression function lacks error handling, which could cause silent failures or difficult-to-debug issues if the input is invalid or if the decompression fails.

function decompressFromBase64(input) {
    console.time('decompressFromBase64')
    // Decode the Base64 string
+   try {
        const binaryString = atob(input);
        const len = binaryString.length;
        const bytes = new Uint8Array(len);

        // Convert the binary string to a byte array
        for (let i = 0; i < len; i++) {
            bytes[i] = binaryString.charCodeAt(i);
        }

        // Decompress the byte array
        const decompressedData = pako.inflate(bytes, { to: 'string' });

        console.timeEnd('decompressFromBase64')
        return decompressedData;
+   } catch (error) {
+       console.error('Error during decompression:', error);
+       console.timeEnd('decompressFromBase64')
+       throw new Error(`Failed to decompress data: ${error.message}`);
+   }
}
.github/workflows/build-single-file.yml (1)

1-32: Well-structured GitHub Action workflow

The workflow is clean, follows best practices, and handles the complete process from building to artifact upload. Note that the workflow uses checkout@master, which might be less predictable than using a specific version.

Consider using a specific version of actions/checkout:

- uses: actions/checkout@master
+ uses: actions/checkout@v4

This provides better version control and predictability for your CI/CD pipeline.

src/react/SignEditorProvider.tsx (1)

25-25: State initialization with null as any could be improved.

Using null as any for state initialization bypasses TypeScript's type safety. Consider defining a proper type or using a more specific type assertion.

-  const [proseMirrorView, setProseMirrorView] = useState(null as any)
+  const [proseMirrorView, setProseMirrorView] = useState<typeof import('./prosemirror-markdown').ProseMirrorView | null>(null)
src/connect.ts (1)

57-62: Consider avoiding global assignments.
Though assigning mcData, PrismarineBlock, PrismarineItem, and pathfinder to the window object can be convenient, it may introduce potential conflicts with other scripts or libraries. Explore namespaced objects or dedicated singletons to contain these references.

src/shims/minecraftData.ts (1)

26-26: Resolve data loading concurrency.
Calling await importLargeData('mcData') and immediately resolving it is fine, but confirm that parallel calls won't interfere with the shared optimizedDataResolver.

rsbuild.config.ts (5)

18-18: Ensure environment variable reliability.
const SINGLE_FILE_BUILD = process.env.SINGLE_FILE_BUILD === 'true' is straightforward, but confirm that your build environment reliably sets SINGLE_FILE_BUILD and won't introduce false positives.


20-24: Implicit override of font references.
Patching the Pixelart Icons font CSS is a creative solution, but be aware this might break if new versions of the font or file references change. You may want to handle this gracefully or detect the correct snippet to replace.


65-104: Refactor repetitive HTML tag injection logic.
Injecting tags and meta tags conditionally can become bloated. Consider a function or template approach to encapsulate the logic for toggling single-file vs. multi-file references.


114-120: Evaluate potential performance impact of single-file build outputs.
Inlining all scripts, styles, and large assets (dataUriLimit) can create huge files, leading to long initial load times. Monitor final bundle size and consider lazy loading or chunking for large assets.
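As a sketch of what such a configuration can look like (option names follow rsbuild's `output` config; treat them as assumptions to verify against your rsbuild version, and the 4096 fallback is an arbitrary illustrative value):

```typescript
// rsbuild.config.ts (sketch): inline everything only for the single-file build
const SINGLE_FILE_BUILD = process.env.SINGLE_FILE_BUILD === 'true'

export default {
  output: {
    inlineScripts: SINGLE_FILE_BUILD, // emit <script> bodies instead of .js files
    inlineStyles: SINGLE_FILE_BUILD,  // emit <style> bodies instead of .css files
    // raise the data-URI threshold so images/fonts are embedded as base64
    dataUriLimit: SINGLE_FILE_BUILD ? Number.MAX_SAFE_INTEGER : 4096,
  },
}
```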


205-220: Post-build single-file transformation.
Replacing favicon.png and loading-bg.jpg with inlined data is fine, but watch out for large size expansions. Also confirm that mesher.js is appended properly and that no script collisions occur.

src/panorama.ts (3)

15-15: Consider dynamic importing of loadMinecraftData.

If single-file size is a concern, you could dynamically import loadMinecraftData inside initDemoWorld() to reduce initial load time.


149-154: Wrap callback unloading with error handling.

One failing callback could block the others. Surround each callback with a try/catch block to improve robustness.

  for (const unloadPanoramaCallback of unloadPanoramaCallbacks) {
-    unloadPanoramaCallback()
+    try {
+      unloadPanoramaCallback()
+    } catch (err) {
+      console.error('Failed to unload panorama resource:', err)
+    }
  }
  unloadPanoramaCallbacks = []

163-232: Potential heavy loop and large data usage in initDemoWorld.

Filling a 200×200 area with random blocks could impact performance in the browser. Consider chunk-based loading or limiting the size for better responsiveness. Also verify references like loadedData to prevent runtime errors if it’s not loaded in single-file mode.

scripts/makeOptimizedMcData.mjs (1)

46-58: Turn hard-coded compression option into an environment-based toggle.

Forcing compressedOutput = true might hamper troubleshooting. Consider using !!process.env.SINGLE_FILE_BUILD or a config flag instead of a permanent boolean.

- const compressedOutput = true
+ const compressedOutput = !!process.env.SINGLE_FILE_BUILD
src/optimizeJson.ts (4)

21-31: Consider normalizing the _proccessed key name.

The current key is spelled as _proccessed, which may be a spelling mistake and could cause confusion. You might rename it to _processed to maintain clarity.

-      if (key === '_proccessed') {
+      if (key === '_processed') {

32-39: Validate the error message for mapId.

While the mapping logic is sound, consider refining the error message to clarify that the unmatched ID is actually a name or numeric ID. This can help troubleshoot mapping issues more quickly.


106-115: Use more specific parameter types for restoreMinecraftData.

All parameters are typed as any and string, which can reduce type safety. Consider introducing interfaces or stricter types to improve maintainability and catch potential data shape issues earlier.


258-258: Document the new dataRestorer parameter.

This new optional parameter is helpful for post-processing the restored data. Add JSDoc or inline comments to clarify its purpose for future maintainers.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f9c042b and a8bca03.

⛔ Files ignored due to path filters (1)
  • pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (24)
  • .github/workflows/build-single-file.yml (1 hunks)
  • .github/workflows/publish.yml (1 hunks)
  • config.json (1 hunks)
  • index.html (0 hunks)
  • package.json (3 hunks)
  • renderer/rsbuildSharedConfig.ts (2 hunks)
  • renderer/viewer/lib/worldrendererCommon.ts (1 hunks)
  • rsbuild.config.ts (6 hunks)
  • scripts/genLargeDataAliases.ts (1 hunks)
  • scripts/genShims.ts (1 hunks)
  • scripts/makeOptimizedMcData.mjs (6 hunks)
  • scripts/testOptimizedMcdata.ts (3 hunks)
  • src/connect.ts (3 hunks)
  • src/devtools.ts (1 hunks)
  • src/index.ts (5 hunks)
  • src/optimizeJson.ts (4 hunks)
  • src/optionsStorage.ts (2 hunks)
  • src/panorama.ts (4 hunks)
  • src/react/MainMenuRenderApp.tsx (1 hunks)
  • src/react/SignEditor.stories.tsx (2 hunks)
  • src/react/SignEditor.tsx (3 hunks)
  • src/react/SignEditorProvider.tsx (2 hunks)
  • src/serviceWorker.ts (1 hunks)
  • src/shims/minecraftData.ts (3 hunks)
💤 Files with no reviewable changes (1)
  • index.html
🧰 Additional context used
🪛 Biome (1.9.4)
src/optimizeJson.ts

[error] 93-93: The catch clause that only rethrows the original error is useless.

An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.

(lint/complexity/noUselessCatch)


[error] 99-99: The catch clause that only rethrows the original error is useless.

An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.

(lint/complexity/noUselessCatch)

⏰ Context from checks skipped due to timeout of 90000ms (1)
  • GitHub Check: build-and-deploy
🔇 Additional comments (46)
config.json (1)

4-4: Improvement to proxy URL specification

The change from a hostname to a full URL with protocol is a good practice as it removes ambiguity about which protocol to use. This is particularly important for a single file build where the application may need to make cross-origin requests, and using HTTPS is a security best practice.

.github/workflows/publish.yml (1)

56-57: Added build step for single-file version

This step properly implements the single-file build process, generating a standalone HTML file named minecraft.html. This aligns with the PR objective and creates an artifact that can be used independently of the rest of the distribution.

Since this is a new build process, it would be good to verify that all necessary assets are included in the single file:

#!/bin/bash
# Check if the build-single-file script exists in package.json
jq '.scripts["build-single-file"]' package.json

# Look for SINGLE_FILE_BUILD environment variable usage
rg "SINGLE_FILE_BUILD" --type ts
src/react/SignEditor.stories.tsx (1)

2-2: Updated SignEditor stories to use ProseMirrorView directly

The code has been refactored to pass the ProseMirrorView component directly instead of using the isWysiwyg flag. This change improves type safety and makes the component API more explicit about its dependencies.

Make sure that this change is consistent with the implementation in the SignEditor component:

#!/bin/bash
# Check the SignEditor component props to ensure they match
cat src/react/SignEditor.tsx | rg "interface Props|type Props"

Also applies to: 23-23

scripts/genShims.ts (1)

18-18: LGTM: Clean code formatting

This additional whitespace improves readability by separating the import/initialization logic from the resource processing logic.

src/react/MainMenuRenderApp.tsx (1)

90-92: LGTM: Proper handling for single file build mode

The code correctly adds a new condition to check for the single file build mode environment variable, while preserving the existing logic for development and production modes.

src/serviceWorker.ts (1)

5-5: LGTM: Properly disables service worker for single file builds

The change correctly prevents service worker registration when in single file build mode, which is appropriate since service workers typically handle caching and offline functionality that may not be needed in this context.
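The gating can be expressed as a pure predicate, which keeps the decision testable; `shouldRegisterServiceWorker` is a hypothetical helper, not the actual code in src/serviceWorker.ts:

```javascript
// Skip registration for the single-file build (no separate asset graph to
// cache) and outside production builds.
function shouldRegisterServiceWorker(env) {
  return !env.SINGLE_FILE_BUILD && env.NODE_ENV === 'production'
}
```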

scripts/genLargeDataAliases.ts (2)

3-28: Clean implementation of data compression strategy

This function elegantly handles switching between compressed and raw data sources based on the build configuration. The approach allows for flexibility while minimizing the final bundle size when compression is enabled.


53-59: Good standalone execution support

The self-executing logic is well-implemented, making the script usable both as an imported module and as a standalone command-line tool.

scripts/testOptimizedMcdata.ts (3)

2-2: Good API refactoring to use specialized function

The change from using a generic JsonOptimizer.restoreData to the more specialized restoreMinecraftData improves code semantics.


11-11:

✅ Verification successful

Verification needed for the updated data restoration logic

The implementation now uses restoreMinecraftData instead of the previous approach, but the function signature and behavior might be different.


🏁 Script executed:

#!/bin/bash
# Verify that restoreMinecraftData is correctly implemented

# Look for the restoreMinecraftData implementation
echo "Checking restoreMinecraftData implementation:"
rg -A 10 "export const restoreMinecraftData" src/

Length of output: 799


Verification Passed – Data Restoration Logic is Correct

  • Verified that restoreMinecraftData is implemented in src/optimizeJson.ts with the expected signature.
  • The function accepts (allVersionData: any, type: string, version: string) and includes logic to handle recipes type specifically.
  • The call in scripts/testOptimizedMcdata.ts using structuredClone(json) matches the implementation requirements.

95-96:

✅ Verification successful

Inconsistent parameter passing for restoreData

Both calls to JsonOptimizer.restoreData now pass undefined as the third parameter. Make sure this change is consistent with how the function is defined.


🏁 Script executed:

#!/bin/bash
# Check the signature of restoreData in JsonOptimizer

# Find the definition of restoreData
echo "Looking for restoreData implementation:"
rg -A 5 "restoreData" src/optimizeJson.ts

Length of output: 1000


Confirmation: Parameter Passing for JsonOptimizer.restoreData is Consistent

After reviewing the implementation in src/optimizeJson.ts, it's clear that the restoreData function is defined to accept a third parameter—dataRestorer—as an optional callback (((data) => void) | undefined). In the test file scripts/testOptimizedMcdata.ts, passing undefined for this parameter on lines 95-96 is fully consistent with the function's signature.

  • The definition in src/optimizeJson.ts confirms the optional nature of the third parameter.
  • Both test calls passing undefined align with the expected usage.

There’s no inconsistency here, so no changes are needed.

renderer/rsbuildSharedConfig.ts (3)

81-86: Good resource path handling for Minecraft data files

The conditional path handling for Minecraft data files ensures that the correct resources are loaded based on the actual file paths, improving compatibility and preventing potential issues.


101-104: Added support for .txt files as asset/source

This change ensures text files are properly handled in the build process, consistent with how other non-JavaScript resources are handled.


110-112:

❓ Verification inconclusive

Key change for single-file build: eager dynamic imports

Setting dynamicImportMode to 'eager' when SINGLE_FILE_BUILD is true ensures that all dynamic imports are resolved at build time rather than runtime, which is essential for creating a self-contained single file. This is a critical part of implementing the single-file build feature.
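For reference, a hedged sketch of what such a toggle might look like in an rsbuild config (the names and nesting are assumptions based on this review, not the PR's exact code):

```typescript
// Hypothetical config fragment, assuming rsbuild forwards `module.parser`
// options to rspack; verify against the project's actual rsbuildSharedConfig.ts.
const singleFileBuild = process.env.SINGLE_FILE_BUILD === 'true'

export default {
  tools: {
    rspack: {
      module: {
        parser: {
          javascript: {
            // 'eager' resolves every dynamic import() at build time and
            // bundles the target into the importing chunk; 'lazy' keeps
            // the default split-chunk behavior.
            dynamicImportMode: singleFileBuild ? 'eager' : 'lazy',
          },
        },
      },
    },
  },
}
```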

Check that dynamicImportMode is properly documented and compatible with your version of rspack:


🏁 Script executed:

#!/bin/bash
# Check rspack version and compatibility with dynamicImportMode='eager'

# Get the rspack version
echo "Checking @rspack/core version:"
grep "@rspack/core" package.json

Length of output: 103


Action Required: Verify @rspack/core dependency and dynamicImportMode compatibility

The execution of the version-check script returned no output, which suggests that the dependency for @rspack/core may not be explicitly declared in your package.json (or it’s referenced differently). Please confirm that:

  • The package (or an equivalent rspack dependency) is correctly declared in your package.json and supports the dynamicImportMode: 'eager' option.
  • The rspack documentation has been reviewed to ensure that enabling eager dynamic imports is compatible with your current version/configuration.

Manually verify the dependency setup and consult the rspack docs for confirmation.

src/react/SignEditorProvider.tsx (2)

77-81: Good implementation of conditional dynamic import.

The conditional import approach correctly handles the single file build scenario. When not in a single file build, it dynamically imports the ProseMirrorView component only when needed.
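As a minimal sketch of the pattern (the component name and import path in the PR may differ), the conditional import can be factored into a small helper:

```typescript
type Loader<T> = () => Promise<{ default: T }>

// Resolve a lazily imported module, or skip it entirely in single-file
// mode so the extra chunk is never requested at runtime.
async function maybeLoad<T>(singleFileBuild: boolean, load: Loader<T>): Promise<T | undefined> {
  if (singleFileBuild) return undefined
  const mod = await load()
  return mod.default
}

// Usage sketch (the path is an assumption for illustration):
// const ProseMirrorView = await maybeLoad(!!process.env.SINGLE_FILE_BUILD_MODE, () => import('./prosemirror-view'))
```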


85-85: Component prop passed correctly based on condition.

The conditional passing of ProseMirrorView to the SignEditor component is well implemented, maintaining the intended functionality while adapting for the single file build.

src/optionsStorage.ts (3)

166-167: Good implementation of environment-dependent storage key.

Using a different localStorage key for the single file build prevents conflicts when opening HTML files locally in the browser, which is a thoughtful consideration for different deployment scenarios.
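A sketch of the key-selection idea, with assumed key names (the PR's literal strings may differ):

```typescript
// When an HTML file is opened via file://, local files can share one
// localStorage scope, so a distinct key keeps the single-file build from
// clobbering options saved by the hosted app.
const getOptionsStorageKey = (singleFileBuild: boolean): string =>
  singleFileBuild ? 'options-single-file' : 'options'

const localStorageKey = getOptionsStorageKey(process.env.SINGLE_FILE_BUILD === 'true')
```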


171-171: Options correctly migrated using the environment-dependent storage key.

The migration function now correctly uses the dynamic localStorageKey to retrieve stored options.


190-190: Consistent use of storage key for saving options.

The code correctly uses the same dynamic localStorageKey when saving options back to localStorage, ensuring consistency in how data is persisted.

src/index.ts (5)

90-90: Imports updated to match renamed function.

The import statement correctly references the renamed loadMinecraftData function from the ./connect module, maintaining the functionality while adapting for the single file build.


114-114: New data loading approach using importLargeData.

The importLargeData function has been introduced to handle large data files, which is likely part of the single file build strategy for managing resource size.


417-417: Function call updated to match renamed function.

The code now correctly calls the renamed loadMinecraftData function instead of the previous downloadMcDataOnConnect.


433-433: Blockstates models loading updated to use importLargeData.

The approach for loading blockstates models has been updated to use the new importLargeData function instead of direct import, which is consistent with the single file build strategy.


875-875: Status clearing after progress completion.

The code now clears the loading screen status after the progress is completed, which improves the user experience by not leaving stale status messages.

src/react/SignEditor.tsx (3)

4-4: Type import updated correctly.

The import statement has been correctly updated to be a type-only import for ProseMirrorView, which is appropriate since it's now used only for type checking purposes in this file.


15-15: Props interface updated to use ProseMirrorView type.

The Props interface has been updated to replace the boolean isWysiwyg property with the ProseMirrorView component type, which provides better type safety and a more direct representation of what's needed.


25-26: Component refactored to derive WYSIWYG state from props.

The component now correctly derives the WYSIWYG state from the presence of the ProseMirrorView prop rather than having it passed as a separate boolean. This is a good refactoring that reduces prop redundancy.
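The shape of that refactor can be sketched as follows (prop and type names assumed):

```typescript
// Before: Props carried both `isWysiwyg: boolean` and the editor component,
// giving two sources of truth. After: the component prop is optional and
// the flag is derived from its presence.
interface Props<View = unknown> {
  ProseMirrorView?: View
}

const isWysiwyg = (props: Props): boolean => props.ProseMirrorView !== undefined
```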

src/connect.ts (4)

8-11: Approve new imports for local modules.
These lines cleanly integrate references to local modules (globalState, optionsStorage, and botSoundSystem) and appear consistent.


3-7:

❓ Verification inconclusive

Verify imports' existence and consistency.
These new imports (MinecraftData, PrismarineBlock, PrismarineItem, pathfinder, importLargeData) look correct, but please ensure that all referenced dependencies are properly installed and that the bundler or runtime can resolve them without issues.

Run the following script to verify install references:


🏁 Script executed:

#!/bin/bash
# Verify that the newly imported packages exist in package.json
rg "minecraft-data|prismarine-block|prismarine-item|mineflayer-pathfinder|large-data-aliases" package.json -A 5

Length of output: 1266


Imports Verification: Dependency Entries Exist & Local Alias File Check Needed

  • The external dependencies (minecraft-data, prismarine-block, prismarine-item, and mineflayer-pathfinder) are correctly referenced in the project's package.json.
  • The local import (importLargeData from ../generated/large-data-aliases) should be confirmed by ensuring that the corresponding generated file exists in the repository.

Please verify that the generated file for large-data-aliases is present and correctly referenced in your build configuration.


64-66:

✅ Verification successful

Confirm viewer scope.
This code references viewer.world but doesn't import or define viewer within this file. If it's a global variable, ensure there's no risk of race conditions or undefined references.


🏁 Script executed:

#!/bin/bash
# Search for the definition of `viewer` to ensure it's declared before use
rg "(\s|^)viewer(\s|=|.)"

Length of output: 21720


Global "viewer" confirmed and initialization order verified.
We found that viewer is declared as a global in src/globals.d.ts and instantiated in src/index.ts, which is consistent with its usage in src/connect.ts. As long as the global instantiation completes before this file’s execution, there isn’t a risk of race conditions or undefined references.


47-48:

❓ Verification inconclusive

Check updated function signature usage.
Renaming downloadMcDataOnConnect to loadMinecraftData with an additional importBlockstatesModels parameter is a breaking change. Ensure that all invocations have been updated across the codebase.


🏁 Script executed:

#!/bin/bash
# Search all references to loadMinecraftData for correct parameter usage
rg "loadMinecraftData"

Length of output: 453


Action: Verify Consistency of New loadMinecraftData Usage

The function has been renamed and now accepts an additional optional parameter (importBlockstatesModels) with a default value of false. The recent scan confirms the following:

  • src/panorama.ts: The invocation await loadMinecraftData(version, true) explicitly sets the new parameter.
  • src/index.ts: The invocation await loadMinecraftData(version) relies on the default value (false).

Please confirm that relying on the default (false) in src/index.ts is intentional: if the previous implementation always loaded blockstates models on connect, silently falling back to the default would change behavior.
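A hedged stub mirroring the two call sites above (the real function's body and return type are of course different):

```typescript
interface LoadSummary { version: string; blockstatesModels: boolean }

// src/index.ts calls loadMinecraftData(version) and relies on the default;
// src/panorama.ts calls loadMinecraftData(version, true) explicitly.
async function loadMinecraftData(version: string, importBlockstatesModels = false): Promise<LoadSummary> {
  // ...the real implementation loads minecraft-data here, and optionally
  // the blockstates/models needed for the panorama's demo world...
  return { version, blockstatesModels: importBlockstatesModels }
}
```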

src/shims/minecraftData.ts (2)

57-57: Validate restored data structure.
restoreMinecraftData(optimizedDataResolver.resolvedData, dataType, version) must ensure it merges or replaces the data in a type-safe manner. Verify no silent type mismatches occur.


2-5:

❓ Verification inconclusive

Check newly introduced data-loading imports.
Imports for restoreMinecraftData and importLargeData are newly introduced. Confirm that these modules are exported under the correct paths and that compatibility aligns with your bundler configuration.


🏁 Script executed:

#!/bin/bash
# Verify that restoreMinecraftData & importLargeData are available in the indicated modules
rg "export function restoreMinecraftData" src/optimizeJson.ts
rg "export function importLargeData" generated/large-data-aliases.*

Length of output: 206


Verify Data-Loading Import Paths
The export for restoreMinecraftData in src/optimizeJson.ts appears to be in order. However, the search for importLargeData in the expected module at ../../generated/large-data-aliases returned a “No such file or directory” error. Please manually verify that the file exists (and is correctly named and generated) with the proper export for importLargeData, and that the import path aligns with your bundler’s configuration. If the file is located elsewhere or generated under a different name, update the import accordingly.

rsbuild.config.ts (6)

57-59: Double-check config source fallback and favicon path.
configSource toggles based on SINGLE_FILE_BUILD. Verify that 'BUNDLED' config usage meets your runtime requirements. Also, confirm 'favicon.png' is available in all builds.


132-134: Check for environment variable collisions.
Including SINGLE_FILE_BUILD and SINGLE_FILE_BUILD_MODE in the global environment is okay, but be aware these might collide or be overwritten if multiple scripts set them.


158-158: Build plugin 'setup' approach looks good.
The approach for hooking into onBeforeBuild and onAfterBuild is consistent with typical plugin flows.


162-170: Check data optimization scripts.
Logic for checking existence of minecraft-data-optimized.json and mc-assets-compressed.js is sound. Ensure that genLargeDataAliases(SINGLE_FILE_BUILD) matches the intended optimization workflow.


14-16:

✅ Verification successful

Confirm new dependencies and usage.
genLargeDataAliases, sharp, and supportedVersions imports are new. Verify that they are installed, compatible with your environment (especially sharp on non-native runtimes), and properly referenced.


🏁 Script executed:

#!/bin/bash
# Check package.json or lock file for these dependencies
rg "genLargeDataAliases|sharp|supportedVersions" package.json

Length of output: 214


Dependency Verification Confirmation

  • Confirmed that sharp is correctly declared in package.json (version ^0.33.5).
  • Noted that genLargeDataAliases is referenced in the prepare-project script, confirming its expected inclusion as a local utility.
  • Verified that supportedVersions is a local module imported from ./src/supportedVersions.mjs, so no external dependency declaration is needed.
  • Please ensure that the sharp usage is validated on non-native runtimes if applicable.

221-231:

❓ Verification inconclusive

Validate service worker generation logic.
When not in single-file mode, the generateSW call should produce a valid service-worker. Double-check that additionalManifestEntries and caching logic won't conflict with your single-file approach.


🏁 Script executed:

#!/bin/bash
# Inspect service-worker generation logic
rg "workbox-build|generateSW" . -A 10

Length of output: 4123


Confirm Service Worker Generation for Multi-File vs. Single-File Builds
The current implementation uses Workbox’s generateSW only when service workers are enabled (i.e. when disableServiceWorker is false). Please verify that:

• In multi-file builds, the generated service worker (with options like skipWaiting and clientsClaim) is fully functional.
• The manifest entries returned by getSwAdditionalEntries() and the empty globPatterns setting appropriately cover your caching needs without conflicting with your single-file build process.

If the single-file build should disable or adjust service worker logic, consider gating this call accordingly.

src/panorama.ts (3)

8-11: Imports look good.

These imports are properly utilized later on (e.g., tweenJs in pointermove logic, getSyncWorld for the demo world). No immediate issues spotted.


73-73: Callback array structure looks fine.

Declaring this array for unloading resources is a neat approach. No issues found.


79-84: Verify whether you need a fallback panorama.

Returning early if SINGLE_FILE_BUILD_MODE is set and the resource pack is unavailable means no panorama is loaded. If that’s intentional, it’s fine. Otherwise, consider a fallback.

 if (process.env.SINGLE_FILE_BUILD_MODE && !panoramaUsesResourcePack) {
   void initDemoWorld()
   return
+  // If you'd like a simple fallback, remove the `return` statement 
+  // and load a default panorama here instead.
 }
package.json (3)

13-14: Preparation and check-build refactor looks good.

Consolidating generation scripts into prepare-project before type checks is a clean approach.


41-41: Confirm release artifact glob support.

If your release tooling doesn’t expand globs, attaching multiple files by name might be necessary. Verify that {self-host.zip,minecraft.html} is accepted.


176-176: Adding “sharp” is a useful choice.

No issues found with introducing “sharp” as an optional dependency for image processing tasks.

src/optimizeJson.ts (1)

306-311: Confirm the side effects of dataRestorer usage.

Calling dataRestorer?.(data) after constructing data implies a side effect on it or external components. Ensure that any mutation is intentional and that the return value does not need to be reassigned to data.
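The concern can be illustrated with a small sketch (signature assumed from this review):

```typescript
// The restorer's return value is ignored, so a restorer must mutate `data`
// in place; one that builds and returns a new object is silently lost.
function restoreData(data: Record<string, any>, dataRestorer?: (data: any) => void) {
  dataRestorer?.(data)
  return data
}
```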

@@ -3,7 +3,9 @@
import fs from 'fs'
import { WorldRendererThree } from 'renderer/viewer/lib/worldrendererThree'
import { enable, disable, enabled } from 'debug'
import { Vec3 } from 'vec3'

💡 Verification agent

🧩 Analysis chain

Exposing Vec3 to the global window object

Adding Vec3 to the global window object allows access from the browser console and other scripts without explicit imports. While this is useful for debugging, exposing library objects globally is generally not recommended for production code.

Also applies to: 8-8


🏁 Script executed:

#!/bin/bash
# Check if Vec3 is being used directly in other parts of the codebase
# that might now rely on global access instead of imports

# Look for imports of Vec3 that might have been removed
rg "import.*Vec3.*from ['\"]vec3['\"]" --type ts

# Look for usage of Vec3 without imports
rg "Vec3" --type ts | grep -v "import.*Vec3"

Length of output: 15329


Action: Remove or Gate Global Exposure of Vec3

The verification confirms that while almost every module continues to import Vec3 explicitly, the devtools (and an instance in the mesher module) still expose Vec3 on the global object (e.g., via window.Vec3 = Vec3 and globalThis.Vec3 = Vec3). This global assignment is useful for debugging but can lead to unintended use in production.

  • Files to review:
    • src/devtools.ts (line 8 shows window.Vec3 = Vec3)
    • renderer/playground/mesher/mesher.ts (global exposure via globalThis.Vec3 = Vec3)

Recommendation:
Either remove these global assignments or wrap them in a check (e.g., verifying that the environment is not production) so that the debugging convenience does not affect production stability.

Comment on lines +153 to +229
function processRecipes (current, prev, getData, version) {
// can require the same multiple times per different versions
if (current._proccessed) return
const items = getData('items')
const blocks = getData('blocks')
const itemsIdsMap = Object.fromEntries(items.map((b) => [b.id, b.name]))
const blocksIdsMap = Object.fromEntries(blocks.map((b) => [b.id, b.name]))
for (const key of Object.keys(current)) {
const mapId = (id) => {
if (typeof id !== 'string' && typeof id !== 'number') throw new Error('Incorrect type')
const mapped = itemsIdsMap[id] ?? blocksIdsMap[id]
if (!mapped) {
throw new Error(`No item/block name with id ${id}`)
}
return mapped
}
const processRecipe = (obj) => {
// if (!obj) return
// if (Array.isArray(obj)) {
// obj.forEach((id, i) => {
// obj[i] = mapId(obj[id])
// })
// } else if (obj && typeof obj === 'object') {
// if (!'count metadata id'.split(' ').every(x => x in obj)) {
// throw new Error(`process error: Unknown deep object pattern: ${JSON.stringify(obj)}`)
// }
// obj.id = mapId(obj.id)
// } else {
// throw new Error('unknown type')
// }
const parseRecipeItem = (item) => {
if (typeof item === 'number') return mapId(item)
if (Array.isArray(item)) return [mapId(item), ...item.slice(1)]
if (!item) {
return item
}
if ('id' in item) {
item.id = mapId(item.id)
return item
}
throw new Error('unhandled')
}
const maybeProccessShape = (shape) => {
if (!shape) return
for (const shapeRow of shape) {
for (const [i, item] of shapeRow.entries()) {
shapeRow[i] = parseRecipeItem(item)
}
}
}
if (obj.result) obj.result = parseRecipeItem(obj.result)
maybeProccessShape(obj.inShape)
maybeProccessShape(obj.outShape)
if (obj.ingredients) {
for (const [i, ingredient] of obj.ingredients.entries()) {
obj.ingredients[i] = parseRecipeItem(ingredient)
}
}
}
try {
const name = mapId(key)
for (const [i, recipe] of current[key].entries()) {
try {
processRecipe(recipe)
} catch (err) {
console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
delete current[i]
}
}
current[name] = current[key]
} catch (err) {
console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
}
delete current[key]
}
current._proccessed = true
}

⚠️ Potential issue

Fix incorrect recipe parsing logic.

There are two critical issues:

  1. Mapping array items uses mapId(item) instead of mapId(item[0]).
  2. Deleting an entry in the loop calls delete current[i] instead of delete current[key][i].
- if (Array.isArray(item)) return [mapId(item), ...item.slice(1)]
+ if (Array.isArray(item)) return [mapId(item[0]), ...item.slice(1)]

...

- delete current[i]
+ delete current[key][i]

@@ -9,7 +9,9 @@
"start2": "run-p dev-rsbuild watch-mesher",
"build": "pnpm build-other-workers && rsbuild build",
"build-analyze": "BUNDLE_ANALYZE=true rsbuild build && pnpm build-other-workers",
"check-build": "tsx scripts/genShims.ts && tsc && pnpm build",
"build-single-file": "SINGLE_FILE_BUILD=true rsbuild build",

⚠️ Potential issue

Fix environment variable mismatch.

Your code checks SINGLE_FILE_BUILD_MODE, but the script sets SINGLE_FILE_BUILD. This mismatch will prevent logic from running as intended.

- "build-single-file": "SINGLE_FILE_BUILD=true rsbuild build",
+ "build-single-file": "SINGLE_FILE_BUILD_MODE=true rsbuild build",

Comment on lines +97 to +100
} catch (err) {
// console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
throw err
}

🛠️ Refactor suggestion

Remove or refine the outer try/catch that only rethrows.

Similarly, this outer catch block does not provide additional handling. Either add unique handling logic or remove it.

     } catch (err) {
       // console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
-      throw err
+      // Log or remove this block if not needed
     }

Committable suggestion skipped: line range outside the PR's diff.

🧰 Tools
🪛 Biome (1.9.4)

[error] 99-99: The catch clause that only rethrows the original error is useless.

An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.

(lint/complexity/noUselessCatch)

Comment on lines +88 to +93
try {
processRecipe(recipe)
} catch (err) {
// console.warn(`${version} [warn] Removing incorrect recipe: ${err}`)
// delete current[i]
throw err

🛠️ Refactor suggestion

Remove or refine the inner try/catch that only rethrows.

This catch block merely rethrows the error, making it a “no-useless-catch” case. If you want to log or handle the error specifically, add relevant logic; otherwise, consider removing the catch for cleaner code.

       try {
         processRecipe(recipe)
       } catch (err) {
-        throw err
+        // Log the error or remove the catch block
       }

Committable suggestion skipped: line range outside the PR's diff.

🧰 Tools
🪛 Biome (1.9.4)

[error] 93-93: The catch clause that only rethrows the original error is useless.

An unnecessary catch clause can be confusing.
Unsafe fix: Remove the try/catch clause.

(lint/complexity/noUselessCatch)

@zardoy zardoy merged commit b8b1320 into next Mar 11, 2025
2 of 3 checks passed
Development

Successfully merging this pull request may close these issues.

as I am typing this, my world is still loading chunks
1 participant