
feat(server): Import or Export Project Functionality (File resource) #1151

Open · wants to merge 2 commits into main from feat/project-import-export-file-resource
Conversation

@hexaforce (Contributor) commented Sep 20, 2024

Overview

Changed the export from the previous JSON-only output to a ZIP file download that also includes file resources.

What I've done

  1. Combine JSON files and file resources into a ZIP file for storage.
    Contents of the ZIP file:
xxxx.reearth(Zip)
├── assets
│   ├── [project img]
│   ├── [nlsLayer file resources]
│   ├── [widget button icon]
│   ├── [page block src]
│   └── ...
├── plugins
│   └── [Plugin ID]
│       ├── [Plugin file]
│       └── ...
└── project.json

*project.json is the JSON file produced by the previous export.

  2. Add an export domain for ZIP file storage.
    *Files are deleted as soon as they have been downloaded.

  3. Change the response from the JSON data itself to the download path of the ZIP file.

  4. Add processing to save file resources.
    File names are updated during import.

  5. Modify the response upon import to return the database data for verification.

  6. Refactor part of the previous code.
    ExportProject() previously fetched Scene and Plugin data inline; ExportScene() and ExportPlugins() now separate that processing.

What I haven't done

Permission checks, such as access rights for each operation, are not yet sufficient.

How I tested

Which point I want you to review particularly

Memo

Summary by CodeRabbit

Release Notes

  • New Features

    • Introduced functionality for exporting and importing projects as ZIP files.
    • Added GraphQL mutations for project export and import, enhancing API capabilities.
    • New endpoint for downloading exported project files.
  • Improvements

    • Updated response structure for project data exports to return file paths instead of data.
    • Enhanced project management with streamlined export and import processes.
    • Improved handling of asset uploads from ZIP archives.
  • Bug Fixes

    • Improved error handling for file operations during export and import.


netlify bot commented Sep 20, 2024

Deploy Preview for reearth-web canceled.

🔨 Latest commit: 9f67a8d
🔍 Latest deploy log: https://app.netlify.com/sites/reearth-web/deploys/66ee90bc82dad5000811982d


coderabbitai bot commented Sep 20, 2024

Walkthrough

The changes introduce significant modifications to the project's GraphQL API and related functionalities, focusing on the export and import of project data. The ExportProjectPayload schema is updated to return a path to the exported data instead of the data itself. New methods for handling ZIP files are added across various components, enhancing the ability to export and import project assets. Additionally, several tests are streamlined or removed to reflect the updated functionality, ensuring the test suite aligns with the new focus on export capabilities.

Changes

File Path Change Summary
server/e2e/gql_import_export_test.go Modified TestCallExportProject to request projectDataPath instead of projectData. Removed TestCallImportProject.
server/gql/project.graphql Changed projectData field to projectDataPath, altering its type from JSON to String in ExportProjectPayload.
server/internal/adapter/gql/generated.go Renamed ProjectData to ProjectDataPath in ExportProjectPayload and updated related resolver logic.
server/internal/adapter/gql/gqlmodel/models_gen.go Changed ProjectData to ProjectDataPath from JSON to string in ExportProjectPayload.
server/internal/adapter/gql/resolver_mutation_project.go Enhanced ExportProject and ImportProject methods for ZIP file handling, including reading and writing project data.
server/internal/app/file.go Added a new endpoint for exporting files via a GET route at /export/:filename.
server/internal/infrastructure/fs/common.go Introduced exportDir constant for organizing exported resources.
server/internal/infrastructure/fs/file.go Added methods for reading, uploading, and removing exported project ZIP files.
server/internal/infrastructure/gcs/file.go Added methods for handling export project ZIP files, including reading, uploading, and removing. Introduced gcsExportBasePath constant.
server/internal/infrastructure/s3/s3.go Similar to GCS, added methods for managing export project ZIP files in S3.
server/internal/usecase/gateway/file.go Introduced methods for reading, uploading, and removing export project ZIP files in the file interface.
server/internal/usecase/interactor/asset.go Added UploadAssetFile method for uploading asset files from a ZIP archive.
server/internal/usecase/interactor/nlslayer.go Modified ImportNLSLayers to collect existing NLS layer IDs for improved data integrity.
server/internal/usecase/interactor/plugin.go Added methods for exporting and importing plugins, enhancing plugin management.
server/internal/usecase/interactor/project.go Restructured ExportProject and UploadExportProjectZip functions for better modularity in exporting project data.
server/internal/usecase/interactor/scene.go Introduced ExportScene method for exporting scene data into a ZIP format.
server/internal/usecase/interactor/storytelling.go Enhanced ImportStory to ensure saved stories can be retrieved accurately.
server/internal/usecase/interactor/style.go Modified ImportStyles to retrieve saved styles by their IDs for improved reliability.
server/internal/usecase/interfaces/asset.go Added UploadAssetFile method to the Asset interface for handling zipped asset data.
server/internal/usecase/interfaces/plugin.go Added methods for exporting plugins and importing plugin files.
server/internal/usecase/interfaces/project.go Modified ExportProject to accept a *zip.Writer and added UploadExportProjectZip method for handling ZIP uploads.
server/internal/usecase/interfaces/scene.go Added ExportScene method to the Scene interface for exporting scene data.
web/src/services/gql/__gen__/graphql.ts Introduced new types and mutations for project export and import functionalities, including ExportProjectInput, ExportProjectPayload, ImportProjectInput, and ImportProjectPayload.

Possibly related PRs

Suggested reviewers

  • pyshx
  • airslice

Poem

In the fields where data flows,
A rabbit hops where the project grows.
With paths to treasures, zipped up tight,
We share our work, oh what a sight!
From imports grand to exports bright,
Let's celebrate this coding delight! 🐇✨



@hexaforce hexaforce force-pushed the feat/project-import-export-file-resource branch from 736750f to 2824545 Compare September 20, 2024 04:36
@hexaforce hexaforce force-pushed the feat/project-import-export-file-resource branch from 2824545 to 11fed7c Compare September 20, 2024 04:38
@hexaforce hexaforce marked this pull request as ready for review September 20, 2024 04:40

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 25

Outside diff range and nitpick comments (8)
server/e2e/gql_import_export_test.go (1)

49-53: Note: Potential future implementation.

The commented-out section suggests a potential future implementation where the response body is downloaded and processed. As mentioned in the AI-generated summary, this functionality is not currently active.

Consider removing the commented-out code if it's not planned for implementation in the near future, or add a TODO comment to clarify the intended functionality and timeline.

server/internal/usecase/interactor/plugin.go (2)

Line range hint 112-162: Simplify plugin import by batching repository operations

In the ImportPlugins function, each plugin is being saved individually within the loop, which can be inefficient due to multiple database operations. Consider collecting all non-system plugins and saving them in a batch after the loop to improve performance.

Refactor the code as follows:

  var pluginsToSave []*plugin.Plugin
  for _, pluginJSON := range pluginsJSON {
      // ... [existing code] ...

      if !p.ID().System() {
-         if err := i.pluginRepo.Save(ctx, p); err != nil {
-             return nil, err
-         }
+         pluginsToSave = append(pluginsToSave, p)
      }
  }

+ // Save all plugins in a batch
+ if len(pluginsToSave) > 0 {
+     if err := i.pluginRepo.SaveAll(ctx, pluginsToSave); err != nil {
+         return nil, err
+     }
+ }

This change reduces the number of database calls and can improve the performance of the import operation.


62-107: Handle errors more explicitly in the ExportPlugins function

While the current error handling passes the error up the call stack, adding context to the errors can make debugging easier. Use fmt.Errorf or wrap errors to provide more detailed messages.

For example:

  plgs, err := i.pluginRepo.FindByIDs(ctx, filteredPluginIDs)
  if err != nil {
-     return nil, err
+     return nil, fmt.Errorf("failed to find plugins by IDs: %w", err)
  }

  // Similarly for other errors within the function
  zipEntry, err := zipWriter.Create(zipEntryPath)
  if err != nil {
-     return nil, err
+     return nil, fmt.Errorf("failed to create zip entry '%s': %w", zipEntryPath, err)
  }

This provides more context if an error occurs, aiding in troubleshooting.

server/internal/usecase/interactor/project.go (1)

622-625: Re-fetching the project may be unnecessary

After importing and saving the project, fetching it again using FindByID might be redundant if there are no additional changes or side effects that need to be captured. Consider returning the project object directly to improve performance.

server/internal/usecase/interactor/nlslayer.go (2)

Line range hint 846-868: Potential conflict when creating layers with existing IDs

The function creates new NLSLayer instances using IDs parsed from the input JSON without checking if those IDs already exist in the repository. This could lead to overwriting existing layers or unexpected behavior if an NLSLayer with the same ID already exists. Consider adding a check to determine if a layer with the same ID exists before creating a new instance or handle the possibility of updating existing layers.
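The guard suggested above can be sketched as a simple partition of imported IDs against the set already in the repository. This is an illustrative stand-in, not the interactor's actual code; `filterNewLayerIDs` and its string-ID inputs are assumptions for the example.

```go
package main

import "fmt"

// filterNewLayerIDs splits imported layer IDs into those safe to create
// and those that would collide with existing layers (illustrative only).
func filterNewLayerIDs(importedIDs []string, existing map[string]bool) (fresh, conflicts []string) {
	for _, id := range importedIDs {
		if existing[id] {
			conflicts = append(conflicts, id)
		} else {
			fresh = append(fresh, id)
		}
	}
	return fresh, conflicts
}

func main() {
	existing := map[string]bool{"layer-1": true}
	fresh, conflicts := filterNewLayerIDs([]string{"layer-1", "layer-2"}, existing)
	fmt.Println(fresh, conflicts)
}
```

The caller can then decide whether conflicts should abort the import, skip the duplicates, or update the existing layers in place.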


879-883: Unnecessary repository query after saving layers

After saving the new NLSLayer instances, the function queries the repository to fetch them again using their IDs. Since the layers are already available in the nlayers slice, this may introduce unnecessary overhead. Consider returning the nlayers directly unless there's a specific reason to refetch from the repository.

server/internal/usecase/interactor/scene.go (2)

862-878: Use defer to ensure stream is closed properly

Currently, stream.Close() is called manually after io.Copy, and in one error case. Using defer guarantees that stream.Close() is called, even if an error occurs during io.Copy.

Modify the code to use defer:

    stream, err := i.file.ReadAsset(ctx, fileName)
    if err != nil {
        return err
    }
+   defer stream.Close()

    zipEntryPath := fmt.Sprintf("assets/%s", fileName)
    zipEntry, err := zipWriter.Create(zipEntryPath)
    if err != nil {
        return err
    }
    _, err = io.Copy(zipEntry, stream)
    if err != nil {
        return err
    }
-   if err := stream.Close(); err != nil {
-       return err
-   }

826-829: Redundant scene fetching after import

After importing the scene, you are fetching it again from the repository at lines 826-829. Since you already have the scene object, this may not be necessary unless you need to refresh its state.

Consider removing the redundant fetch to optimize performance:

-   scene, err = i.sceneRepo.FindByID(ctx, sceneID)
-   if err != nil {
-       return nil, err
-   }
Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 13e4f7e and 11fed7c.

Files selected for processing (23)
  • server/e2e/gql_import_export_test.go (2 hunks)
  • server/gql/project.graphql (1 hunks)
  • server/internal/adapter/gql/generated.go (8 hunks)
  • server/internal/adapter/gql/gqlmodel/models_gen.go (1 hunks)
  • server/internal/adapter/gql/resolver_mutation_project.go (4 hunks)
  • server/internal/app/file.go (1 hunks)
  • server/internal/infrastructure/fs/common.go (1 hunks)
  • server/internal/infrastructure/fs/file.go (1 hunks)
  • server/internal/infrastructure/gcs/file.go (4 hunks)
  • server/internal/infrastructure/s3/s3.go (4 hunks)
  • server/internal/usecase/gateway/file.go (2 hunks)
  • server/internal/usecase/interactor/asset.go (3 hunks)
  • server/internal/usecase/interactor/nlslayer.go (3 hunks)
  • server/internal/usecase/interactor/plugin.go (3 hunks)
  • server/internal/usecase/interactor/project.go (3 hunks)
  • server/internal/usecase/interactor/scene.go (7 hunks)
  • server/internal/usecase/interactor/storytelling.go (1 hunks)
  • server/internal/usecase/interactor/style.go (3 hunks)
  • server/internal/usecase/interfaces/asset.go (2 hunks)
  • server/internal/usecase/interfaces/plugin.go (2 hunks)
  • server/internal/usecase/interfaces/project.go (2 hunks)
  • server/internal/usecase/interfaces/scene.go (2 hunks)
  • web/src/services/gql/__gen__/graphql.ts (5 hunks)
Additional context used
Learnings (1)
server/e2e/gql_import_export_test.go (2)
Learnt from: hexaforce
PR: reearth/reearth-visualizer#1141
File: server/e2e/gql_import_export_test.go:48-80
Timestamp: 2024-09-13T01:19:11.267Z
Learning: In `server/e2e/gql_import_export_test.go`, test functions like `TestCallImportProject` may have disabled assertions due to problems with GraphQL.
Learnt from: hexaforce
PR: reearth/reearth-visualizer#1141
File: server/e2e/gql_import_export_test.go:12-44
Timestamp: 2024-09-13T01:19:05.869Z
Learning: The response body assertions in `TestCallExportProject` are disabled due to existing problems with GraphQL.
Additional comments not posted (35)
server/internal/adapter/gql/generated.go (8)

342-342: LGTM!

The field name change from ProjectData to ProjectDataPath in the ExportProjectPayload struct is consistent with the PR objectives and the AI-generated summary. The field type remains unchanged, indicating that the complexity calculation logic for the field is unaffected.


2591-2596: LGTM!

The changes in the complexity calculation logic align with the field name change from ProjectData to ProjectDataPath in the ExportProjectPayload struct. The complexity calculation for the field remains unchanged.


9614-9614: LGTM!

The field name change from projectData to projectDataPath and the field type change from JSON! to String! in the ExportProjectPayload type are consistent with the PR objectives and the AI-generated summary.


19093-19094: LGTM!

The addition of the new resolver function _ExportProjectPayload_projectDataPath for the projectDataPath field in the ExportProjectPayload type is consistent with the field name change. The function follows the same pattern as other resolver functions in the file.


19107-19107: LGTM!

The resolver function _ExportProjectPayload_projectDataPath returns the value of obj.ProjectDataPath, which is consistent with the field name change in the ExportProjectPayload struct.


19119-19121: LGTM!

The changes in the resolver function _ExportProjectPayload_projectDataPath, including casting the resolved value to a string, assigning it to fc.Result, and marshaling it using the marshalNString2string function, are consistent with the field type change from JSON! to String! in the ExportProjectPayload type.

The addition of the new field context function fieldContext_ExportProjectPayload_projectDataPath for the projectDataPath field is consistent with the field name change. The function correctly returns an error if the field has child fields, as the field type is String.

Also applies to: 19124-19131


31822-31823: LGTM!

The changes in the switch statement, including the removal of the case for "projectData" and the addition of the case for "projectDataPath", are consistent with the field name change in the ExportProjectPayload type. The new case correctly calls the fieldContext_ExportProjectPayload_projectDataPath function to retrieve the field context for the projectDataPath field.


65749-65750: LGTM!

The changes in the switch statement, including the removal of the case for "projectData" and the addition of the case for "projectDataPath", are consistent with the field name change in the ExportProjectPayload type. The new case correctly calls the _ExportProjectPayload_projectDataPath function to retrieve the value of the projectDataPath field.

web/src/services/gql/__gen__/graphql.ts (5)

559-566: LGTM!

The new ExportProjectInput and ExportProjectPayload types are well-defined and follow the GraphQL schema design principles. They enable the project export functionality as described in the PR objectives.


621-628: LGTM!

The new ImportProjectInput and ImportProjectPayload types are well-defined and follow the GraphQL schema design principles. They enable the project import functionality as described in the PR objectives.


1040-1044: LGTM!

The new exportProject and importProject mutation fields are correctly added to the Mutation type. They expose the project export and import functionalities through the GraphQL API, with the return types matching the respective payload types.


1288-1290: LGTM!

The new MutationExportProjectArgs type correctly defines the input arguments for the exportProject mutation. It expects an ExportProjectInput object as input and follows the GraphQL schema design principles.


1308-1310: LGTM!

The new MutationImportProjectArgs type correctly defines the input arguments for the importProject mutation. It expects an ImportProjectInput object as input and follows the GraphQL schema design principles.

server/internal/infrastructure/fs/common.go (1)

8-8: LGTM! The addition of exportDir constant enhances the file system structure.

The new constant exportDir with the value "export" is a straightforward addition that does not alter existing logic or functionality. It represents a directory path for export-related functionalities, suggesting an enhancement in the file system structure for better organization and management of exported resources.

This change is unlikely to have any negative impact on the existing codebase as it is a new addition and does not modify existing code.

server/internal/usecase/interfaces/plugin.go (1)

25-25: LGTM!

The ExportPlugins method signature aligns with the PR objective of exporting plugins associated with a given scene. The parameters and return values are appropriate for the intended functionality.

server/internal/usecase/interfaces/asset.go (2)

4-4: LGTM!

The import of the archive/zip package is valid and necessary for the changes introduced in this file.


39-39: Looks good! This addition enhances the Asset interface.

The new UploadAssetFile method is a valuable addition to the Asset interface, as it enables the uploading of asset files, which is crucial for the import/export functionality being introduced in this PR.

The method signature and return types are appropriate:

  • The context.Context parameter allows for request-scoped values and cancellation.
  • The string parameter likely represents the file name or path.
  • The *zip.File parameter indicates that the method will handle operations related to zip file handling, aligning with the goal of using ZIP files for project data import/export.
  • The *url.URL return type suggests that the method will return a URL pointing to the uploaded asset, providing a way to access the uploaded file.
  • The int64 return type likely represents the file size, which can be useful metadata.
  • The error return type allows for proper error handling.

Overall, this change expands the capabilities of the Asset interface, enabling more complex asset management operations and supporting the import/export functionality being introduced in this PR.

server/e2e/gql_import_export_test.go (2)

29-29: LGTM!

The change in the GraphQL mutation query aligns with the shift in the expected response structure, as mentioned in the AI-generated summary. The test now focuses on validating the path to the exported project data rather than the data itself.


Line range hint 35-47: LGTM!

The change in the test's response handling aligns with the shift in the expected response structure, as mentioned in the AI-generated summary. The test now extracts a string value from projectDataPath instead of an object from projectData.

Note: As per the learnings, some assertions in the test functions are disabled due to problems with GraphQL. Keep this in mind when reviewing the test results.

server/internal/usecase/gateway/file.go (3)

8-8: LGTM!

The import statement for the os package is correctly placed and suggests that the code might involve file or directory related operations.


40-40: LGTM!

The ReadExportProjectZip method is a good addition to the File interface. It expands the interface's capabilities to support reading of project export ZIP files. The method signature is clear, follows Go conventions, and appropriately uses context.Context for request-scoped data.


41-42: LGTM!

The UploadExportProjectZip and RemoveExportProjectZip methods are good additions to the File interface. They expand the interface's capabilities to support uploading and removing of project export ZIP files. The method signatures are clear, follow Go conventions, and appropriately use context.Context for request-scoped data. Using *os.File as a parameter in UploadExportProjectZip is a good choice as it allows access to file metadata if needed.

server/internal/app/file.go (1)

49-61: LGTM!

The new route handler for exporting project files looks good. It follows the standard pattern of using a fileHandler function to stream the file response. The error handling is done correctly by returning ErrNotFound if the file is not found. The removal of the file after reading is a good practice to manage the lifecycle of exported files and ensure they are downloaded only once. The content type of the response is determined based on the file extension, which is a common approach.

Overall, the code segment is well-structured, follows best practices, and enhances the server's capabilities by allowing clients to download exported project files.

server/internal/usecase/interfaces/scene.go (2)

4-4: LGTM!

The archive/zip package import is necessary to support the new ExportScene method.


35-35: Approve the addition of ExportScene method to the Scene interface.

The method signature is well-defined and follows the existing convention in the interface. The addition of this method enhances the functionality of the Scene interface by enabling the exportation of scene data.

Please ensure that the implementation of this method is thoroughly reviewed to verify correctness, error handling, and performance.

server/gql/project.graphql (1)

119-119: Verify the impact on clients and provide necessary documentation.

The change from projectData to projectDataPath aligns with the PR objective of exporting project data as a ZIP file. However, this change may introduce breaking changes for clients that expect the project data directly in the response.

Please ensure that:

  1. All clients consuming this GraphQL API are updated to handle the new response format and retrieve the project data from the provided file path.
  2. Necessary documentation or migration guides are provided to assist clients in adapting to this change.
server/internal/usecase/interactor/asset.go (1)

130-152: LGTM! The new UploadAssetFile function is a valuable addition.

The UploadAssetFile function enhances the functionality of the Asset struct by enabling the upload of files directly from a zip archive. This streamlines asset management processes by allowing users to upload assets in bulk.

The function is well-implemented with proper error handling and resource management. It follows the existing coding conventions and integrates seamlessly with the rest of the codebase.

Some key points:

  • It handles errors when opening the zip file entry.
  • It defers the closing of the read stream to ensure proper resource management.
  • It detects the content type using the http.DetectContentType function.
  • It constructs a file.File object with the necessary properties.
  • It calls the UploadAsset method and returns the result.

Great job on this addition!

server/internal/usecase/interactor/style.go (1)

Line range hint 214-244: LGTM! The changes improve the reliability of the imported styles.

The introduction of the styleIDs variable to collect style IDs during the import process and the subsequent retrieval of the saved styles by their IDs using i.styleRepo.FindByIDs is a good approach.

This change ensures that the function returns the styles that were actually saved to the repository, rather than the initially created styleList. This allows for additional processing or validation of the saved styles if needed in the future.

The altered control flow, which now includes the additional step of fetching the saved styles, improves the reliability of the function's output.

server/internal/infrastructure/fs/file.go (1)

128-130: LGTM!

The ReadExportProjectZip method implementation looks good:

  • It constructs the file path correctly by joining the exportDir and sanitized filename.
  • It uses the read helper method to handle file reading and error checking.
  • The filename parameter is properly sanitized using sanitize.Path to prevent path traversal attacks.
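Why that sanitization matters can be shown with a stdlib stand-in: a traversal-style name like `../../etc/passwd` must not escape the export directory. The project uses its own `sanitize` package; `filepath.Base` below is merely an assumed equivalent for illustration.

```go
package main

import (
	"fmt"
	"path"
	"path/filepath"
)

// exportPath joins a user-supplied filename onto the export directory,
// keeping only the final path element so "../" sequences cannot escape it.
// Illustrative: the real code uses sanitize.Path instead of filepath.Base.
func exportPath(exportDir, name string) string {
	return path.Join(exportDir, filepath.Base(name))
}

func main() {
	fmt.Println(exportPath("export", "project.zip"))
	fmt.Println(exportPath("export", "../../etc/passwd")) // stays inside export/
}
```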
server/internal/infrastructure/s3/s3.go (2)

32-32: LGTM!

The constant exportBasePath is appropriately named and defined for handling export-related files.


217-223: LGTM!

The ReadExportProjectZip method is implemented correctly:

  • It sanitizes the input to prevent potential security issues.
  • It handles the case when the name is empty by returning an appropriate error.
  • It constructs the correct file path using the exportBasePath constant.
  • It calls the read method with the correct file path to read the ZIP file.
server/internal/infrastructure/gcs/file.go (1)

208-214: LGTM!

The ReadExportProjectZip function is implemented correctly:

  • It sanitizes the input name parameter to prevent potential path traversal attacks.
  • It handles the case when the sanitized name is empty by returning a not found error.
  • It constructs the correct path for the export project ZIP file.
  • It delegates the actual reading of the file to the read helper function.
server/internal/usecase/interactor/storytelling.go (1)

1127-1130: Excellent improvement to the ImportStory function!

Retrieving the story from the repository after saving it is a great way to ensure data integrity. This change enhances the reliability of the function by:

  1. Verifying that the story was successfully persisted and can be fetched from the database accurately. It protects against silent or partial failures during the save operation.

  2. Returning the retrieved story object instead of the original ensures that any database generated fields (like timestamps) are populated correctly in the response.

  3. Failing early if the retrieval doesn't succeed, indicating a problem with the save operation or the repository itself.

Overall, this makes the function more robust and fail-safe. Great work!
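The save-then-refetch pattern praised here can be reduced to a small sketch against an in-memory repository. All names below (`memRepo`, `importStory`) are invented for illustration; the point is only that the returned value comes from the store, so a silent save failure surfaces immediately.

```go
package main

import (
	"errors"
	"fmt"
)

type Story struct{ ID, Title string }

// memRepo is an in-memory stand-in for the storytelling repository.
type memRepo struct{ byID map[string]Story }

func (r *memRepo) Save(s Story) error { r.byID[s.ID] = s; return nil }

func (r *memRepo) FindByID(id string) (Story, error) {
	s, ok := r.byID[id]
	if !ok {
		return Story{}, errors.New("not found")
	}
	return s, nil
}

// importStory saves, then reads the record back: the caller receives the
// persisted state (including any store-populated fields), not the input.
func importStory(r *memRepo, s Story) (Story, error) {
	if err := r.Save(s); err != nil {
		return Story{}, err
	}
	return r.FindByID(s.ID)
}

func main() {
	r := &memRepo{byID: map[string]Story{}}
	got, err := importStory(r, Story{ID: "s1", Title: "demo"})
	fmt.Println(got.Title, err)
}
```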

server/internal/adapter/gql/gqlmodel/models_gen.go (1)

566-566: Modify ExportProjectPayload to return project data file path instead of JSON data.

The ProjectData field in the ExportProjectPayload struct has been changed from JSON type to string type and renamed to ProjectDataPath. This suggests that instead of directly including the project data in the GraphQL response, the response will now contain a file path or URL pointing to the location of the exported project data file (likely in ZIP format).

This change will require updates to the client-side code to handle downloading and extracting the project data from the file, rather than directly accessing it from the GraphQL response. The server-side implementation will need to ensure that the project data is correctly exported to a file and the ProjectDataPath is set to a valid and accessible location.

server/internal/usecase/interactor/plugin.go (1)

157-163: Confirm that all imported plugins are properly retrieved

After importing plugins, you fetch them using FindByIDs. Ensure that all plugins are successfully retrieved and handle cases where some plugins might not be found.

Run the following script to confirm that all plugin IDs retrieved match the ones intended to import:

This script assumes you have access to commands that can list the plugin IDs from the repository. Adjust it accordingly to fit your environment.

ImportPlugins(context.Context, []interface{}) ([]*plugin.Plugin, error)
ImporPluginFile(context.Context, id.PluginID, string, *zip.File) error

Please clarify the purpose of the string parameter.

The ImportPluginFile method signature aligns with the PR objective of importing a plugin file (note that the declared name appears misspelled as ImporPluginFile). However, the purpose of the string parameter is unclear. Please add a comment or rename the parameter to clarify its intended use.

Comment on lines +132 to +135
func (f *fileRepo) UploadExportProjectZip(ctx context.Context, zipFile *os.File) error {
_, err := f.upload(ctx, path.Join(exportDir, zipFile.Name()), zipFile)
return err
}

Sanitize the zipFile.Name() before constructing the destination file path.

The zipFile.Name() is used directly to construct the destination file path without any sanitization. This could potentially lead to path traversal attacks if the zipFile name is maliciously crafted.

Consider applying this diff to sanitize the zipFile.Name():

func (f *fileRepo) UploadExportProjectZip(ctx context.Context, zipFile *os.File) error {
-	_, err := f.upload(ctx, path.Join(exportDir, zipFile.Name()), zipFile)
+	_, err := f.upload(ctx, path.Join(exportDir, sanitize.Path(zipFile.Name())), zipFile)
	return err
}

Comment on lines +137 to +139
func (f *fileRepo) RemoveExportProjectZip(ctx context.Context, filename string) error {
return f.delete(ctx, filepath.Join(exportDir, filename))
}

Sanitize the filename before constructing the file path.

The filename parameter is used directly to construct the file path without any sanitization. This could potentially lead to path traversal attacks if the filename is maliciously crafted.

Consider applying this diff to sanitize the filename:

func (f *fileRepo) RemoveExportProjectZip(ctx context.Context, filename string) error {
-	return f.delete(ctx, filepath.Join(exportDir, filename))
+	return f.delete(ctx, filepath.Join(exportDir, sanitize.Path(filename)))
}
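A minimal sketch of the sanitization these comments ask for, assuming reduction to the base name is acceptable; `sanitizeExportName` is a hypothetical helper, and the project may prefer its existing sanitize package instead:

```go
package main

import (
	"fmt"
	"path"
)

// sanitizeExportName reduces an untrusted file name to its base
// component so inputs like "../../etc/passwd" cannot escape the
// export directory. Prepending "/" before Clean collapses any leading
// ".." segments before Base strips the directory part.
func sanitizeExportName(name string) string {
	return path.Base(path.Clean("/" + name))
}

func main() {
	fmt.Println(sanitizeExportName("../../etc/passwd")) // passwd
	fmt.Println(sanitizeExportName("export.zip"))       // export.zip
}
```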


Comment on lines +230 to +232
func (f *fileRepo) RemoveExportProjectZip(ctx context.Context, filename string) error {
return f.delete(ctx, path.Join(exportBasePath, filename))
}

Sanitize the filename before using it in the file path.

The RemoveExportProjectZip method directly uses the filename as part of the file path without sanitizing it. This could potentially lead to security issues if the filename is not trusted.

Consider applying this change:

-	return f.delete(ctx, path.Join(exportBasePath, filename))
+	sanitizedFilename := sanitize.Path(filename)
+	return f.delete(ctx, path.Join(exportBasePath, sanitizedFilename))


Comment on lines +225 to +228
func (f *fileRepo) UploadExportProjectZip(ctx context.Context, zipFile *os.File) error {
_, err := f.upload(ctx, path.Join(exportBasePath, zipFile.Name()), zipFile)
return err
}

Sanitize the zipFile.Name() before using it in the file path.

The UploadExportProjectZip method directly uses the zipFile.Name() as part of the file path without sanitizing it. This could potentially lead to security issues if the file name is not trusted.

Consider applying this change:

-	_, err := f.upload(ctx, path.Join(exportBasePath, zipFile.Name()), zipFile)
+	sanitizedName := sanitize.Path(zipFile.Name())
+	_, err := f.upload(ctx, path.Join(exportBasePath, sanitizedName), zipFile)


Comment on lines +533 to +561
fileWriter, err := zipWriter.Create("project.json")
if err != nil {
return nil, nil, nil, err
return err
}
layerStyles, err := i.layerStyles.FindByScene(ctx, sceneID)
jsonData, err := json.Marshal(data)
if err != nil {
return nil, nil, nil, err
return err
}
storyList, err := i.storytellingRepo.FindByScene(ctx, sceneID)
if err != nil {
return nil, nil, nil, err
}
sceneJSON, err := builder.New(
repo.LayerLoaderFrom(i.layerRepo),
repo.PropertyLoaderFrom(i.propertyRepo),
repo.DatasetGraphLoaderFrom(i.datasetRepo),
repo.TagLoaderFrom(i.tagRepo),
repo.TagSceneLoaderFrom(i.tagRepo, []id.SceneID{sceneID}),
repo.NLSLayerLoaderFrom(i.nlsLayerRepo),
true,
).ForScene(sce).WithNLSLayers(&nlsLayers).WithLayerStyle(layerStyles).WithStory((*storyList)[0]).BuildResult(
ctx,
time.Now(),
prj.CoreSupport(),
prj.EnableGA(),
prj.TrackingID(),
)
if _, err = fileWriter.Write(jsonData); err != nil {
return err
}

if err := zipWriter.Close(); err != nil {
return err
}

// flush once
if err := zipFile.Close(); err != nil {
return err
}
zipFile, err = os.Open(zipFile.Name())
if err != nil {
return nil, nil, nil, err
return err
}

res := make(map[string]interface{})
res["scene"] = sceneJSON
return prj, res, plgs, nil
if err := i.file.UploadExportProjectZip(ctx, zipFile); err != nil {
return err
}
return nil

Close zipFile after uploading to prevent resource leaks

After uploading the zipFile using i.file.UploadExportProjectZip, the file remains open, which could lead to resource leaks. Ensure that zipFile is closed after the upload operation.

Apply this diff to fix the issue:

 if err := i.file.UploadExportProjectZip(ctx, zipFile); err != nil {
     return err
 }
+if err := zipFile.Close(); err != nil {
+    return err
+}
 return nil
Suggested change:
fileWriter, err := zipWriter.Create("project.json")
if err != nil {
return err
}
jsonData, err := json.Marshal(data)
if err != nil {
return err
}
if _, err = fileWriter.Write(jsonData); err != nil {
return err
}
if err := zipWriter.Close(); err != nil {
return err
}
// flush once
if err := zipFile.Close(); err != nil {
return err
}
zipFile, err = os.Open(zipFile.Name())
if err != nil {
return err
}
if err := i.file.UploadExportProjectZip(ctx, zipFile); err != nil {
return err
}
if err := zipFile.Close(); err != nil {
return err
}
return nil

Comment on lines +508 to +525
if prj.ImageURL() != nil {
trimmedName := strings.TrimPrefix(prj.ImageURL().Path, "/assets/")
stream, err := i.file.ReadAsset(ctx, trimmedName)
if err != nil {
return nil, err
}
zipEntryPath := fmt.Sprintf("assets/%s", trimmedName)
zipEntry, err := zipWriter.Create(zipEntryPath)
if err != nil {
return nil, err
}
_, err = io.Copy(zipEntry, stream)
if err != nil {
_ = stream.Close()
return nil, err
}
if err := stream.Close(); err != nil {
return nil, err

Use defer to ensure stream is closed properly

To prevent resource leaks and to make the code cleaner, consider using defer stream.Close() right after opening the stream. This ensures that the stream is closed regardless of how the function exits.

Apply this diff to refactor the code:

 if prj.ImageURL() != nil {
     trimmedName := strings.TrimPrefix(prj.ImageURL().Path, "/assets/")
     stream, err := i.file.ReadAsset(ctx, trimmedName)
     if err != nil {
         return nil, err
     }
+    defer func() {
+        if closeErr := stream.Close(); closeErr != nil {
+            // Handle the close error if necessary
+        }
+    }()
     zipEntryPath := fmt.Sprintf("assets/%s", trimmedName)
     zipEntry, err := zipWriter.Create(zipEntryPath)
     if err != nil {
         return nil, err
     }
     _, err = io.Copy(zipEntry, stream)
     if err != nil {
         return nil, err
     }
-    if err := stream.Close(); err != nil {
-        return nil, err
-    }
 }

Comment on lines +859 to +861
parts := strings.Split(url, "/")
fileName := parts[len(parts)-1]


Ensure robust filename extraction from URL

When extracting the filename from the URL, if the URL ends with a slash the extracted fileName will be empty. (strings.Split with a non-empty separator always returns at least one element, so the index itself cannot go out of range, but an empty filename is still invalid.)

Consider checking that parts has at least one element and fileName is not empty:

    parts := strings.Split(url, "/")
+   if len(parts) == 0 {
+       return errors.New("invalid URL format")
+   }
    fileName := parts[len(parts)-1]
+   if fileName == "" {
+       return errors.New("empty filename extracted from URL")
+   }

Sanitize filenames to prevent security risks

Extracting filenames directly from URLs without sanitization can lead to security vulnerabilities, such as path traversal attacks. Ensure that fileName is sanitized to prevent malicious input.

Use path.Base from the path package to safely extract the filename:

-   parts := strings.Split(url, "/")
-   fileName := parts[len(parts)-1]
+   fileName := path.Base(url)

This ensures that only the base filename is used, stripping any directory components.

Note that this assumes the `path` package is already imported; if it is not, add `import "path"` at the top of the file.
if err != nil {
return nil, nil, err
}
story := (*storyList)[0]

Handle empty storytelling list to prevent panic

At line 618, you are accessing the first element of storyList without checking if the list is empty. If storyList is empty, this will cause a runtime panic due to an 'index out of range' error.

Consider adding a check to verify that storyList is not empty before accessing its first element:

-   story := (*storyList)[0]
+   var story *storytelling.Story
+   if len(*storyList) > 0 {
+       story = (*storyList)[0]
+   } else {
+       // Handle the case where there is no storytelling data
+   }

Ensure that subsequent code handles the nil case appropriately.

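The guard can be isolated in a small helper so every caller gets the same safe behavior; `Story` and `firstStory` are hypothetical stand-ins for the storytelling types in the PR:

```go
package main

import (
	"errors"
	"fmt"
)

// Story is a hypothetical stand-in for storytelling.Story.
type Story struct{ Title string }

// firstStory guards the (*storyList)[0] access flagged in the review:
// it returns an error instead of panicking when the scene has no
// storytelling data.
func firstStory(storyList *[]*Story) (*Story, error) {
	if storyList == nil || len(*storyList) == 0 {
		return nil, errors.New("scene has no storytelling data")
	}
	return (*storyList)[0], nil
}

func main() {
	empty := []*Story{}
	if _, err := firstStory(&empty); err != nil {
		fmt.Println("guarded:", err)
	}
	one := []*Story{{Title: "intro"}}
	s, _ := firstStory(&one)
	fmt.Println("first:", s.Title)
}
```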

Comment on lines +668 to +682
for _, item := range property.Items() {
for _, field := range item.Fields(nil) {
if field.GuessSchema().ID().String() == "buttonIcon" {
u, ok := field.Value().Value().(*url.URL)
if !ok {
continue
}
err := i.addZipAsset(ctx, zipWriter, u.Path)
if err != nil {
return nil, nil, err
}
}
}
}
}

Check for nil properties to avoid potential nil pointer dereferences

In the loop starting at line 668, when iterating over property.Items(), ensure that both item and field.Value() are not nil before calling methods on them. This prevents possible nil pointer dereferences.

Add nil checks before accessing methods:

    for _, item := range property.Items() {
+       if item == nil {
+           continue
+       }
        for _, field := range item.Fields(nil) {
+           if field == nil || field.Value() == nil || field.Value().Value() == nil {
+               continue
+           }
            // existing code
        }
    }

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 8

Outside diff range and nitpick comments (1)
server/pkg/scene/builder/decoder.go (1)

153-154: Remove commented-out code to enhance code cleanliness

The commented line // Links(flinks). appears to be obsolete or unused.

Removing such commented-out code improves readability and reduces confusion for future maintainers.

Apply this diff to remove the commented code:

 field := property.NewField(*fieldID).
     Value(ov).
-    // Links(flinks).
     Build()
Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 11fed7c and 9f67a8d.

Files selected for processing (6)
  • server/internal/adapter/gql/resolver_mutation_project.go (3 hunks)
  • server/internal/usecase/interactor/scene.go (10 hunks)
  • server/internal/usecase/interactor/style.go (3 hunks)
  • server/internal/usecase/interfaces/scene.go (2 hunks)
  • server/pkg/file/zip.go (2 hunks)
  • server/pkg/scene/builder/decoder.go (2 hunks)
Files skipped from review as they are similar to previous changes (1)
  • server/internal/usecase/interactor/style.go
Additional comments not posted (7)
server/internal/usecase/interfaces/scene.go (2)

4-10: LGTM!

The new imports archive/zip and github.com/reearth/reearth/server/pkg/plugin have been added correctly. This aligns with the PR objective of introducing ZIP file export functionality and indicates that the code will likely involve working with ZIP files and plugins.


36-37: Verify the method implementations.

The new ExportScene method and the updated ImportScene method signatures in the Scene interface look good and align with the PR objectives.

However, please ensure that:

  1. The actual implementations of these methods correctly handle the parameters and return values as per the updated signatures.
  2. The ImportScene method implementations are updated to handle the new []*plugin.Plugin parameter, as this is a breaking change.

Run the following script to verify the method implementations:

Verification successful

Method implementations verified and aligned with interface.

The ExportScene and ImportScene method implementations have been found in server/internal/usecase/interactor/scene.go. Both implementations align with the interface signatures we reviewed earlier:

  1. ExportScene:
    func (i *Scene) ExportScene(ctx context.Context, prj *project.Project, zipWriter *zip.Writer) (*scene.Scene, map[string]interface{}, error)

  2. ImportScene:
    func (i *Scene) ImportScene(ctx context.Context, prj *project.Project, plgs []*plugin.Plugin, sceneData map[string]interface{}) (*scene.Scene, error)

The ImportScene method has been correctly updated to include the new []*plugin.Plugin parameter, addressing the breaking change we noted earlier.

While the signatures match the interface, I recommend a manual review of the implementation details to ensure correct handling of parameters and return values, especially for the new plugin parameter in ImportScene.

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the ExportScene and ImportScene method implementations.

# Test 1: Search for the ExportScene method implementation.
# Expect: The implementation should exist and align with the interface signature.
ast-grep --lang go --pattern $'func $_($_, $_, $_) ($_, $_, $_) {
  $$$
}' | rg --type go $'ExportScene'

# Test 2: Search for the ImportScene method implementation.
# Expect: The implementation should exist and align with the updated interface signature.
ast-grep --lang go --pattern $'func $_($_, $_, $_, $_) ($_, $_) {
  $$$
}' | rg --type go $'ImportScene'

Length of output: 205


Script:

#!/bin/bash
# Description: Broader search for ExportScene and ImportScene implementations

# Test 1: Search for any occurrences of ExportScene in Go files
echo "Searching for ExportScene:"
rg --type go "ExportScene"

# Test 2: Search for any occurrences of ImportScene in Go files
echo "Searching for ImportScene:"
rg --type go "ImportScene"

# Test 3: List Go files in the project
echo "Listing Go files:"
fd -e go

# Test 4: Search for potential implementation files
echo "Searching for potential implementation files:"
fd -e go | rg "scene|usecase"

Length of output: 35601

server/internal/adapter/gql/resolver_mutation_project.go (2)

131-138: **Issue: Deleting the exported ZIP file before it can be downloaded**

The deferred function removes the ZIP file immediately after ExportProject returns. Since the ProjectDataPath returned points to this ZIP file, deleting it prematurely will prevent clients from accessing the file.

To fix this issue, consider the following options:

  • Option 1: Remove the os.Remove(zipFile.Name()) call from the deferred function to allow the file to persist. Implement a separate cleanup mechanism, such as a scheduled task, to remove old export files after a certain period.

  • Option 2: Move the file deletion logic to occur after the file has been successfully downloaded by the client.


247-257: **Ensure transactions are properly rolled back on import errors**

After importing the scene, NLS layers, styles, and storytelling data, if any of these operations fail, the transaction may still be committed, potentially leaving the database in an inconsistent state.

Modify the error handling to roll back the transaction if any import operation fails:

if err != nil {
    tx.End(ctx) // Rollback
    return nil, err
}

// Commit only if all operations succeed
if err := tx.Commit(); err != nil {
    return nil, err
}

This ensures data integrity by preventing partial imports.

server/internal/usecase/interactor/scene.go (2)

598-716: LGTM!

The ExportScene function looks good and follows a clear logic to export a scene to a ZIP file. It retrieves all the necessary data, builds a JSON representation using a builder pattern, and adds asset files to the ZIP archive. Error handling is done consistently throughout the function.

The introduction of this feature for exporting scenes as ZIP files could be useful for various purposes such as backups, sharing, or migration.


Line range hint 718-841: LGTM!

The modifications to the ImportScene function look good. The addition of the plgs parameter allows for including plugins during scene import, which enhances the functionality and flexibility. The parsing of the scene JSON, creation of scene entities, and saving of the imported scene are handled properly. Error handling is done consistently throughout the function.

The changes don't seem to have any negative impact on existing functionalities.

server/pkg/scene/builder/decoder.go (1)

177-184: New function parsePropertyOptionalValue enhances modularity

The addition of parsePropertyOptionalValue correctly encapsulates the logic for parsing optional property values, improving the modularity of the code.

Comment on lines +91 to +94
fileBytes, err := io.ReadAll(file)
if err != nil {
return nil, nil, nil, err
}

Consider streaming the data instead of reading it into memory.

Reading the entire file into memory could cause issues for large files. Consider using io.Copy to stream the data directly from the io.ReadSeeker to the zip.NewReader like this:

reader, err := zip.NewReader(file, file.Size())
if err != nil {
    return nil, nil, nil, err
}

Comment on lines +108 to +112
defer func(rc io.ReadCloser) {
if cerr := rc.Close(); cerr != nil {
fmt.Printf("Error closing file: %v\n", cerr)
}
}(rc)

Move the defer statement outside the loop.

Using defer inside the loop is inefficient: each deferred call accumulates until the enclosing function returns rather than running at the end of the iteration. Move the defer statement outside the loop:

rc, err := file.Open()
if err != nil {
    return nil, nil, nil, err
}
defer func(rc io.ReadCloser) {
    if cerr := rc.Close(); cerr != nil {
        fmt.Printf("Error closing file: %v\n", cerr)
    }
}(rc)
data, err = io.ReadAll(rc)
if err != nil {
    return nil, nil, nil, err
}

Comment on lines +103 to +116
if file.Name == "project.json" {
rc, err := file.Open()
if err != nil {
return nil, nil, nil, err
}
defer func(rc io.ReadCloser) {
if cerr := rc.Close(); cerr != nil {
fmt.Printf("Error closing file: %v\n", cerr)
}
}(rc)
data, err = io.ReadAll(rc)
if err != nil {
return nil, nil, nil, err
}

Handle the case where project.json is missing.

The function assumes that project.json exists in the zip file. It should return an error if this file is missing:

var data []byte
var projectJSONFound bool
for _, file := range reader.File {
    if file.Name == "project.json" {
        projectJSONFound = true
        // ... read project.json data ...
    }
    // ... handle other files ...
}
if !projectJSONFound {
    return nil, nil, nil, fmt.Errorf("project.json not found in the zip file")
}

Comment on lines +90 to +126
func UncompressExportZip(file io.ReadSeeker) ([]byte, map[string]*zip.File, map[string]*zip.File, error) {
fileBytes, err := io.ReadAll(file)
if err != nil {
return nil, nil, nil, err
}
reader, err := zip.NewReader(bytes.NewReader(fileBytes), int64(len(fileBytes)))
if err != nil {
return nil, nil, nil, err
}
var data []byte
assets := make(map[string]*zip.File)
plugins := make(map[string]*zip.File)
for _, file := range reader.File {
if file.Name == "project.json" {
rc, err := file.Open()
if err != nil {
return nil, nil, nil, err
}
defer func(rc io.ReadCloser) {
if cerr := rc.Close(); cerr != nil {
fmt.Printf("Error closing file: %v\n", cerr)
}
}(rc)
data, err = io.ReadAll(rc)
if err != nil {
return nil, nil, nil, err
}
} else if strings.HasPrefix(file.Name, "assets/") {
trimmedName := strings.TrimPrefix(file.Name, "assets/")
assets[trimmedName] = file
} else if strings.HasPrefix(file.Name, "plugins/") {
trimmedName := strings.TrimPrefix(file.Name, "plugins/")
plugins[trimmedName] = file
}
}
return data, assets, plugins, nil
}

Validate the zip file structure and return an error if it's invalid.

The function assumes a specific structure for the zip file. It should validate this structure and return an error if it's invalid. For example:

for _, file := range reader.File {
    if !strings.HasPrefix(file.Name, "assets/") &&
       !strings.HasPrefix(file.Name, "plugins/") &&
       file.Name != "project.json" {
        return nil, nil, nil, fmt.Errorf("invalid file in zip: %s", file.Name)
    }
    // ... handle files ...
}

Comment on lines +117 to +123
} else if strings.HasPrefix(file.Name, "assets/") {
trimmedName := strings.TrimPrefix(file.Name, "assets/")
assets[trimmedName] = file
} else if strings.HasPrefix(file.Name, "plugins/") {
trimmedName := strings.TrimPrefix(file.Name, "plugins/")
plugins[trimmedName] = file
}

Close the readers for asset and plugin files.

The function doesn't close the readers it opens for asset and plugin files. This could lead to resource leaks. Use defer to ensure these readers are always closed:

} else if strings.HasPrefix(file.Name, "assets/") {
    rc, err := file.Open()
    if err != nil {
        return nil, nil, nil, err
    }
    defer rc.Close()
    trimmedName := strings.TrimPrefix(file.Name, "assets/")
    assets[trimmedName] = file
} else if strings.HasPrefix(file.Name, "plugins/") {
    rc, err := file.Open()
    if err != nil {
        return nil, nil, nil, err
    }
    defer rc.Close()
    trimmedName := strings.TrimPrefix(file.Name, "plugins/")
    plugins[trimmedName] = file
}

Comment on lines +125 to 129
if ok && ps != nil {
_, _, _, err := prop.UpdateValue(ps, ptr, pv)
if err != nil {
return nil, err
}

Ensure explicit error handling when property schema ps is nil

In the condition if ok && ps != nil, if ps is nil, the property update is silently skipped. This could lead to inconsistencies or unexpected behavior.

Consider returning an explicit error when ps is nil to prevent silent failures and make debugging easier.

Comment on lines +145 to +158
for fieldKey, value := range items {
if fieldKey == "id" {
continue
}
ov, ok := parsePropertyOptionalValue(value)
if ok {
fieldID := id.PropertyFieldIDFromRef(&fieldKey)
field := property.NewField(*fieldID).
Value(ov).
// Links(flinks).
Build()
g.AddFields(field)
}
}

Handle parsing failures explicitly when parsing group property fields

Currently, if parsePropertyOptionalValue(value) returns false, the field is silently skipped. This means any parsing errors won't be reported.

Consider logging a warning or error when parsing fails to ensure that all intended fields are processed and to aid in debugging.

Comment on lines +112 to +160
for sgKey, value := range pj {

if items, ok := value.(map[string]interface{}); ok {
// simple property

sgID := id.PropertySchemaGroupIDFromRef(&sgKey)

for fieldKey, value := range items {

fieldID := id.PropertyFieldIDFromRef(&fieldKey)
ptr := property.NewPointer(sgID, nil, fieldID)
pv, ok := parsePropertyValue(value)

if ok && ps != nil {
_, _, _, err := prop.UpdateValue(ps, ptr, pv)
if err != nil {
return nil, err
}
}
}

} else if arrayProperty, ok := value.([]interface{}); ok {
// group property

for _, groupProperty := range arrayProperty {

sg := id.PropertySchemaGroupID(sgKey)
gl := prop.GetOrCreateGroupList(ps, property.PointItemBySchema(sg))
g := property.NewGroup().NewID().SchemaGroup(sg).MustBuild()
gl.Add(g, -1)

if items, ok := groupProperty.(map[string]interface{}); ok {

for fieldKey, value := range items {
if fieldKey == "id" {
continue
}
ov, ok := parsePropertyOptionalValue(value)
if ok {
fieldID := id.PropertyFieldIDFromRef(&fieldKey)
field := property.NewField(*fieldID).
Value(ov).
// Links(flinks).
Build()
g.AddFields(field)
}
}
}
}

Refactor to reduce duplication in property processing logic

The loops handling simple properties (lines 114–132) and group properties (lines 133–160) contain similar code for iterating over fields and parsing values.

Consider extracting common functionality into helper functions to reduce code duplication and improve maintainability.
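One way the shared iteration could be factored out; `forEachField` is a hypothetical helper, not code from the PR:

```go
package main

import "fmt"

// forEachField factors out the field iteration shared by the
// simple-property and group-property branches: it walks a decoded
// JSON object, skips the "id" key, and hands each field to a
// callback, so the two branches differ only in what they do with
// each value.
func forEachField(items map[string]interface{}, fn func(key string, value interface{}) error) error {
	for k, v := range items {
		if k == "id" {
			continue
		}
		if err := fn(k, v); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	items := map[string]interface{}{"id": "p1", "text": "hello", "size": 12.0}
	n := 0
	_ = forEachField(items, func(k string, v interface{}) error { n++; return nil })
	fmt.Println("fields visited:", n) // "id" is skipped
}
```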
