feat(server): Import or Export Project Functionality (File resource) #1151
Conversation
Walkthrough: The changes introduce significant modifications to the project's GraphQL API and related functionality, focusing on the export and import of project data.
Actionable comments posted: 25
Outside diff range and nitpick comments (8)
server/e2e/gql_import_export_test.go (1)
Lines 49-53: Note: potential future implementation.

The commented-out section suggests a potential future implementation where the response body is downloaded and processed. As mentioned in the AI-generated summary, this functionality is not currently active. Consider removing the commented-out code if it's not planned for implementation in the near future, or add a TODO comment to clarify the intended functionality and timeline.
server/internal/usecase/interactor/plugin.go (2)
Lines 112-162 (range hint): Simplify plugin import by batching repository operations.

In the `ImportPlugins` function, each plugin is saved individually within the loop, which can be inefficient due to multiple database operations. Consider collecting all non-system plugins and saving them in a batch after the loop to improve performance. Refactor the code as follows:

```diff
+var pluginsToSave []*plugin.Plugin
 for _, pluginJSON := range pluginsJSON {
 	// ... [existing code] ...
 	if !p.ID().System() {
-		if err := i.pluginRepo.Save(ctx, p); err != nil {
-			return nil, err
-		}
+		pluginsToSave = append(pluginsToSave, p)
 	}
 }
+// Save all plugins in a batch
+if len(pluginsToSave) > 0 {
+	if err := i.pluginRepo.SaveAll(ctx, pluginsToSave); err != nil {
+		return nil, err
+	}
+}
```

This change reduces the number of database calls and can improve the performance of the import operation.
Lines 62-107: Handle errors more explicitly in the `ExportPlugins` function.

While the current error handling passes the error up the call stack, adding context to the errors can make debugging easier. Use `fmt.Errorf` or wrap errors to provide more detailed messages. For example:

```diff
 plgs, err := i.pluginRepo.FindByIDs(ctx, filteredPluginIDs)
 if err != nil {
-	return nil, err
+	return nil, fmt.Errorf("failed to find plugins by IDs: %w", err)
 }

 // Similarly for other errors within the function
 zipEntry, err := zipWriter.Create(zipEntryPath)
 if err != nil {
-	return nil, err
+	return nil, fmt.Errorf("failed to create zip entry '%s': %w", zipEntryPath, err)
 }
```

This provides more context if an error occurs, aiding in troubleshooting.
server/internal/usecase/interactor/project.go (1)
Lines 622-625: Re-fetching the project may be unnecessary.

After importing and saving the project, fetching it again using `FindByID` might be redundant if there are no additional changes or side effects that need to be captured. Consider returning the project object directly to improve performance.

server/internal/usecase/interactor/nlslayer.go (2)
Lines 846-868 (range hint): Potential conflict when creating layers with existing IDs.

The function creates new `NLSLayer` instances using IDs parsed from the input JSON without checking whether those IDs already exist in the repository. This could lead to overwriting existing layers or unexpected behavior if an `NLSLayer` with the same ID already exists. Consider adding a check to determine if a layer with the same ID exists before creating a new instance, or handle the possibility of updating existing layers.
Lines 879-883: Unnecessary repository query after saving layers.

After saving the new `NLSLayer` instances, the function queries the repository to fetch them again using their IDs. Since the layers are already available in the `nlayers` slice, this may introduce unnecessary overhead. Consider returning `nlayers` directly unless there's a specific reason to refetch from the repository.

server/internal/usecase/interactor/scene.go (2)
Lines 862-878: Use `defer` to ensure `stream` is closed properly.

Currently, `stream.Close()` is called manually after `io.Copy`, and in one error case. Using `defer` guarantees that `stream.Close()` is called, even if an error occurs during `io.Copy`. Modify the code to use `defer`:

```diff
 stream, err := i.file.ReadAsset(ctx, fileName)
 if err != nil {
 	return err
 }
+defer stream.Close()
 zipEntryPath := fmt.Sprintf("assets/%s", fileName)
 zipEntry, err := zipWriter.Create(zipEntryPath)
 if err != nil {
 	return err
 }
 _, err = io.Copy(zipEntry, stream)
 if err != nil {
 	return err
 }
-if err := stream.Close(); err != nil {
-	return err
-}
```
Lines 826-829: Redundant scene fetching after import.

After importing the scene, you are fetching it again from the repository at lines 826-829. Since you already have the `scene` object, this may not be necessary unless you need to refresh its state. Consider removing the redundant fetch to optimize performance:

```diff
-scene, err = i.sceneRepo.FindByID(ctx, sceneID)
-if err != nil {
-	return nil, err
-}
```
Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Files selected for processing (23)
- server/e2e/gql_import_export_test.go (2 hunks)
- server/gql/project.graphql (1 hunks)
- server/internal/adapter/gql/generated.go (8 hunks)
- server/internal/adapter/gql/gqlmodel/models_gen.go (1 hunks)
- server/internal/adapter/gql/resolver_mutation_project.go (4 hunks)
- server/internal/app/file.go (1 hunks)
- server/internal/infrastructure/fs/common.go (1 hunks)
- server/internal/infrastructure/fs/file.go (1 hunks)
- server/internal/infrastructure/gcs/file.go (4 hunks)
- server/internal/infrastructure/s3/s3.go (4 hunks)
- server/internal/usecase/gateway/file.go (2 hunks)
- server/internal/usecase/interactor/asset.go (3 hunks)
- server/internal/usecase/interactor/nlslayer.go (3 hunks)
- server/internal/usecase/interactor/plugin.go (3 hunks)
- server/internal/usecase/interactor/project.go (3 hunks)
- server/internal/usecase/interactor/scene.go (7 hunks)
- server/internal/usecase/interactor/storytelling.go (1 hunks)
- server/internal/usecase/interactor/style.go (3 hunks)
- server/internal/usecase/interfaces/asset.go (2 hunks)
- server/internal/usecase/interfaces/plugin.go (2 hunks)
- server/internal/usecase/interfaces/project.go (2 hunks)
- server/internal/usecase/interfaces/scene.go (2 hunks)
- web/src/services/gql/gen/graphql.ts (5 hunks)
Additional context used
Learnings (1)
server/e2e/gql_import_export_test.go (2)
Learnt from: hexaforce PR: reearth/reearth-visualizer#1141 File: server/e2e/gql_import_export_test.go:48-80 Timestamp: 2024-09-13T01:19:11.267Z Learning: In `server/e2e/gql_import_export_test.go`, test functions like `TestCallImportProject` may have disabled assertions due to problems with GraphQL.
Learnt from: hexaforce PR: reearth/reearth-visualizer#1141 File: server/e2e/gql_import_export_test.go:12-44 Timestamp: 2024-09-13T01:19:05.869Z Learning: The response body assertions in `TestCallExportProject` are disabled due to existing problems with GraphQL.
Additional comments not posted (35)
server/internal/adapter/gql/generated.go (8)
Line 342: LGTM!

The field name change from `ProjectData` to `ProjectDataPath` in the `ExportProjectPayload` struct is consistent with the PR objectives and the AI-generated summary. The field type remains unchanged, indicating that the complexity calculation logic for the field is unaffected.
Lines 2591-2596: LGTM!

The changes in the complexity calculation logic align with the field name change from `ProjectData` to `ProjectDataPath` in the `ExportProjectPayload` struct. The complexity calculation for the field remains unchanged.
Line 9614: LGTM!

The field name change from `projectData` to `projectDataPath` and the field type change from `JSON!` to `String!` in the `ExportProjectPayload` type are consistent with the PR objectives and the AI-generated summary.
Lines 19093-19094: LGTM!

The addition of the new resolver function `_ExportProjectPayload_projectDataPath` for the `projectDataPath` field in the `ExportProjectPayload` type is consistent with the field name change. The function follows the same pattern as other resolver functions in the file.
Line 19107: LGTM!

The resolver function `_ExportProjectPayload_projectDataPath` returns the value of `obj.ProjectDataPath`, which is consistent with the field name change in the `ExportProjectPayload` struct.
Lines 19119-19121: LGTM!

The changes in the resolver function `_ExportProjectPayload_projectDataPath`, including casting the resolved value to a string, assigning it to `fc.Result`, and marshaling it using the `marshalNString2string` function, are consistent with the field type change from `JSON!` to `String!` in the `ExportProjectPayload` type.

The addition of the new field context function `fieldContext_ExportProjectPayload_projectDataPath` for the `projectDataPath` field is consistent with the field name change. The function correctly returns an error if the field has child fields, as the field type is `String`.

Also applies to: 19124-19131
Lines 31822-31823: LGTM!

The changes in the switch statement, including the removal of the case for `"projectData"` and the addition of the case for `"projectDataPath"`, are consistent with the field name change in the `ExportProjectPayload` type. The new case correctly calls the `fieldContext_ExportProjectPayload_projectDataPath` function to retrieve the field context for the `projectDataPath` field.
Lines 65749-65750: LGTM!

The changes in the switch statement, including the removal of the case for `"projectData"` and the addition of the case for `"projectDataPath"`, are consistent with the field name change in the `ExportProjectPayload` type. The new case correctly calls the `_ExportProjectPayload_projectDataPath` function to retrieve the value of the `projectDataPath` field.

web/src/services/gql/__gen__/graphql.ts (5)
Lines 559-566: LGTM!

The new `ExportProjectInput` and `ExportProjectPayload` types are well-defined and follow the GraphQL schema design principles. They enable the project export functionality as described in the PR objectives.
Lines 621-628: LGTM!

The new `ImportProjectInput` and `ImportProjectPayload` types are well-defined and follow the GraphQL schema design principles. They enable the project import functionality as described in the PR objectives.
Lines 1040-1044: LGTM!

The new `exportProject` and `importProject` mutation fields are correctly added to the `Mutation` type. They expose the project export and import functionality through the GraphQL API, with the return types matching the respective payload types.
Lines 1288-1290: LGTM!

The new `MutationExportProjectArgs` type correctly defines the input arguments for the `exportProject` mutation. It expects an `ExportProjectInput` object as input and follows the GraphQL schema design principles.
Lines 1308-1310: LGTM!

The new `MutationImportProjectArgs` type correctly defines the input arguments for the `importProject` mutation. It expects an `ImportProjectInput` object as input and follows the GraphQL schema design principles.

server/internal/infrastructure/fs/common.go (1)
Line 8: LGTM! The addition of the `exportDir` constant enhances the file system structure.

The new constant `exportDir` with the value `"export"` is a straightforward addition that does not alter existing logic or functionality. It represents a directory path for export-related functionality, suggesting an enhancement in the file system structure for better organization and management of exported resources. This change is unlikely to have any negative impact on the existing codebase, as it is a new addition and does not modify existing code.
server/internal/usecase/interfaces/plugin.go (1)
25-25
: LGTM!The
ExportPlugins
method signature aligns with the PR objective of exporting plugins associated with a given scene. The parameters and return values are appropriate for the intended functionality.server/internal/usecase/interfaces/asset.go (2)
Line 4: LGTM!

The import of the `archive/zip` package is valid and necessary for the changes introduced in this file.
Line 39: Looks good! This addition enhances the Asset interface.

The new `UploadAssetFile` method is a valuable addition to the `Asset` interface, as it enables the uploading of asset files, which is crucial for the import/export functionality being introduced in this PR. The method signature and return types are appropriate:
- The `context.Context` parameter allows for request-scoped values and cancellation.
- The `string` parameter likely represents the file name or path.
- The `*zip.File` parameter indicates that the method handles zip file entries, aligning with the goal of using ZIP files for project data import/export.
- The `*url.URL` return type suggests that the method returns a URL pointing to the uploaded asset, providing a way to access the uploaded file.
- The `int64` return type likely represents the file size, which can be useful metadata.
- The `error` return type allows for proper error handling.

Overall, this change expands the capabilities of the `Asset` interface, enabling more complex asset management operations and supporting the import/export functionality being introduced in this PR.

server/e2e/gql_import_export_test.go (2)
Line 29: LGTM!

The change in the GraphQL mutation query aligns with the shift in the expected response structure, as mentioned in the AI-generated summary. The test now focuses on validating the path to the exported project data rather than the data itself.

Lines 35-47 (range hint): LGTM!

The change in the test's response handling aligns with the shift in the expected response structure. The test now extracts a string value from `projectDataPath` instead of an object from `projectData`. Note: as per the learnings, some assertions in the test functions are disabled due to problems with GraphQL. Keep this in mind when reviewing the test results.
server/internal/usecase/gateway/file.go (3)

Line 8: LGTM! The import statement for the `os` package is correctly placed and suggests that the code involves file or directory related operations.

Line 40: LGTM! The `ReadExportProjectZip` method is a good addition to the `File` interface. It expands the interface's capabilities to support reading of project export ZIP files. The method signature is clear, follows Go conventions, and appropriately uses `context.Context` for request-scoped data.

Lines 41-42: LGTM! The `UploadExportProjectZip` and `RemoveExportProjectZip` methods are good additions to the `File` interface. They expand the interface's capabilities to support uploading and removing of project export ZIP files. The method signatures are clear, follow Go conventions, and appropriately use `context.Context` for request-scoped data. Using `*os.File` as a parameter in `UploadExportProjectZip` is a good choice, as it allows access to file metadata if needed.

server/internal/app/file.go (1)
Lines 49-61: LGTM!

The new route handler for exporting project files looks good. It follows the standard pattern of using a `fileHandler` function to stream the file response. The error handling is done correctly by returning `ErrNotFound` if the file is not found. The removal of the file after reading is a good practice to manage the lifecycle of exported files and ensure they are downloaded only once. The content type of the response is determined based on the file extension, which is a common approach.

Overall, the code segment is well-structured, follows best practices, and enhances the server's capabilities by allowing clients to download exported project files.
server/internal/usecase/interfaces/scene.go (2)
Line 4: LGTM!

The `archive/zip` package import is necessary to support the new `ExportScene` method.

Line 35: Approve the addition of the `ExportScene` method to the `Scene` interface.

The method signature is well-defined and follows the existing convention in the interface. The addition of this method enhances the functionality of the `Scene` interface by enabling the export of scene data. Please ensure that the implementation of this method is thoroughly reviewed to verify correctness, error handling, and performance.
server/gql/project.graphql (1)
Line 119: Verify the impact on clients and provide necessary documentation.

The change from `projectData` to `projectDataPath` aligns with the PR objective of exporting project data as a ZIP file. However, this change may introduce breaking changes for clients that expect the project data directly in the response. Please ensure that:
- All clients consuming this GraphQL API are updated to handle the new response format and retrieve the project data from the provided file path.
- Necessary documentation or migration guides are provided to assist clients in adapting to this change.
server/internal/usecase/interactor/asset.go (1)
Lines 130-152: LGTM! The new `UploadAssetFile` function is a valuable addition.

The `UploadAssetFile` function enhances the functionality of the `Asset` struct by enabling the upload of files directly from a zip archive. This streamlines asset management by allowing users to upload assets in bulk. The function is well-implemented, with proper error handling and resource management. It follows the existing coding conventions and integrates seamlessly with the rest of the codebase.

Some key points:
- It handles errors when opening the zip file entry.
- It defers the closing of the read stream to ensure proper resource management.
- It detects the content type using the `http.DetectContentType` function.
- It constructs a `file.File` object with the necessary properties.
- It calls the `UploadAsset` method and returns the result.

Great job on this addition!
server/internal/usecase/interactor/style.go (1)
Lines 214-244 (range hint): LGTM! The changes improve the reliability of the imported styles.

The introduction of the `styleIDs` variable to collect style IDs during the import process, and the subsequent retrieval of the saved styles by their IDs using `i.styleRepo.FindByIDs`, is a good approach. This change ensures that the function returns the styles that were actually saved to the repository, rather than the initially created `styleList`. This allows for additional processing or validation of the saved styles if needed in the future. The altered control flow, which now includes the additional step of fetching the saved styles, improves the reliability of the function's output.
server/internal/infrastructure/fs/file.go (1)
Lines 128-130: LGTM!

The `ReadExportProjectZip` method implementation looks good:
- It constructs the file path correctly by joining `exportDir` and the sanitized `filename`.
- It uses the `read` helper method to handle file reading and error checking.
- The `filename` parameter is properly sanitized using `sanitize.Path` to prevent path traversal attacks.

server/internal/infrastructure/s3/s3.go (2)
Line 32: LGTM!

The constant `exportBasePath` is appropriately named and defined for handling export-related files.
Lines 217-223: LGTM!

The `ReadExportProjectZip` method is implemented correctly:
- It sanitizes the input to prevent potential security issues.
- It handles the case when `name` is empty by returning an appropriate error.
- It constructs the correct file path using the `exportBasePath` constant.
- It calls the `read` method with the correct file path to read the ZIP file.

server/internal/infrastructure/gcs/file.go (1)
Lines 208-214: LGTM!

The `ReadExportProjectZip` function is implemented correctly:
- It sanitizes the input `name` parameter to prevent potential path traversal attacks.
- It handles the case when the sanitized name is empty by returning a not-found error.
- It constructs the correct path for the export project ZIP file.
- It delegates the actual reading of the file to the `read` helper function.

server/internal/usecase/interactor/storytelling.go (1)
Lines 1127-1130: Excellent improvement to the `ImportStory` function!

Retrieving the story from the repository after saving it is a great way to ensure data integrity. This change enhances the reliability of the function by:
- Verifying that the story was successfully persisted and can be fetched from the database accurately, protecting against silent or partial failures during the save operation.
- Returning the retrieved story object instead of the original, which ensures that any database-generated fields (like timestamps) are populated correctly in the response.
- Failing early if the retrieval doesn't succeed, indicating a problem with the save operation or the repository itself.

Overall, this makes the function more robust and fail-safe. Great work!
server/internal/adapter/gql/gqlmodel/models_gen.go (1)
Line 566: Modify `ExportProjectPayload` to return the project data file path instead of JSON data.

The `ProjectData` field in the `ExportProjectPayload` struct has been changed from the `JSON` type to the `string` type and renamed to `ProjectDataPath`. This suggests that instead of directly including the project data in the GraphQL response, the response will now contain a file path or URL pointing to the location of the exported project data file (likely in ZIP format).

This change will require updates to the client-side code to handle downloading and extracting the project data from the file, rather than directly accessing it from the GraphQL response. The server-side implementation will need to ensure that the project data is correctly exported to a file and that `ProjectDataPath` is set to a valid and accessible location.

server/internal/usecase/interactor/plugin.go (1)
Lines 157-163: Confirm that all imported plugins are properly retrieved.

After importing plugins, you fetch them using `FindByIDs`. Ensure that all plugins are successfully retrieved, and handle cases where some plugins might not be found. Run the following script to confirm that all plugin IDs retrieved match the ones intended to import. This script assumes you have access to commands that can list the plugin IDs from the repository; adjust it accordingly to fit your environment.
```go
ImportPlugins(context.Context, []interface{}) ([]*plugin.Plugin, error)
ImporPluginFile(context.Context, id.PluginID, string, *zip.File) error
```

Please clarify the purpose of the string parameter.

The `ImportPluginFile` method signature aligns with the PR objective of importing a plugin file. However, the purpose of the string parameter is unclear. Please provide a comment or rename the parameter to clarify its intended use.
```go
func (f *fileRepo) UploadExportProjectZip(ctx context.Context, zipFile *os.File) error {
	_, err := f.upload(ctx, path.Join(exportDir, zipFile.Name()), zipFile)
	return err
}
```

Sanitize `zipFile.Name()` before constructing the destination file path.

`zipFile.Name()` is used directly to construct the destination file path without any sanitization. This could potentially lead to path traversal attacks if the `zipFile` name is maliciously crafted. Consider applying this diff to sanitize `zipFile.Name()`:

```diff
 func (f *fileRepo) UploadExportProjectZip(ctx context.Context, zipFile *os.File) error {
-	_, err := f.upload(ctx, path.Join(exportDir, zipFile.Name()), zipFile)
+	_, err := f.upload(ctx, path.Join(exportDir, sanitize.Path(zipFile.Name())), zipFile)
 	return err
 }
```
```go
func (f *fileRepo) RemoveExportProjectZip(ctx context.Context, filename string) error {
	return f.delete(ctx, filepath.Join(exportDir, filename))
}
```

Sanitize `filename` before constructing the file path.

The `filename` parameter is used directly to construct the file path without any sanitization. This could potentially lead to path traversal attacks if `filename` is maliciously crafted. Consider applying this diff to sanitize `filename`:

```diff
 func (f *fileRepo) RemoveExportProjectZip(ctx context.Context, filename string) error {
-	return f.delete(ctx, filepath.Join(exportDir, filename))
+	return f.delete(ctx, filepath.Join(exportDir, sanitize.Path(filename)))
 }
```
```go
func (f *fileRepo) RemoveExportProjectZip(ctx context.Context, filename string) error {
	return f.delete(ctx, path.Join(exportBasePath, filename))
}
```

Sanitize `filename` before using it in the file path.

The `RemoveExportProjectZip` method directly uses `filename` as part of the file path without sanitizing it. This could potentially lead to security issues if the filename is not trusted. Consider applying this change:

```diff
-	return f.delete(ctx, path.Join(exportBasePath, filename))
+	sanitizedFilename := sanitize.Path(filename)
+	return f.delete(ctx, path.Join(exportBasePath, sanitizedFilename))
```
```go
func (f *fileRepo) UploadExportProjectZip(ctx context.Context, zipFile *os.File) error {
	_, err := f.upload(ctx, path.Join(exportBasePath, zipFile.Name()), zipFile)
	return err
}
```

Sanitize `zipFile.Name()` before using it in the file path.

The `UploadExportProjectZip` method directly uses `zipFile.Name()` as part of the file path without sanitizing it. This could potentially lead to security issues if the file name is not trusted. Consider applying this change:

```diff
-	_, err := f.upload(ctx, path.Join(exportBasePath, zipFile.Name()), zipFile)
+	sanitizedName := sanitize.Path(zipFile.Name())
+	_, err := f.upload(ctx, path.Join(exportBasePath, sanitizedName), zipFile)
```
```go
fileWriter, err := zipWriter.Create("project.json")
if err != nil {
	return err
}
jsonData, err := json.Marshal(data)
if err != nil {
	return err
}
if _, err = fileWriter.Write(jsonData); err != nil {
	return err
}

if err := zipWriter.Close(); err != nil {
	return err
}

// flush once
if err := zipFile.Close(); err != nil {
	return err
}
zipFile, err = os.Open(zipFile.Name())
if err != nil {
	return err
}

if err := i.file.UploadExportProjectZip(ctx, zipFile); err != nil {
	return err
}
return nil
```

Close `zipFile` after uploading to prevent resource leaks.

After uploading the `zipFile` using `i.file.UploadExportProjectZip`, the file remains open, which could lead to resource leaks. Ensure that `zipFile` is closed after the upload operation. Apply this diff to fix the issue:

```diff
 if err := i.file.UploadExportProjectZip(ctx, zipFile); err != nil {
 	return err
 }
+if err := zipFile.Close(); err != nil {
+	return err
+}
 return nil
```
if prj.ImageURL() != nil {
	trimmedName := strings.TrimPrefix(prj.ImageURL().Path, "/assets/")
	stream, err := i.file.ReadAsset(ctx, trimmedName)
	if err != nil {
		return nil, err
	}
	zipEntryPath := fmt.Sprintf("assets/%s", trimmedName)
	zipEntry, err := zipWriter.Create(zipEntryPath)
	if err != nil {
		return nil, err
	}
	_, err = io.Copy(zipEntry, stream)
	if err != nil {
		_ = stream.Close()
		return nil, err
	}
	if err := stream.Close(); err != nil {
		return nil, err
Use `defer` to ensure `stream` is closed properly
To prevent resource leaks and to make the code cleaner, consider using `defer stream.Close()` right after opening the `stream`. This ensures that the stream is closed regardless of how the function exits.
Apply this diff to refactor the code:
if prj.ImageURL() != nil {
trimmedName := strings.TrimPrefix(prj.ImageURL().Path, "/assets/")
stream, err := i.file.ReadAsset(ctx, trimmedName)
if err != nil {
return nil, err
}
+ defer func() {
+ if closeErr := stream.Close(); closeErr != nil {
+ // Handle the close error if necessary
+ }
+ }()
zipEntryPath := fmt.Sprintf("assets/%s", trimmedName)
zipEntry, err := zipWriter.Create(zipEntryPath)
if err != nil {
return nil, err
}
_, err = io.Copy(zipEntry, stream)
if err != nil {
return nil, err
}
- if err := stream.Close(); err != nil {
- return nil, err
- }
}
Committable suggestion
if prj.ImageURL() != nil {
	trimmedName := strings.TrimPrefix(prj.ImageURL().Path, "/assets/")
	stream, err := i.file.ReadAsset(ctx, trimmedName)
	if err != nil {
		return nil, err
	}
	defer func() {
		if closeErr := stream.Close(); closeErr != nil {
			// Handle the close error if necessary
		}
	}()
	zipEntryPath := fmt.Sprintf("assets/%s", trimmedName)
	zipEntry, err := zipWriter.Create(zipEntryPath)
	if err != nil {
		return nil, err
	}
	_, err = io.Copy(zipEntry, stream)
	if err != nil {
		return nil, err
	}
}
parts := strings.Split(url, "/")
fileName := parts[len(parts)-1]
|
Ensure robust filename extraction from URL
When extracting the filename from the URL, if the URL ends with a slash or does not contain any slashes, `fileName` may be empty or cause an 'index out of range' error. Consider checking that `parts` has at least one element and that `fileName` is not empty:
parts := strings.Split(url, "/")
+ if len(parts) == 0 {
+ return errors.New("invalid URL format")
+ }
fileName := parts[len(parts)-1]
+ if fileName == "" {
+ return errors.New("empty filename extracted from URL")
+ }
Committable suggestion
parts := strings.Split(url, "/")
if len(parts) == 0 {
	return errors.New("invalid URL format")
}
fileName := parts[len(parts)-1]
if fileName == "" {
	return errors.New("empty filename extracted from URL")
}
Sanitize filenames to prevent security risks
Extracting filenames directly from URLs without sanitization can lead to security vulnerabilities, such as path traversal attacks. Ensure that `fileName` is sanitized to prevent malicious input.
Use `path.Base` from the `path` package to safely extract the filename:
- parts := strings.Split(url, "/")
- fileName := parts[len(parts)-1]
+ fileName := path.Base(url)
This ensures that only the base filename is used, stripping any directory components.
Committable suggestion
fileName := path.Base(url)

This suggestion replaces the original two lines with a single line using `path.Base`, which safely extracts the filename from the URL. Note that this suggestion assumes that the `path` package is already imported in the file. If it's not, an import statement for `path` should be added at the top of the file:

import "path"
if err != nil { | ||
return nil, nil, err | ||
} | ||
story := (*storyList)[0] |
Handle empty storytelling list to prevent panic
At line 618, you are accessing the first element of `storyList` without checking if the list is empty. If `storyList` is empty, this will cause a runtime panic due to an 'index out of range' error.
Consider adding a check to verify that `storyList` is not empty before accessing its first element:
-story := (*storyList)[0]
+var story *storytelling.Story
+if len(*storyList) > 0 {
+	story = (*storyList)[0]
+} else {
+	// Handle the case where there is no storytelling data
+	story = nil
+}
Ensure that subsequent code handles the `nil` case appropriately.
Committable suggestion
var story *storytelling.Story
if len(*storyList) > 0 {
	story = (*storyList)[0]
} else {
	// Handle the case where there is no storytelling data
	story = nil
}
for _, item := range property.Items() {
	for _, field := range item.Fields(nil) {
		if field.GuessSchema().ID().String() == "buttonIcon" {
			u, ok := field.Value().Value().(*url.URL)
			if !ok {
				continue
			}
			err := i.addZipAsset(ctx, zipWriter, u.Path)
			if err != nil {
				return nil, nil, err
			}
		}
	}
}
}
Check for nil properties to avoid potential nil pointer dereferences
In the loop starting at line 668, when iterating over `property.Items()`, ensure that both `item` and `field.Value()` are not `nil` before calling methods on them. This prevents possible nil pointer dereferences.
Add nil checks before accessing methods:
for _, item := range property.Items() {
+ if item == nil {
+ continue
+ }
for _, field := range item.Fields(nil) {
+ if field == nil || field.Value() == nil || field.Value().Value() == nil {
+ continue
+ }
// existing code
}
}
Committable suggestion
for _, item := range property.Items() {
	if item == nil {
		continue
	}
	for _, field := range item.Fields(nil) {
		if field == nil || field.Value() == nil || field.Value().Value() == nil {
			continue
		}
		if field.GuessSchema().ID().String() == "buttonIcon" {
			u, ok := field.Value().Value().(*url.URL)
			if !ok {
				continue
			}
			err := i.addZipAsset(ctx, zipWriter, u.Path)
			if err != nil {
				return nil, nil, err
			}
		}
	}
}
Actionable comments posted: 8
Outside diff range and nitpick comments (1)
server/pkg/scene/builder/decoder.go (1)
153-154: Remove commented-out code to enhance code cleanliness
The commented line `// Links(flinks).` appears to be obsolete or unused. Removing such commented-out code improves readability and reduces confusion for future maintainers.
Apply this diff to remove the commented code:

 field := property.NewField(*fieldID).
 	Value(ov).
-	// Links(flinks).
 	Build()
Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Files selected for processing (6)
- server/internal/adapter/gql/resolver_mutation_project.go (3 hunks)
- server/internal/usecase/interactor/scene.go (10 hunks)
- server/internal/usecase/interactor/style.go (3 hunks)
- server/internal/usecase/interfaces/scene.go (2 hunks)
- server/pkg/file/zip.go (2 hunks)
- server/pkg/scene/builder/decoder.go (2 hunks)
Files skipped from review as they are similar to previous changes (1)
- server/internal/usecase/interactor/style.go
Additional comments not posted (7)
server/internal/usecase/interfaces/scene.go (2)
4-10
: LGTM! The new imports
archive/zip
andgithub.com/reearth/reearth/server/pkg/plugin
have been added correctly. This aligns with the PR objective of introducing ZIP file export functionality and indicates that the code will likely involve working with ZIP files and plugins.
36-37
: Verify the method implementations. The new
ExportScene
method and the updatedImportScene
method signatures in theScene
interface look good and align with the PR objectives.However, please ensure that:
- The actual implementations of these methods correctly handle the parameters and return values as per the updated signatures.
- The
ImportScene
method implementations are updated to handle the new[]*plugin.Plugin
parameter, as this is a breaking change.Run the following script to verify the method implementations:
Verification successful
Method implementations verified and aligned with interface.
The ExportScene and ImportScene method implementations have been found in
server/internal/usecase/interactor/scene.go
. Both implementations align with the interface signatures we reviewed earlier:
ExportScene:
func (i *Scene) ExportScene(ctx context.Context, prj *project.Project, zipWriter *zip.Writer) (*scene.Scene, map[string]interface{}, error)
ImportScene:
func (i *Scene) ImportScene(ctx context.Context, prj *project.Project, plgs []*plugin.Plugin, sceneData map[string]interface{}) (*scene.Scene, error)
The ImportScene method has been correctly updated to include the new
[]*plugin.Plugin
parameter, addressing the breaking change we noted earlier. While the signatures match the interface, I recommend a manual review of the implementation details to ensure correct handling of parameters and return values, especially for the new plugin parameter in ImportScene.
Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash
# Description: Verify the ExportScene and ImportScene method implementations.

# Test 1: Search for the ExportScene method implementation.
# Expect: The implementation should exist and align with the interface signature.
ast-grep --lang go --pattern $'func $_($_, $_, $_) ($_, $_, $_) { $$$ }' | rg --type go $'ExportScene'

# Test 2: Search for the ImportScene method implementation.
# Expect: The implementation should exist and align with the updated interface signature.
ast-grep --lang go --pattern $'func $_($_, $_, $_, $_) ($_, $_) { $$$ }' | rg --type go $'ImportScene'

Length of output: 205
Script:
#!/bin/bash
# Description: Broader search for ExportScene and ImportScene implementations

# Test 1: Search for any occurrences of ExportScene in Go files
echo "Searching for ExportScene:"
rg --type go "ExportScene"

# Test 2: Search for any occurrences of ImportScene in Go files
echo "Searching for ImportScene:"
rg --type go "ImportScene"

# Test 3: List Go files in the project
echo "Listing Go files:"
fd -e go

# Test 4: Search for potential implementation files
echo "Searching for potential implementation files:"
fd -e go | rg "scene|usecase"

Length of output: 35601
server/internal/adapter/gql/resolver_mutation_project.go (2)
131-138
: **Issue: Deleting the exported ZIP file before it can be downloaded** The deferred function removes the ZIP file immediately after
ExportProject
returns. Since theProjectDataPath
returned points to this ZIP file, deleting it prematurely will prevent clients from accessing the file.To fix this issue, consider the following options:
Option 1: Remove the
os.Remove(zipFile.Name())
call from the deferred function to allow the file to persist. Implement a separate cleanup mechanism, such as a scheduled task, to remove old export files after a certain period.Option 2: Move the file deletion logic to occur after the file has been successfully downloaded by the client.
247-257
: **Ensure transactions are properly rolled back on import errors** After importing the scene, NLS layers, styles, and storytelling data, if any of these operations fail, the transaction may still be committed, potentially leaving the database in an inconsistent state.
Modify the error handling to roll back the transaction if any import operation fails:
if err != nil {
	tx.End(ctx) // Rollback
	return nil, err
}
// Commit only if all operations succeed
if err := tx.Commit(); err != nil {
	return nil, err
}

This ensures data integrity by preventing partial imports.
server/internal/usecase/interactor/scene.go (2)
598-716
: LGTM!The
ExportScene
function looks good and follows a clear logic to export a scene to a ZIP file. It retrieves all the necessary data, builds a JSON representation using a builder pattern, and adds asset files to the ZIP archive. Error handling is done consistently throughout the function.The introduction of this feature for exporting scenes as ZIP files could be useful for various purposes such as backups, sharing, or migration.
Line range hint
718-841
: LGTM!The modifications to the
ImportScene
function look good. The addition of theplgs
parameter allows for including plugins during scene import, which enhances the functionality and flexibility. The parsing of the scene JSON, creation of scene entities, and saving of the imported scene are handled properly. Error handling is done consistently throughout the function.The changes don't seem to have any negative impact on existing functionalities.
server/pkg/scene/builder/decoder.go (1)
177-184
: New functionparsePropertyOptionalValue
enhances modularity. The addition of
parsePropertyOptionalValue
correctly encapsulates the logic for parsing optional property values, improving the modularity of the code.
fileBytes, err := io.ReadAll(file) | ||
if err != nil { | ||
return nil, nil, nil, err | ||
} |
Consider streaming the data instead of reading it into memory.
Reading the entire file into memory could cause issues for large files. `zip.NewReader` only needs an `io.ReaderAt` and the archive size, so if the underlying `io.ReadSeeker` also implements `io.ReaderAt`, the extra copy can be avoided by determining the size with `Seek`, like this:

size, err := file.Seek(0, io.SeekEnd)
if err != nil {
	return nil, nil, nil, err
}
if _, err := file.Seek(0, io.SeekStart); err != nil {
	return nil, nil, nil, err
}
ra, ok := file.(io.ReaderAt)
if !ok {
	return nil, nil, nil, errors.New("file does not support random access")
}
reader, err := zip.NewReader(ra, size)
if err != nil {
	return nil, nil, nil, err
}
defer func(rc io.ReadCloser) { | ||
if cerr := rc.Close(); cerr != nil { | ||
fmt.Printf("Error closing file: %v\n", cerr) | ||
} | ||
}(rc) |
Move the `defer` statement outside the loop.
Using `defer` inside the loop is inefficient as it will create a new deferred call for each iteration. Move the `defer` statement outside the loop:
rc, err := file.Open()
if err != nil {
return nil, nil, nil, err
}
defer func(rc io.ReadCloser) {
if cerr := rc.Close(); cerr != nil {
fmt.Printf("Error closing file: %v\n", cerr)
}
}(rc)
data, err = io.ReadAll(rc)
if err != nil {
return nil, nil, nil, err
}
if file.Name == "project.json" {
	rc, err := file.Open()
	if err != nil {
		return nil, nil, nil, err
	}
	defer func(rc io.ReadCloser) {
		if cerr := rc.Close(); cerr != nil {
			fmt.Printf("Error closing file: %v\n", cerr)
		}
	}(rc)
	data, err = io.ReadAll(rc)
	if err != nil {
		return nil, nil, nil, err
	}
Handle the case where `project.json` is missing.
The function assumes that `project.json` exists in the zip file. It should return an error if this file is missing:
var data []byte
var projectJSONFound bool
for _, file := range reader.File {
if file.Name == "project.json" {
projectJSONFound = true
// ... read project.json data ...
}
// ... handle other files ...
}
if !projectJSONFound {
return nil, nil, nil, fmt.Errorf("project.json not found in the zip file")
}
func UncompressExportZip(file io.ReadSeeker) ([]byte, map[string]*zip.File, map[string]*zip.File, error) {
	fileBytes, err := io.ReadAll(file)
	if err != nil {
		return nil, nil, nil, err
	}
	reader, err := zip.NewReader(bytes.NewReader(fileBytes), int64(len(fileBytes)))
	if err != nil {
		return nil, nil, nil, err
	}
	var data []byte
	assets := make(map[string]*zip.File)
	plugins := make(map[string]*zip.File)
	for _, file := range reader.File {
		if file.Name == "project.json" {
			rc, err := file.Open()
			if err != nil {
				return nil, nil, nil, err
			}
			defer func(rc io.ReadCloser) {
				if cerr := rc.Close(); cerr != nil {
					fmt.Printf("Error closing file: %v\n", cerr)
				}
			}(rc)
			data, err = io.ReadAll(rc)
			if err != nil {
				return nil, nil, nil, err
			}
		} else if strings.HasPrefix(file.Name, "assets/") {
			trimmedName := strings.TrimPrefix(file.Name, "assets/")
			assets[trimmedName] = file
		} else if strings.HasPrefix(file.Name, "plugins/") {
			trimmedName := strings.TrimPrefix(file.Name, "plugins/")
			plugins[trimmedName] = file
		}
	}
	return data, assets, plugins, nil
}
Validate the zip file structure and return an error if it's invalid.
The function assumes a specific structure for the zip file. It should validate this structure and return an error if it's invalid. For example:
for _, file := range reader.File {
if !strings.HasPrefix(file.Name, "assets/") &&
!strings.HasPrefix(file.Name, "plugins/") &&
file.Name != "project.json" {
return nil, nil, nil, fmt.Errorf("invalid file in zip: %s", file.Name)
}
// ... handle files ...
}
} else if strings.HasPrefix(file.Name, "assets/") {
	trimmedName := strings.TrimPrefix(file.Name, "assets/")
	assets[trimmedName] = file
} else if strings.HasPrefix(file.Name, "plugins/") {
	trimmedName := strings.TrimPrefix(file.Name, "plugins/")
	plugins[trimmedName] = file
}
Close the readers for asset and plugin files.
The function doesn't close the readers it opens for asset and plugin files. This could lead to resource leaks. Use `defer` to ensure these readers are always closed:
} else if strings.HasPrefix(file.Name, "assets/") {
rc, err := file.Open()
if err != nil {
return nil, nil, nil, err
}
defer rc.Close()
trimmedName := strings.TrimPrefix(file.Name, "assets/")
assets[trimmedName] = file
} else if strings.HasPrefix(file.Name, "plugins/") {
rc, err := file.Open()
if err != nil {
return nil, nil, nil, err
}
defer rc.Close()
trimmedName := strings.TrimPrefix(file.Name, "plugins/")
plugins[trimmedName] = file
}
if ok && ps != nil {
	_, _, _, err := prop.UpdateValue(ps, ptr, pv)
	if err != nil {
		return nil, err
	}
Ensure explicit error handling when property schema `ps` is nil
In the condition `if ok && ps != nil`, if `ps` is `nil`, the property update is silently skipped. This could lead to inconsistencies or unexpected behavior.
Consider returning an explicit error when `ps` is `nil` to prevent silent failures and make debugging easier.
for fieldKey, value := range items {
	if fieldKey == "id" {
		continue
	}
	ov, ok := parsePropertyOptionalValue(value)
	if ok {
		fieldID := id.PropertyFieldIDFromRef(&fieldKey)
		field := property.NewField(*fieldID).
			Value(ov).
			// Links(flinks).
			Build()
		g.AddFields(field)
	}
}
Handle parsing failures explicitly when parsing group property fields
Currently, if `parsePropertyOptionalValue(value)` returns `false`, the field is silently skipped. This means any parsing errors won't be reported.
Consider logging a warning or error when parsing fails to ensure that all intended fields are processed and to aid in debugging.
for sgKey, value := range pj {
	if items, ok := value.(map[string]interface{}); ok {
		// simple property
		sgID := id.PropertySchemaGroupIDFromRef(&sgKey)
		for fieldKey, value := range items {
			fieldID := id.PropertyFieldIDFromRef(&fieldKey)
			ptr := property.NewPointer(sgID, nil, fieldID)
			pv, ok := parsePropertyValue(value)
			if ok && ps != nil {
				_, _, _, err := prop.UpdateValue(ps, ptr, pv)
				if err != nil {
					return nil, err
				}
			}
		}
	} else if arrayProperty, ok := value.([]interface{}); ok {
		// group property
		for _, groupProperty := range arrayProperty {
			sg := id.PropertySchemaGroupID(sgKey)
			gl := prop.GetOrCreateGroupList(ps, property.PointItemBySchema(sg))
			g := property.NewGroup().NewID().SchemaGroup(sg).MustBuild()
			gl.Add(g, -1)
			if items, ok := groupProperty.(map[string]interface{}); ok {
				for fieldKey, value := range items {
					if fieldKey == "id" {
						continue
					}
					ov, ok := parsePropertyOptionalValue(value)
					if ok {
						fieldID := id.PropertyFieldIDFromRef(&fieldKey)
						field := property.NewField(*fieldID).
							Value(ov).
							// Links(flinks).
							Build()
						g.AddFields(field)
					}
				}
			}
		}
	}
Refactor to reduce duplication in property processing logic
The loops handling simple properties (lines 114–132) and group properties (lines 133–160) contain similar code for iterating over fields and parsing values.
Consider extracting common functionality into helper functions to reduce code duplication and improve maintainability.
Overview
Changed the export from the previous JSON-only output to a downloadable ZIP file that also includes file resources.
What I've done
- Defined the contents of the ZIP file; project.json is the previous JSON file.
- Added the export domain to the ZIP file storage; files are deleted at the same time as the download.
- Changed the response from JSON to the download path of the ZIP file.
- Added processing to save file resources; file names are updated on save.
- Modified the response upon import to return the database data for verification.
- Refactored part of the previous code: ExportProject() was also fetching Scene and Plugin data, so ExportScene() and ExportPlugins() were created to separate that processing.
What I haven't done
The permission checks, such as access rights for each process, are not sufficient.
How I tested
Which point I want you to review particularly
Memo
Summary by CodeRabbit
Release Notes
New Features
Improvements
Bug Fixes