Description
Note for reviewers
- I use this issue instead of a PR to collect everything about this task; all referenced PRs are already merged.
- I use a single monorepo with both BE and FE for a good reason: the FE build uses outputs of the BE stacks. The organizers have confirmed that this is acceptable.
- Check Task 2. Serve SPA in AWS S3 and Cloudfront Services #3, Task 3. First API with AWS API Gateway and AWS Lambda #18, Task 4. Integration with DynamoDB #22, and the commit history if you want to learn more about the setup and reasoning. Issues, PRs, and individual commit messages contain a lot of detail.
- You can skip reviewing the repo preparation and extra efforts; they are listed here just in case you are interested.
- Please provide your evaluation in comments and fill in the form.
Task 5 Artifacts
Task 5.1 (Commit 924322f)
✔️ Create a new service called `import-service` at the same level as Product Service.
✔️ Reuse S3 bucket for fixtures for uploaded data
Task 5.2 (Pull Request #25)
✔️ Create a lambda function `importProductsFile` which will be triggered by the HTTP GET method.
✔️ The requested URL is `/import`.
✔️ Implement its logic so it expects a request with the name of a CSV file with products and creates a new Signed URL.
✔️ The name will be passed in a query string as a name parameter and should be described in the serverless config file as a request parameter.
✔️ Add policies to allow lambda functions to interact with S3.
✔️ The response from the lambda should be the created Signed URL.
✔️ The lambda endpoint is integrated with the frontend.
Extra notes:
- The new service reuses the same existing API Gateway
- A validation error with status code 400 is returned in case `name` wasn't provided
- The file name is suffixed with a timestamp to avoid overwriting existing files
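
For context, here is a minimal sketch of how such a handler could look, assuming AWS SDK v3 and a Lambda proxy integration; the bucket environment variable, key prefix, and URL expiry below are illustrative assumptions rather than the project's actual values:

```typescript
// Hypothetical sketch of the importProductsFile handler (not the project's exact code).
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

const s3 = new S3Client({});
const BUCKET = process.env.IMPORT_BUCKET ?? "import-service-bucket"; // placeholder name

export const importProductsFile = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const name = event.queryStringParameters?.name;

  // Return a 400 validation error when the `name` query parameter is missing.
  if (!name) {
    return { statusCode: 400, body: JSON.stringify({ message: "name is required" }) };
  }

  // Suffix the file name with a timestamp so an existing file is not overwritten.
  const key = `uploaded/${name.replace(/\.csv$/i, "")}-${Date.now()}.csv`;

  // Create a Signed URL that lets the client PUT the CSV into the bucket.
  const signedUrl = await getSignedUrl(
    s3,
    new PutObjectCommand({ Bucket: BUCKET, Key: key, ContentType: "text/csv" }),
    { expiresIn: 60 }
  );

  return { statusCode: 200, body: JSON.stringify({ url: signedUrl }) };
};
```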
Task 5.3 (Pull Request #26)
✔️ Create a lambda function `importFileParser` which will be triggered by an S3 event.
✔️ The event should be s3:ObjectCreated:*
✔️ Configure the event to be fired only by changes in the uploaded folder in S3.
✔️ The lambda function uses a readable stream to get an object from S3, parses it with the `csv-parser` package, and logs each record so it can be shown in CloudWatch (see the sketch after the additional tasks below).
Additional tasks:
➕ Business logic of the `importProductsFile` lambda is covered by unit tests
➕ At the end of the stream, the lambda function moves the file from the `uploaded` folder into the `parsed` folder
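
For context, a minimal sketch of the `importFileParser` logic, assuming AWS SDK v3 and the `csv-parser` package; folder names follow the task description, and error handling is reduced to the essentials (S3 has no rename, so the move is a copy followed by a delete):

```typescript
// Hypothetical sketch of the importFileParser handler (not the project's exact code).
import {
  S3Client,
  GetObjectCommand,
  CopyObjectCommand,
  DeleteObjectCommand,
} from "@aws-sdk/client-s3";
import type { S3Event } from "aws-lambda";
import csv from "csv-parser";
import type { Readable } from "stream";

const s3 = new S3Client({});

export const importFileParser = async (event: S3Event): Promise<void> => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    // Get the uploaded object as a readable stream.
    const { Body } = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
    const stream = Body as Readable;

    // Pipe the stream through csv-parser and log every parsed record to CloudWatch.
    await new Promise<void>((resolve, reject) => {
      stream
        .pipe(csv())
        .on("data", (row) => console.log("Parsed record:", row))
        .on("end", resolve)
        .on("error", reject);
    });

    // At the end of the stream, move the file from uploaded/ into parsed/.
    const parsedKey = key.replace("uploaded/", "parsed/");
    await s3.send(
      new CopyObjectCommand({ Bucket: bucket, CopySource: `${bucket}/${key}`, Key: parsedKey })
    );
    await s3.send(new DeleteObjectCommand({ Bucket: bucket, Key: key }));
  }
};
```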
Links to deployment
Deployment links have been removed since they are outdated and newer tasks have been implemented since then.