The SFC Data Exchange API is a project that aims to enable organisations in the freight industry to share carbon footprint data in a decentralized and consistent way.
- Participants share data in a decentralized way
- Enforces a data model that ensures data consistency (see the sketch after this list)
- Share large data sets
- Data Providers can revoke access
- Data Consumers can view metadata of the carbon footprints they have access to
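For illustration only, the enforced data model could resemble a typed record like the sketch below; the field names here are hypothetical, and the project's actual schema is defined elsewhere.

```typescript
// Hypothetical illustration of a typed carbon footprint record.
// The real data model enforced by the API is defined by the project and will differ.
interface CarbonFootprint {
  id: string;
  companyId: string;
  co2eTotalKg: number; // total emissions in kilograms of CO2 equivalent
  transportMode: 'road' | 'rail' | 'sea' | 'air';
  reportingPeriodStart: string; // ISO 8601 date
  reportingPeriodEnd: string; // ISO 8601 date
}
```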
Uses Koa and TypeScript for improved performance and type safety. Implements a simple RESTful API with CRUD operations for app resources.
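As a rough sketch of what such an API surface looks like, the snippet below wires CRUD routes for a hypothetical `footprints` resource with Koa. The resource name, routes, and in-memory store are assumptions for illustration, not the project's actual endpoints.

```typescript
// Minimal sketch of a Koa CRUD resource in TypeScript.
// Assumes @koa/router and koa-bodyparser; the `footprints` resource and
// in-memory store are illustrative only.
import Koa from 'koa';
import Router from '@koa/router';
import bodyParser from 'koa-bodyparser';

interface Footprint {
  id: string;
  co2eKg: number;
}

const store = new Map<string, Footprint>();
const router = new Router({ prefix: '/footprints' });

// List all footprints.
router.get('/', (ctx) => {
  ctx.body = [...store.values()];
});

// Read a single footprint by id.
router.get('/:id', (ctx) => {
  const footprint = store.get(ctx.params.id);
  if (!footprint) {
    ctx.status = 404;
    return;
  }
  ctx.body = footprint;
});

// Create a footprint.
router.post('/', (ctx) => {
  const footprint = ctx.request.body as Footprint;
  store.set(footprint.id, footprint);
  ctx.status = 201;
  ctx.body = footprint;
});

// Delete a footprint.
router.delete('/:id', (ctx) => {
  store.delete(ctx.params.id);
  ctx.status = 204;
});

const app = new Koa();
app.use(bodyParser());
app.use(router.routes()).use(router.allowedMethods());
app.listen(3000);
```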
To install the project, follow these steps:
- Clone the repository using HTTPS or SSH
- Install dependencies
yarn
The project depends on Keycloak, Redis, and the EDC Connector.
You can spin up all those dependencies by running
docker-compose up
View Postman Docs here
Running the docker-compose up command will start:
- redis-server
- keycloak-server
- edc-manager
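As a quick sanity check that these containers are reachable, a sketch along the lines below pings Redis and Keycloak. It assumes the ioredis client and Node 18+ (for the global fetch), and reads hosts and ports from the environment variables described in the next step.

```typescript
// Sketch: verify that the docker-compose services are reachable.
// Assumes ioredis and Node 18+ (global fetch); hosts and ports fall back to
// common defaults if the environment variables below are not set yet.
import Redis from 'ioredis';

async function checkServices(): Promise<void> {
  const redis = new Redis({
    host: process.env.REDIS_HOST ?? 'localhost',
    port: Number(process.env.REDIS_PORT ?? 6379),
  });
  console.log('redis:', await redis.ping()); // expect "PONG"
  redis.disconnect();

  const keycloakUrl = `http://${process.env.KEYCLOAK_HOST ?? 'localhost'}:${process.env.KEYCLOAK_PORT ?? 8080}`;
  const response = await fetch(keycloakUrl);
  console.log('keycloak:', response.status);
}

checkServices().catch(console.error);
```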
Before starting the development servers, create a .envrc file with the following content:
export SFCAPI_BASEURL='<ask your team for the sfcapi_baseurl>'
export REDIS_URL='<ask your team for the redis_url>'
export REDIS_HOST='<ask your team for the redis host>'
export REDIS_PORT='<ask your team for the redis port>'
export REDIS_DATABASE='<ask your team for the redis database>'
export SUPPORTED_EDC_FILTER_OPERATORS='='
export CR_PAT='<ask your team for the cr_pat>'
export CONSUMER_CONNECTOR_CONFIG='<ask your team for the config>'
export PROVIDER_CONNECTOR_CONFIG='<ask your team for the config>'
export KEYCLOAK_HOST='<ask your team for the keycloak host>'
export KEYCLOAK_PORT='<ask your team for the keycloak port>'
export KEYCLOAK_ADMIN="<set the keycloak admin username>"
export KEYCLOAK_ADMIN_PASSWORD='<set the keycloak admin password>'
export AWS_REGION="<ask your team for the aws region>"
export AWS_ACCESS_ID="<ask your team for the aws access id>"
export AWS_SECRET="<ask your team for the aws secret>"
export PROVIDER_KEYCLOAK_PUBLIC_KEY='<get the public key from keycloak>'
export CONSUMER_KEYCLOAK_PUBLIC_KEY='<get the public key from keycloak>'
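To catch a missing variable early, a small typed loader along these lines could read the .envrc values at startup. This is a sketch only and does not reflect the project's actual configuration module.

```typescript
// Sketch: read the .envrc variables into a typed config object and fail fast
// when one is missing. The shape below is illustrative, not the project's
// actual configuration module.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

export const config = {
  sfcApiBaseUrl: requireEnv('SFCAPI_BASEURL'),
  redis: {
    url: requireEnv('REDIS_URL'),
    host: requireEnv('REDIS_HOST'),
    port: Number(requireEnv('REDIS_PORT')),
    database: Number(requireEnv('REDIS_DATABASE')),
  },
  keycloak: {
    host: requireEnv('KEYCLOAK_HOST'),
    port: Number(requireEnv('KEYCLOAK_PORT')),
    providerPublicKey: requireEnv('PROVIDER_KEYCLOAK_PUBLIC_KEY'),
    consumerPublicKey: requireEnv('CONSUMER_KEYCLOAK_PUBLIC_KEY'),
  },
  aws: {
    region: requireEnv('AWS_REGION'),
    accessId: requireEnv('AWS_ACCESS_ID'),
    secret: requireEnv('AWS_SECRET'),
  },
};
```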
The MVP version aims to ensure that all participants use a consistent and interoperable framework, allowing for seamless data exchange and collaboration.
Before starting, make sure you have adequate access to AWS services. If you need to request access, reach out to the AWS account owners; your direct lead can also help.
Retrieve the required environment variables from the project lead or other team members, and define them in your shell, or use a tool like direnv that loads and unloads environment variables depending on the current directory.
All of the services above need to be running before you start development.
Open three terminal windows and enter the following commands:
docker-compose up # starts the Docker containers for redis-server, keycloak, and both the EDC provider and consumer connectors
yarn dev:provider # starts the development `provider` server
yarn dev:consumer # starts the development `consumer` server
Also start sfc-unit after cloning and installing it from the sfc-unit repo:
yarn dev
See the Component Level Architecture diagram.
The entire system is built on the Node.js® runtime and written in TypeScript.
The `core` library aims to have as few dependencies as possible. The two most important ones are the `services` and the `usecases`.
The `api` service aims to be a thin integration layer between the `core` library and the external endpoint. Here the server is a slim Koa instance.
The Node environment is eventually wrapped in a Docker image based on Alpine Linux.
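To illustrate that layering, the `api` service might expose a route that simply delegates to a `core` usecase, as in the sketch below. The usecase name, its inlined definition, and the header-based caller id are all hypothetical.

```typescript
// Sketch of the layering: the api service stays thin and only translates HTTP
// into a call to a core usecase. The usecase name, its inlined definition, and
// the header-based caller id are hypothetical.
import Koa from 'koa';
import Router from '@koa/router';

interface FootprintMetadata {
  id: string;
  owner: string;
  createdAt: string;
}

// In `core`: a usecase with no knowledge of HTTP or Koa.
async function getFootprintMetadata(consumerId: string): Promise<FootprintMetadata[]> {
  // ...query the services layer and filter by consumerId's access rights...
  return [];
}

// In `api`: a slim Koa route that delegates to the usecase.
const router = new Router();
router.get('/footprints/metadata', async (ctx) => {
  const consumerId = ctx.get('x-consumer-id'); // illustrative only, not the real auth flow
  ctx.body = await getFootprintMetadata(consumerId);
});

const app = new Koa();
app.use(router.routes());
app.listen(Number(process.env.PORT ?? 3000));
```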
Each commit pushed from a feature branch will trigger a pipeline that includes two stages: Lint and Test.
Once a merge request is accepted into the main branch, it will trigger the Lint, Test, Development, and Production jobs.
The sfc-backend uses GitHub Actions and GitHub Secrets for CI/CD. Available workflows:
- Triggered by: issue or issue_comment creation
- Watches: all files.
- Steps
- Update changes in Notion board
- Triggered by: any push to the main branch
- Watches: all files.
- Steps
- Checkout repository
- Login to ECR
- Build image and push it to AWS ECR
- Triggered by: any push to any branch
- Watches: all files.
- Steps
- Set up Node.js.
- Install dependencies using Yarn.
- Run unit tests.
- Run integration tests.
- Check linting.