- I recommend watching the first few minutes of this video by Alexey to understand how it's done. You can then follow the steps below.
- Create an ssh key in your local system in the `.ssh` folder - Guide
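  For example, a key pair can be generated with `ssh-keygen`; the file name `gcp` and the comment below are only placeholders, use whatever you prefer:

  ```bash
  # Generate a 2048-bit RSA key pair in ~/.ssh; "gcp" is an example file name
  ssh-keygen -t rsa -f ~/.ssh/gcp -C <username> -b 2048
  ```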
- Add the public key (`.pub`) to your VM instance - Guide
- Create a config file in your `.ssh` folder:

  ```bash
  touch ~/.ssh/config
  ```
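  If ssh later complains about permissions on this file, restricting it to your user usually fixes it:

  ```bash
  # ssh refuses a config file that is writable by other users
  chmod 600 ~/.ssh/config
  ```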
- Copy the following snippet into the config file, replacing the external IP addresses of the Kafka, Spark (master node), and Airflow VMs, along with your username and the path to your ssh private key:

  ```
  Host streamify-kafka
      HostName <External IP Address>
      User <username>
      IdentityFile <path/to/home/.ssh/keyfile>

  Host streamify-spark
      HostName <External IP Address Of Master Node>
      User <username>
      IdentityFile <path/to/home/.ssh/keyfile>

  Host streamify-airflow
      HostName <External IP Address>
      User <username>
      IdentityFile <path/to/home/.ssh/gcp>
  ```
- Once you are set up, you can SSH into the servers with the commands below, each in a separate terminal. Do not forget to update the IP addresses in the config file if a VM restarts.
  ```bash
  ssh streamify-kafka
  ssh streamify-spark
  ssh streamify-airflow
  ```
- You will have to forward ports from the VMs to your local machine to be able to see the Kafka and Airflow UIs. Check how to do that here.
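  As a sketch, the same Host aliases can be reused with ssh's `-L` flag; the port numbers below (9021 for the Kafka Control Center and 8080 for the Airflow webserver) are assumptions and may differ in your setup:

  ```bash
  # Forward the (assumed) Kafka Control Center UI on port 9021 to localhost
  ssh -L 9021:localhost:9021 streamify-kafka

  # Forward the (assumed) Airflow webserver on port 8080 to localhost
  ssh -L 8080:localhost:8080 streamify-airflow
  ```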