Commit 103332d (merge of parents 96e89ad and c23d238)

1 file changed: +8 -8 lines

Diff for: README.md (+8 -8)
@@ -47,22 +47,22 @@ We wanted to make it easy for 70 million deaf people across the world to be inde
 
 ## Setup
 
-* Use the command prompt to set up the environment using the requirements_cpu.txt and requirements_gpu.txt files.
+* Use the command prompt to set up the environment using the install_packages.txt and install_packages_gpu.txt files.
 
-`python -m pip install -r requirements_cpu.txt`
+`python -m pip install -r install_packages.txt`
 
 This will help you install all the libraries required for the project.
 
 ## Process
 
-* Run `set_hand_hist.py` to set the hand histogram for creating gestures.
+* Run `set_hand_histogram.py` to set the hand histogram for creating gestures.
 * Once you get a good histogram, save it in the code folder, or you can use the histogram created by us, which can be found [here](https://github.com/harshbg/Sign-Language-Interpreter-using-Deep-Learning/blob/master/Code/hist).
 * Add gestures and label them using OpenCV, which uses the webcam feed, by running `create_gestures.py`; the images are stored in a database. Alternatively, you can use the gestures created by us [here](https://github.com/harshbg/Sign-Language-Interpreter-using-Deep-Learning/tree/master/Code).
-* Add different variations to the captured gestures by flipping all the images using `flip_images.py`.
+* Add different variations to the captured gestures by flipping all the images using `Rotate_images.py`.
 * Run `load_images.py` to split all the captured gestures into training, validation, and test sets.
-* To view all the gestures, run `display_all_gestures.py`.
-* Train the model using Keras by running `cnn_keras.py`.
-* Run `fun_util.py`. This will open up the gesture recognition window, which will use your webcam to interpret the trained American Sign Language gestures.
+* To view all the gestures, run `display_gestures.py`.
+* Train the model using Keras by running `cnn_model_train.py`.
+* Run `final.py`. This will open up the gesture recognition window, which will use your webcam to interpret the trained American Sign Language gestures.
 
 ## Code Examples
 
@@ -167,4 +167,4 @@ If you loved what you read here and feel like we can collaborate to produce some
 just want to shoot a question, please feel free to connect with me on <a href="[email protected]" target="_blank">email</a>,
 <a href="http://bit.ly/2uOIUeo" target="_blank">LinkedIn</a>, or
 <a href="http://bit.ly/2CZv1i5" target="_blank">Twitter</a>.
-My other projects can be found [here](http://bit.ly/2UlyFgC).
+My other projects can be found [here](http://bit.ly/2UlyFgC).
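
The `set_hand_histogram.py` step referenced in the diff builds a skin-colour histogram that the later scripts use to segment the hand from the webcam feed. Below is a minimal sketch of that technique using OpenCV histogram back-projection; the ROI coordinates, bin counts, and threshold are illustrative assumptions, not the repository's actual values.

```python
import cv2

def build_hand_histogram(frame, roi=(100, 100, 200, 200)):
    """Build an HSV histogram from a rectangle assumed to contain the hand."""
    x, y, w, h = roi
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    # Hue/saturation histogram; the value channel is dropped for some
    # robustness to lighting changes.
    hist = cv2.calcHist([hsv], [0, 1], None, [180, 256], [0, 180, 0, 256])
    return cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

def segment_hand(frame, hist):
    """Back-project the histogram onto a frame to get a binary hand mask."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], 1)
    back_proj = cv2.GaussianBlur(back_proj, (11, 11), 0)  # suppress speckle
    _, mask = cv2.threshold(back_proj, 50, 255, cv2.THRESH_BINARY)
    return mask
```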

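The augmentation step (`flip_images.py`, renamed to `Rotate_images.py` in this commit) doubles the dataset by mirroring each capture. A minimal sketch, assuming grayscale .jpg gestures stored under a gestures/ directory (the layout and file naming here are assumptions):

```python
import os
import cv2

def flip_gesture_images(root="gestures"):
    """Save a horizontally flipped copy of every captured gesture image."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".jpg"):
                continue
            img = cv2.imread(os.path.join(dirpath, name), cv2.IMREAD_GRAYSCALE)
            if img is None:  # skip unreadable files
                continue
            flipped = cv2.flip(img, 1)  # flipCode=1 mirrors around the vertical axis
            cv2.imwrite(os.path.join(dirpath, "flipped_" + name), flipped)
```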
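Finally, `cnn_model_train.py` trains the gesture classifier with Keras. The sketch below shows the general shape such a script might take; the input size, class count, and hyperparameters are placeholders, not the repository's actual configuration.

```python
from tensorflow.keras import layers, models

def build_model(input_shape=(50, 50, 1), num_classes=44):
    """A small CNN for one-channel gesture images; all sizes are assumptions."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),  # regularize against the small dataset
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Typical usage once load_images.py has produced the splits:
# model = build_model()
# model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=15)
```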