About the Project

CLIF (Communicative Learning Intelligent Friend) facilitates communication for deaf and mute individuals. Using computer vision and machine learning, CLIF interprets sign language gestures captured by a camera in real time, then translates them into text and speech relayed through a speaker and an LCD. The device bridges the gap between signed and spoken language, empowering individuals with hearing and speech impairments to engage more fully with others.

Link to Devpost: https://devpost.com/software/clif-cj1ql7?ref_content=my-projects-tab&ref_feature=my_projects
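The recognition stage described above can be sketched roughly as follows. This is a minimal illustrative example, not CLIF's actual code: it assumes each camera frame has already been reduced (e.g. by MediaPipe Hands) to a flat vector of 21 hand-landmark (x, y) coordinates, and it uses synthetic placeholder data and made-up gesture labels ("hello", "thanks") to show how a Scikit-Learn classifier maps a landmark vector to a word.

```python
# Hypothetical sketch of the gesture-classification stage.
# Assumption: MediaPipe Hands has already converted each frame into
# 21 landmarks, flattened to a 42-dimensional (x, y) feature vector.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

N_LANDMARKS = 21               # MediaPipe Hands emits 21 landmarks per hand
N_FEATURES = N_LANDMARKS * 2   # one (x, y) pair per landmark

rng = np.random.default_rng(0)

# Synthetic training set: two pretend gestures clustered in feature space.
X_train = np.vstack([
    rng.normal(loc=0.3, scale=0.05, size=(20, N_FEATURES)),  # "hello"
    rng.normal(loc=0.7, scale=0.05, size=(20, N_FEATURES)),  # "thanks"
])
y_train = np.array(["hello"] * 20 + ["thanks"] * 20)

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)

# At run time, each incoming landmark vector is classified into a word,
# which would then be spoken through the speaker and shown on the LCD.
frame_landmarks = rng.normal(loc=0.3, scale=0.05, size=(1, N_FEATURES))
predicted_word = clf.predict(frame_landmarks)[0]
print(predicted_word)  # → hello
```

In a real pipeline the training vectors would come from recorded sign-language samples rather than random noise, and the predicted word would be pushed over serial to the Arduino Uno driving the LCD and speaker.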

Built With

Software:

  • C++
  • Python
  • OpenCV
  • MediaPipe
  • Scikit-Learn

Hardware:

  • Arduino Uno
  • Wireless Speaker
  • LCD

Getting Started

Prerequisites

None! CLIF runs entirely on its own, as long as it is connected to a power source.

Installing & Executing Program

CLIF requires no additional programs to be installed. Simply turn on the device and watch it go to work.

Creators

Charles Eret, Brandon Kong, Omer Sajid, Shawn Yang, Sean Zhang