FAQ
Frequently asked questions
This is a project to bring free and convenient technology to the disabled community. Imagine a mute person who can give an eloquent presentation on stage using sign language, or a deaf person who can immediately understand what you say through sign language.
We use a Machine Learning model for image classification to recognize hand gestures. We trained the model for over 1,000 hours on more than 15,000 photos of different hand gestures, taken from many angles and in varied lighting conditions.
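To give a rough idea of what image-classification inference on a hand-gesture photo can look like, here is a minimal sketch. It assumes a TensorFlow/Keras setup; the model file name "hand_sign_model.h5", the 224x224 input size, and the label list are hypothetical placeholders, not details taken from this project.

```python
# Minimal sketch of hand-gesture image classification inference (assumed TensorFlow/Keras setup).
import numpy as np
import tensorflow as tf

# Hypothetical label set; the real project's classes may differ.
LABELS = ["A", "B", "C", "hello", "thank you"]

def classify_gesture(image_path: str) -> str:
    """Return the predicted sign label for a single hand-gesture photo."""
    # Hypothetical saved model file; swap in the actual trained model.
    model = tf.keras.models.load_model("hand_sign_model.h5")
    img = tf.keras.utils.load_img(image_path, target_size=(224, 224))
    x = tf.keras.utils.img_to_array(img) / 255.0   # scale pixel values to [0, 1]
    x = np.expand_dims(x, axis=0)                  # add a batch dimension
    probs = model.predict(x)[0]                    # class probabilities
    return LABELS[int(np.argmax(probs))]

if __name__ == "__main__":
    print(classify_gesture("example_gesture.jpg"))
```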
These are words that you can sign with only one hand, for example the letters of the alphabet (A, B, C, D, etc.) or words and phrases like "I love you", "thank you", and "hello".
These are words that require both hands to express their meaning.