Body gesture is the primary mode of non-verbal communication for deaf and mute people. A novel sign language recognition procedure is therefore presented here, in which the movements of the hands play a pivotal role in such communication. Microsoft's Kinect sensor serves as the medium for interpreting this communication, tracking the movement of the human body through 20 skeletal joints. A procedural approach is developed to recognise unknown gestures by generating the in-order expression of an AVL tree as a feature. Twelve gestures are considered, and a kernel function-based support vector machine is employed for classification, yielding a gesture recognition accuracy of 88.3%. The foremost goal is to develop an algorithm that acts as a medium of human–computer interaction for deaf and mute people. The novelty lies in the fact that, for gesture recognition in sign language interpretation, the whole body of the subject is represented by a hierarchical balanced tree (here, an AVL tree).
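The core idea of the feature extraction can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact pipeline: the choice of keying each joint by its distance from a reference joint, the joint names, and the sample values below are all assumptions made for demonstration.

```python
# Sketch: build a self-balancing AVL tree over body joints and use its
# in-order traversal ("in-order expression") as a gesture feature vector.

class Node:
    def __init__(self, key, label):
        self.key, self.label = key, label
        self.left = self.right = None
        self.height = 1

def _h(n): return n.height if n else 0
def _bal(n): return _h(n.left) - _h(n.right)

def _rot_right(y):
    x = y.left
    y.left, x.right = x.right, y
    y.height = 1 + max(_h(y.left), _h(y.right))
    x.height = 1 + max(_h(x.left), _h(x.right))
    return x

def _rot_left(x):
    y = x.right
    x.right, y.left = y.left, x
    x.height = 1 + max(_h(x.left), _h(x.right))
    y.height = 1 + max(_h(y.left), _h(y.right))
    return y

def insert(root, key, label):
    """Standard AVL insertion with rebalancing rotations."""
    if root is None:
        return Node(key, label)
    if key < root.key:
        root.left = insert(root.left, key, label)
    else:
        root.right = insert(root.right, key, label)
    root.height = 1 + max(_h(root.left), _h(root.right))
    b = _bal(root)
    if b > 1 and key < root.left.key:          # left-left case
        return _rot_right(root)
    if b < -1 and key >= root.right.key:       # right-right case
        return _rot_left(root)
    if b > 1:                                  # left-right case
        root.left = _rot_left(root.left)
        return _rot_right(root)
    if b < -1:                                 # right-left case
        root.right = _rot_right(root.right)
        return _rot_left(root)
    return root

def inorder(root, out):
    """In-order traversal yields the sorted 'expression' of joint labels."""
    if root:
        inorder(root.left, out)
        out.append(root.label)
        inorder(root.right, out)
    return out

# Hypothetical keying: each joint's distance (in metres) from the hip centre,
# as might be derived from Kinect skeletal data for one frame.
joints = [("head", 0.82), ("hand_r", 0.95), ("elbow_r", 0.48),
          ("shoulder_r", 0.31), ("hand_l", 0.20)]
root = None
for label, dist in joints:
    root = insert(root, dist, label)
feature = inorder(root, [])   # in-order expression used as the gesture feature
print(feature)
```

The in-order sequence of joint labels then serves as a fixed, order-invariant encoding of the body configuration that can be fed to a kernel SVM for classification.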