In this work we focus on demonstrating a real-time communication interface that enhances text communication by detecting emotions in text as it is typed and displaying appropriate facial expression images on the screen in real time. The displayed expressions are represented as expressive images or sketches of the communicating persons. The interface makes use of a real-time emotion extraction engine for text developed in this work. The emotion extraction engine and its extraction rules are discussed, together with a description of the interface, its limitations, and future directions for such an interface. The extracted emotions are mapped onto displayed facial expressions. Such an interface can be used as a platform for a number of future CMC experiments. The developed online communication interface brings remotely located collaborating parties together in a shared electronic space for their communication. In its current state, the interface allows a participant to see at a glance all other online participants and all those who are engaged in communication. An important aspect of the interface is that, for two users engaged in communication, it automatically extracts emotional states locally from the content of the typed textual sentences. It then displays discrete expressions, mapped from the extracted emotions, on the remote screen of the other person. It also analyses and extracts the intensity and duration of the emotional states.
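To make the pipeline concrete, the following is a minimal sketch of how typed text could be mapped to a discrete emotion, a rough intensity, and an expression image to display on the remote screen. The keyword rules, image file names, and intensity heuristic here are hypothetical placeholders and are not the extraction rules of the developed engine, which are described later.

```python
# Illustrative sketch only: rule tables, file paths, and the intensity heuristic
# are assumptions, not the paper's actual extraction engine.
import re
from dataclasses import dataclass

# Hypothetical keyword rules: emotion -> trigger tokens
EMOTION_RULES = {
    "happy":     {"glad", "happy", "great", ":)"},
    "sad":       {"sad", "sorry", "unhappy", ":("},
    "angry":     {"angry", "furious", "annoyed"},
    "surprised": {"wow", "surprised", "unbelievable"},
}

# Hypothetical mapping from a discrete emotion to an expression image
# (or sketch) of the communicating person
EXPRESSION_IMAGES = {
    "happy": "expressions/happy.png",
    "sad": "expressions/sad.png",
    "angry": "expressions/angry.png",
    "surprised": "expressions/surprised.png",
    "neutral": "expressions/neutral.png",
}

@dataclass
class EmotionResult:
    emotion: str
    intensity: float  # rough 0..1 estimate, could control display duration
    image: str        # expression image shown on the remote screen

def extract_emotion(sentence: str) -> EmotionResult:
    """Return the first matching emotion in a typed sentence, with a crude intensity."""
    tokens = set(re.findall(r"[:\(\)\w']+", sentence.lower()))
    for emotion, triggers in EMOTION_RULES.items():
        if tokens & triggers:
            # Crude heuristic: exclamation marks and intensifiers raise intensity
            intensity = 0.5 + 0.2 * sentence.count("!")
            if "very" in tokens or "really" in tokens:
                intensity += 0.2
            return EmotionResult(emotion, min(intensity, 1.0), EXPRESSION_IMAGES[emotion])
    return EmotionResult("neutral", 0.0, EXPRESSION_IMAGES["neutral"])

if __name__ == "__main__":
    result = extract_emotion("I'm really glad you could join!!")
    print(result)  # EmotionResult(emotion='happy', intensity=1.0, image='expressions/happy.png')
```

In the actual interface this extraction step runs locally on the sender's machine, and only the resulting discrete expression (with its intensity/duration) is rendered on the other person's screen.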