Eye tracking is one of the most natural ways for people with amyotrophic lateral sclerosis (ALS) and other conditions that cause locked-in syndrome or paralysis to communicate. Most existing eye-tracking computer input systems use cameras to capture images of the user's eyes and track pupil movements; such camera-based systems are typically expensive and not user-friendly. This paper proposes an eye-tracking system called EyeLive that uses discrete infrared sensors and emitters as the input device. The eye-tracking and calibration algorithms classify eye positions into six categories: looking up, looking down, looking left, looking right, looking straight ahead (i.e., the middle direction), and eyes closed. A graphical user interface optimized for EyeLive is also developed; it divides the screen into a nine-cell grid and uses a hierarchical selection approach for text input. EyeLive's hardware, eye-tracking algorithm, calibration procedure, and user interface are compared with those of existing eye-tracking systems, and the system's advantages, such as low cost, user-friendliness, and reduced eye strain, are discussed. The system's accuracy in classifying eye positions is evaluated experimentally with eight healthy participants. The results show a 5.6% error rate in classifying eye positions across the five gaze directions and a 0% error rate in detecting closed eyes. Additional experiments show average typing speeds of 1.95 words/min for a novice user and 2.91 words/min for an experienced user. The tradeoff between this lower typing speed and the system's other advantages, relative to existing systems, is examined.