ABSTRACT: The study of QT dispersion (QTd) is of increasing clinical interest, but there are very few data from large healthy populations. Furthermore, there is still debate over the extent to which QTd reflects measurement error. This study addresses these problems.
Twelve-lead ECGs recorded on 1501 apparently healthy adults and 1784 healthy neonates, infants, and children were used to derive normal limits of QTd and QT intervals by use of a fully automated approach. No age gradient or sex differences in QTd were seen and it was found that an upper limit of 50 ms was highly specific. Three-orthogonal-lead ECGs (n=1220) from the Common Standards for Quantitative Electrocardiography database were used to generate derived 12-lead ECGs, which had a significant increase in QTd of 10.1+/-13.1 ms compared with the original orthogonal-lead ECG but a mean difference of only 1.63+/-12.2 ms compared with the original 12-lead ECGs. In a population of 361 patients with old myocardial infarction, there was a statistically significant increase in mean QTd compared with that of the adult normal group (32.7+/-10.0 versus 24.53+/-8.2 ms; P<0.0001). An estimate of computer measurement error was also obtained by creating 2 sets of 1220 ECGs from the original set of 1220. The mean error (difference in QTd on a paired basis) was found to be 0.28+/-9.7 ms.
These data indicate that QTd is age and sex independent, has a highly specific upper normal limit of 50 ms, is significantly lower in the 3-orthogonal-lead than in the 12-lead ECG, and is longer in patients with a previous myocardial infarction than in normal subjects.
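The QTd measure described above can be sketched in a few lines: it is simply the range of QT intervals across the measured leads, with the study's 50 ms upper limit of normal applied as a cutoff. The QT values below are hypothetical, for illustration only.

```python
# Minimal sketch of QT dispersion (QTd), assuming per-lead QT intervals
# in milliseconds. QTd is the range (max - min) of QT across the leads;
# the study reports 50 ms as a highly specific upper limit of normal.

def qt_dispersion(qt_intervals_ms):
    """Return QTd (max - min QT) over a set of per-lead QT intervals."""
    return max(qt_intervals_ms) - min(qt_intervals_ms)

def exceeds_normal_limit(qt_intervals_ms, limit_ms=50):
    """Flag a QTd above the 50 ms upper limit of normal."""
    return qt_dispersion(qt_intervals_ms) > limit_ms

# Hypothetical QT measurements (ms) from a 12-lead ECG:
qts = [382, 390, 401, 388, 395, 379, 384, 392, 398, 386, 391, 387]
print(qt_dispersion(qts))         # → 22
print(exceeds_normal_limit(qts))  # → False
```

In practice the automated program also has to decide where each lead's T wave ends, which is where most of the measurement error discussed above arises; that step is not modelled here.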
ABSTRACT: The coefficient of variation is a popular measure for describing the amount of repeat variability present in ECG measurements from recording to recording. However, it can be misleading. The aim of the present study was to assess repeat variation in computer-measured ECG criteria, i.e. reclassification from positive to negative or vice versa, and compare this with the coefficient of variation.
Two ECGs were obtained from each of 295 patients, one day apart, and separately from a further 364 patients, several minutes apart. All patients were considered to be in a stable condition. Estimates of the coefficients of variation were obtained for a number of ECG parameters used in the diagnosis of left ventricular hypertrophy. Corresponding reclassification rates of relevant ECG criteria were also calculated. Large coefficients of variation were observed in voltage parameters, e.g. R in V5 (20% for day-to-day recordings and 6% for minute-to-minute recordings) while the corresponding reclassification rates were 8% and 0% respectively. The repeat variation in the diagnosis of left ventricular hypertrophy was up to 5% for day-to-day recordings and up to 3% for minute-to-minute recordings based on several different criteria.
A large coefficient of variation in a particular variable does not necessarily correspond to a high reclassification rate. A better measure of the impact of ECG variability for a particular measurement is obtained from its reclassification rate. In turn, this may have a minimal effect on the overall diagnosis of a particular abnormality.
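The contrast drawn above between the two measures can be illustrated directly: a voltage parameter may vary considerably between recordings (large CV) while rarely crossing the diagnostic threshold (low reclassification rate). The threshold and paired amplitudes below are hypothetical, not taken from the study.

```python
# Illustrative contrast between the coefficient of variation and the
# reclassification rate for a paired ECG measurement (e.g. R amplitude
# in V5 against a voltage criterion). Data and threshold are invented.
import statistics

def coefficient_of_variation(values):
    """CV as a percentage: sample SD relative to the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def reclassification_rate(pairs, threshold):
    """Fraction of recording pairs where the criterion flips sign."""
    flips = sum((a > threshold) != (b > threshold) for a, b in pairs)
    return flips / len(pairs)

# Paired day-to-day R(V5) amplitudes in mV; criterion positive if > 2.6 mV.
pairs = [(1.8, 2.1), (1.2, 0.9), (3.0, 2.4), (2.2, 1.7), (1.5, 1.6)]
all_values = [v for pair in pairs for v in pair]
print(round(coefficient_of_variation(all_values), 1))  # sizeable CV
print(reclassification_rate(pairs, threshold=2.6))     # → 0.2
```

Only the third pair straddles the threshold, so despite visible recording-to-recording variability the criterion flips in just one pair in five.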
European Heart Journal 03/1998; 19(2):342-51.
ABSTRACT: The techniques that improve the overall repeatability of computer interpretation of electrocardiograms (ECGs) that have been recorded several minutes apart from patients in a clinically stable condition are described. Estimates of the normal amounts of variability present in many ECG parameters that are used in the identification of a variety of cardiac abnormalities have been adopted in conjunction with smoothing techniques to form the basis of the new methodology. When applied to the Glasgow ECG analysis program, these new methods improve overall repeatability by about 31% when tested on a set of 263 pairs of ECGs. Randomly generated noise was added to the test set and an additional technique aimed at removing noise from the ECG tracings was used in conjunction with the smoothing methods. The observed improvement over the original repeatability was 63%.
ABSTRACT: Statistically based smoothing techniques are described which have been applied to the existing framework of the Glasgow ECG Analysis program. These methods have been designed with the aim of improving repeatability in the computer interpretation of ECGs which have been recorded either several minutes or 24 hours apart from patients in a clinically stable condition. With respect to the ECG diagnosis of Left Ventricular Hypertrophy (LVH), these flexible methods have the effect of reducing the number of inconsistent day-to-day interpretations by 36%, from 33 to 21, in 330 pairs of ECGs recorded one day apart. Similarly, when comparing agreement in the diagnosis of LVH in 249 pairs of ECGs which were recorded several minutes apart, the number of discordant computer interpretations was 6 using the new methodology, compared with 13 using conventional criteria, i.e. there was a 54% reduction in disagreements.
Methods of Information in Medicine 07/1995; 34(3):272-82.
ABSTRACT: The effects of age, sex, and race on the electrocardiogram (ECG) were studied using three separate populations: a pediatric group of 1,782 neonates, infants, and children, an adult white group of 1,555 individuals, and an adult Chinese cohort of 503 individuals. All ECGs were processed using the same computer program, and various interval measurements were derived, including QRS duration, heart rate, QT dispersion, and selected Q-wave durations. Also, a small subgroup of 195 white subjects had a signal-averaged ECG recorded. In the pediatric group, there was a clear link between age and QRS duration, which increased linearly from about 1 year of age to adolescence. In the adults, the principal difference was an increased QRS duration in men compared with women, both in the standard and signal-averaged ECG. Upper limits of normal heart rate also tended to be higher in women than in men in the two adult populations. Small racial differences could be seen in some measurements, but were not thought to be of clinical significance.
ABSTRACT: This study describes the implementation of novel techniques that have been designed with the aim of improving the repeatability of the diagnostic section of the Glasgow electrocardiographic (ECG) analysis program. Specific reference is made to the agreement in consecutive computer-assisted diagnoses of inferior myocardial infarction (IMI). Inherent repeat variation was estimated in ECG parameters of interest and used in conjunction with smoothing methods to produce a continuous Q-wave index ranging from 0 (no IMI) to 1 (IMI). A decision as to the presence or absence of IMI was then made on the basis of this smooth index. The sensitivity and specificity of the new approach remain unchanged from the conventional procedure when analyzing single ECGs. However, consistency in day-to-day and minute-to-minute ECG interpretations was enhanced, particularly in consecutive pairs of computer-assisted diagnoses from the same patient in which one or both interpretations were that of IMI.
ABSTRACT: This study describes a method for improving the day-to-day and minute-to-minute repeatability of the deterministic computer-assisted diagnosis of left ventricular hypertrophy. Conventional upper limits of normal for many age-dependent electrocardiographic parameters have been replaced by continuous equations, thereby eliminating points of discontinuity that can contribute to lack of repeatability. Estimates of the normal amounts of variability present in many electrocardiographic parameters that are used in the diagnosis of left ventricular hypertrophy are calculated. These estimates are then used together with a smoothed version of a score function to form the basis of the new technique. The implementation of smoothing techniques enhances the repeatability of the Glasgow electrocardiographic analysis program. With respect to the electrocardiographic diagnosis of left ventricular hypertrophy, these methods eliminate 44% of the day-to-day and 50% of the minute-to-minute inconsistencies in computer reports.
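The core idea above, replacing a hard threshold with a continuous score whose width reflects the measurement's repeat variability, can be sketched as follows. The logistic ramp, threshold, and repeat-SD values are illustrative assumptions, not the Glasgow program's actual criteria.

```python
# Sketch of smoothing a binary diagnostic criterion, assuming a logistic
# ramp whose width is scaled by the estimated repeat variability of the
# measurement. All parameter values here are hypothetical.
import math

def hard_criterion(value, threshold):
    """Conventional binary criterion: positive if above the threshold."""
    return value > threshold

def smooth_score(value, threshold, repeat_sd):
    """Continuous 0..1 score; the ramp width tracks repeat variability."""
    return 1 / (1 + math.exp(-(value - threshold) / repeat_sd))

# Two recordings a day apart, straddling a hypothetical 2.6 mV limit:
day1, day2, limit, sd = 2.58, 2.63, 2.6, 0.15
print(hard_criterion(day1, limit), hard_criterion(day2, limit))  # → False True
print(round(smooth_score(day1, limit, sd), 2),
      round(smooth_score(day2, limit, sd), 2))  # → 0.47 0.55
```

A 0.05 mV drift flips the binary criterion outright, whereas the smoothed score moves only slightly, which is how a decision based on the continuous index avoids inconsistent day-to-day reports near the limit.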
Journal of Electrocardiology 02/1993; 26 Suppl:101-7.
ABSTRACT: This paper describes the methods currently used in Glasgow Royal Infirmary for computer analysis of electrocardiograms. The software is designed to analyse from 3 to 15 simultaneously recorded leads, with facilities for analysis of rhythm and serial changes. Options for Minnesota Code (with serial comparison) and XYZ lead interpretation are available.
Methods of Information in Medicine 10/1990; 29(4):354-61.
ABSTRACT: A study of more than 1,780 neonates, infants, and children was carried out, using a digital electrocardiograph with a sampling rate of 500 samples per second, to revise the normal limits of the pediatric ECG. The 12-lead ECG was used with V4R replacing V3. All leads were recorded simultaneously off-line in digital form on magnetic tape and were subsequently analyzed using well-established computing techniques. The results showed that the upper 98th percentile limit of normal amplitudes could be up to 46% higher than previously published limits. Differences in some mean values were very much higher, though these are of less clinical significance. In addition, QRS durations were found to be wider than previously published data. Sex-related differences could be demonstrated in both amplitude and duration measurements, particularly in the early adolescent years. This study confirms that to record pediatric ECGs with high fidelity, it is necessary to use equipment that converts the ECG from analog to digital form at a rate of 500 samples/sec. Significant errors in amplitude and duration measurements may be expected if a much lower sampling rate is utilized.