The log transformation is special

Department of Medical Statistics, Glaxo Research and Development Ltd., Greenford, Middlesex, U.K.
Statistics in Medicine (Impact Factor: 2.04). 04/1995; 14(8):811-9. DOI: 10.1002/sim.4780140810
Source: PubMed

ABSTRACT The logarithmic (log) transformation is a simple yet controversial step in the analysis of positive continuous data measured on an interval scale. Situations where a log transformation is indicated will be reviewed. This paper contends that the log transformation should not be classed with other transformations, as it has particular advantages. Problems with using the data themselves to decide whether or not to transform will be discussed. It is recommended that log-transformed analyses should frequently be preferred to untransformed analyses and that careful consideration should be given to the use of a log transformation at the protocol design stage.
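As an illustrative sketch (not taken from the paper), the interpretive advantage the abstract alludes to can be shown in a few lines of Python: a difference of means on the log scale back-transforms to a ratio of geometric means, so effects become multiplicative and unit-free. The function name and data below are hypothetical:

```python
import math
import statistics

def log_scale_comparison(group_a, group_b):
    """Compare two positive-valued samples on the natural-log scale.

    The difference of mean log values back-transforms to a ratio of
    geometric means: a multiplicative, unit-free treatment effect.
    """
    log_a = [math.log(x) for x in group_a]
    log_b = [math.log(x) for x in group_b]
    diff_of_means = statistics.mean(log_a) - statistics.mean(log_b)
    return math.exp(diff_of_means)  # ratio of geometric means

# Doubling every value in a sample yields a ratio of exactly 2,
# regardless of the spread of the underlying values.
a = [2.0, 8.0, 32.0]
b = [1.0, 4.0, 16.0]
ratio = log_scale_comparison(a, b)  # ≈ 2.0
```

This is why confidence intervals computed on the log scale are usually reported as ratios after back-transformation.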

    • "n/N control; n/N HF). Continuous scale data was log10-transformed [38] and differences between control and HF animals determined using linear mixed modelling to account for instances where multiple cellular observations (n) were obtained from each experimental subject (N) (IBM SPSS Statistics v20). Where treatments/effects were within the same animal, cellular differences were assessed using a paired Student's t-test and considered significant when P < 0.05. "
    ABSTRACT: Heart failure (HF) is commonly associated with reduced cardiac output and an increased risk of atrial arrhythmias, particularly during β-adrenergic stimulation. The aim of the present study was to determine how HF alters systolic Ca2+ and the response to β-adrenergic (β-AR) stimulation in atrial myocytes. HF was induced in sheep by ventricular tachypacing and changes in intracellular Ca2+ concentration studied in single left atrial myocytes under voltage and current clamp conditions. The following were all reduced in HF atrial myocytes: Ca2+ transient amplitude (by 46% in current clamped and 28% in voltage clamped cells), SR dependent rate of Ca2+ removal (kSR, by 32%), L-type Ca2+ current density (by 36%) and action potential duration (APD90 by 22%). However, in HF SR Ca2+ content was increased (by 19%) when measured under voltage-clamp stimulation. Inhibiting the L-type Ca2+ current (ICa-L) in control cells reproduced both the decrease in Ca2+ transient amplitude and the increase in SR Ca2+ content observed in voltage-clamped HF cells. During β-AR stimulation Ca2+ transient amplitude was the same in control and HF cells. However, ICa-L remained lower in HF than control cells, whilst SR Ca2+ content was highest in HF cells during β-AR stimulation. The decrease in ICa-L that occurs in HF atrial myocytes appears to underpin the decreased Ca2+ transient amplitude and increased SR Ca2+ content observed in voltage-clamped cells.
    Journal of Molecular and Cellular Cardiology 11/2014; 79. DOI:10.1016/j.yjmcc.2014.11.017 · 5.22 Impact Factor
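The analysis pipeline in the excerpt above (log10-transform the data, then compare paired within-animal measurements) can be sketched under simple assumptions. The function name and the paired data below are hypothetical; the point is that the mean of paired log10 differences back-transforms to a fold change:

```python
import math
import statistics

def paired_log10_fold_change(control, treated):
    """Back-transform the mean of paired log10 differences to a fold change.

    Mirrors the idea of analysing log10-transformed data: in a paired
    design, each difference is taken within one experimental subject,
    and the back-transformed mean difference is a multiplicative effect.
    """
    diffs = [math.log10(t) - math.log10(c) for c, t in zip(control, treated)]
    return 10 ** statistics.mean(diffs)

# Hypothetical paired measurements (same subjects, two conditions).
control = [1.0, 10.0, 100.0]
treated = [3.0, 30.0, 300.0]
fold_change = paired_log10_fold_change(control, treated)  # ≈ 3.0
```

A paired t-test would then be run on the `diffs` values; a mixed model, as in the excerpt, generalises this to several observations per subject.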
    • "We used the natural log transformation for fines + 0.05 and LWD + 1 (Keene, 1995). All variables were centered by subtracting the average value within their respective precipitation group. "
    ABSTRACT: Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found instream temperature and fine sediments in arid sites and only fine sediments in mesic sites accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments was minimal or not detected. Consequently, there was weak to no biological support for causal pathways related to anthropogenic drivers' impact on biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would have resulted in no evidence of human impacts on biological integrity in some instances. For programs targeting the effects of anthropogenic activities, it is imperative to identify both land use practices and mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question-driven monitoring, a necessary step in the practice of adaptive monitoring. We suspect our situation is not unique and many programs may suffer from the same inferential disconnect. Commonly, the survey design is optimized for robust estimates of regional status and trend detection and not necessarily to provide statistical inferences on the causal mechanisms outlined in the conceptual model, even though these relationships are typically used to justify and promote the long-term monitoring of a chosen ecological indicator. Our application demonstrates a process for empirical evaluation of conceptual models and exemplifies the need for such interim assessments in order for programs to evolve and persist.
    Ecological Indicators 11/2014; 50. DOI:10.1016/j.ecolind.2014.10.011 · 3.23 Impact Factor
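The excerpt above uses an additive offset inside the log, log(fines + 0.05) and log(LWD + 1), so that zero observations remain defined. A minimal sketch of that device, with hypothetical function names and data:

```python
import math
import statistics

def log_offset_transform(values, offset):
    """Natural-log transform with an additive offset so zeros stay defined,
    in the spirit of log(fines + 0.05) and log(LWD + 1)."""
    return [math.log(v + offset) for v in values]

def center_within_group(values):
    """Center by subtracting the group average, as the excerpt describes."""
    mu = statistics.mean(values)
    return [v - mu for v in values]

# Hypothetical data containing a true zero, which plain log() cannot handle.
fines = [0.0, 0.1, 0.4]
transformed = log_offset_transform(fines, 0.05)
centered = center_within_group(transformed)  # sums to (approximately) zero
```

The choice of offset matters: results can be sensitive to it, which is one of the practical caveats raised when log-transforming data that include zeros.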
    • "Parametric statistical tests were conducted on log(meta-d'/d'). A log-transformation weights observations automatically to a ratio scale (Keene, 1995), thus ascribing equal weight to increases and decreases in meta-d'/d' relative to a theoretically ideal value of 1. Meta-d' is theoretically bounded below by zero, but when fit using "
    ABSTRACT: Humans have the capacity to evaluate the success of cognitive processes, known as metacognition. Convergent evidence supports a role for anterior prefrontal cortex in metacognitive judgements of perceptual processes. However, it is unknown whether metacognition is a global phenomenon, with anterior prefrontal cortex supporting metacognition across domains, or whether it relies on domain-specific neural substrates. To address this question, we measured metacognitive accuracy in patients with lesions to anterior prefrontal cortex (n = 7) in two distinct domains, perception and memory, by assessing the correspondence between objective performance and subjective ratings of performance. Despite performing equivalently to a comparison group with temporal lobe lesions (n = 11) and healthy controls (n = 19), patients with lesions to the anterior prefrontal cortex showed a selective deficit in perceptual metacognitive accuracy (meta-d'/d', 95% confidence interval 0.28-0.64). Crucially, however, the anterior prefrontal cortex lesion group's metacognitive accuracy on an equivalent memory task remained unimpaired (meta-d'/d', 95% confidence interval 0.78-1.29). Metacognitive accuracy in the temporal lobe group was intact in both domains. Our results support a causal role for anterior prefrontal cortex in perceptual metacognition, and indicate that the neural architecture of metacognition, while often considered global and domain-general, comprises domain-specific components that may be differentially affected by neurological insult.
    Brain 08/2014; 137(10). DOI:10.1093/brain/awu221 · 10.23 Impact Factor
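The "equal weight to increases and decreases" property invoked in the excerpt above (analysing log(meta-d'/d') so the ideal ratio of 1 maps to 0) can be sketched directly; the function name below is hypothetical:

```python
import math

def log_ratio(ratio):
    """Natural log of an efficiency ratio such as meta-d'/d'.

    On this scale the theoretically ideal value 1 maps to 0, and a
    doubling and a halving sit equidistant from ideal; this is the
    symmetry that motivates testing on the log scale.
    """
    return math.log(ratio)

doubled = log_ratio(2.0)  # +log 2
halved = log_ratio(0.5)   # -log 2, symmetric about the ideal value of 1
```

On the raw ratio scale, by contrast, a doubling (distance 1 from the ideal) and a halving (distance 0.5) would be weighted unequally.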