Harold Christopher Burger

Max Planck Institute for Biological Cybernetics, Tübingen, Baden-Württemberg, Germany


Publications (10) · 3.57 total impact

  • ABSTRACT: In this paper, a novel algorithm is proposed for the automatic detection of snoring sounds in ambient acoustic data from a pediatric population. With the approval of the institutional ethics committee and of the parents, the respiratory sounds of 50 subjects were recorded using a pair of microphones and a multichannel data acquisition system, simultaneously with full-night polysomnography during sleep. Brief sound chunks of 0.5 s were classified as belonging to a snoring event or not by a multi-layer perceptron trained in a supervised fashion with stochastic gradient descent on a large hand-labeled dataset, using frequency-domain features. The overall accuracy of the proposed algorithm was 88.93% for primary snorers and 80.6% for obstructive sleep apnea (OSA) patients.
    No preview · Conference Paper · Apr 2014
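The pipeline described in the abstract can be sketched minimally: frequency-band features of 0.5 s chunks feed a small one-hidden-layer perceptron trained by stochastic gradient descent. Everything below (sampling rate, band count, network size, the synthetic "snore" and background signals) is an illustrative assumption, not the paper's actual setup or data.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 8000                        # assumed sampling rate (Hz)
chunk = int(0.5 * fs)            # 0.5 s sound chunks, as in the abstract

def features(x):
    # log-magnitude spectrum pooled into 16 coarse frequency bands
    spec = np.abs(np.fft.rfft(x))
    return np.log1p(np.array([b.mean() for b in np.array_split(spec, 16)]))

def make_chunk(snore):
    t = np.arange(chunk) / fs
    if snore:  # synthetic "snore": strong low-frequency harmonics
        x = np.sin(2 * np.pi * 110 * t) + 0.5 * np.sin(2 * np.pi * 220 * t)
    else:      # synthetic background: broadband noise
        x = rng.normal(size=chunk)
    return x + 0.1 * rng.normal(size=chunk)

X = np.array([features(make_chunk(i % 2 == 0)) for i in range(200)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(200)])
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)   # normalize features

# one-hidden-layer perceptron trained by plain stochastic gradient descent
W1 = 0.1 * rng.normal(size=(16, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.normal(size=8);       b2 = 0.0
lr = 0.1
for epoch in range(30):
    for i in rng.permutation(len(X)):
        h = np.tanh(X[i] @ W1 + b1)                  # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # snore probability
        g = p - y[i]                                 # cross-entropy gradient
        gh = g * W2 * (1.0 - h**2)                   # backprop through tanh
        W2 -= lr * g * h;  b2 -= lr * g
        W1 -= lr * np.outer(X[i], gh);  b1 -= lr * gh

p_all = 1.0 / (1.0 + np.exp(-(np.tanh(X @ W1 + b1) @ W2 + b2)))
acc = float(((p_all > 0.5) == (y == 1.0)).mean())
print(f"training accuracy: {acc:.2f}")
```

On this toy data the two classes are easily separable in the band features; the real task's hand-labeled recordings are far less clean.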
  • Harold Christopher Burger · Christian Schuler · Stefan Harmeling
    ABSTRACT: Different methods for image denoising have complementary strengths and can be combined to improve denoising performance, as has been noted by several authors [11,7]. Mosseri et al. [11] distinguish between internal and external methods, depending on whether they exploit internal or external statistics [13]. They also propose a rule-based scheme (PatchSNR) to combine these two classes of algorithms. In this paper, we test the underlying assumptions and show that many images might not be easily split into regions where internal methods or external methods are preferable. Instead, we propose a learning-based approach that uses a neural network to automatically combine the denoising results of an internal and an external method. This approach outperforms both other combination methods and state-of-the-art stand-alone image denoising methods, thereby further closing the gap to the theoretically achievable performance limits of denoising [9]. Our denoising results can be replicated with a publicly available toolbox.
    No preview · Chapter · Sep 2013
  • Christian J. Schuler · Harold Christopher Burger · Stefan Harmeling · Bernhard Schölkopf
    ABSTRACT: Image deconvolution is the ill-posed problem of recovering a sharp image given a blurry one generated by a convolution. In this work, we deal with space-invariant non-blind deconvolution. Currently, the most successful methods involve a regularized inversion of the blur in the Fourier domain as a first step. This step amplifies and colors the noise and corrupts the image information. In a second (and arguably more difficult) step, one then needs to remove the colored noise, typically using a cleverly engineered algorithm. However, methods based on this two-step approach do not properly address the fact that the image information has been corrupted. In this work, we also rely on a two-step procedure, but learn the second step on a large dataset of natural images using a neural network. We show that this approach outperforms the current state of the art on a large dataset of artificially blurred images, and we demonstrate its practical applicability in a real-world example with photographic out-of-focus blur.
    No preview · Conference Paper · Jun 2013
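The first step the abstract refers to, a regularized inversion of the blur in the Fourier domain, can be sketched in one dimension. The signal, kernel, noise level, and regularization weight below are illustrative assumptions; the learned second step (removing the colored residual noise) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
x = np.zeros(n)
x[n // 3 : 2 * n // 3] = 1.0                  # toy 1-D "sharp image": a box
k = np.ones(9) / 9.0                          # known 9-tap blur kernel
K = np.fft.fft(k, n)                          # kernel spectrum (zero-padded)
y = np.real(np.fft.ifft(np.fft.fft(x) * K))   # space-invariant blur
y += 0.01 * rng.normal(size=n)                # additive noise

# step 1: regularized inversion of the blur in the Fourier domain
lam = 1e-2                                    # regularization weight (assumed)
Xhat = np.fft.fft(y) * np.conj(K) / (np.abs(K) ** 2 + lam)
x1 = np.real(np.fft.ifft(Xhat))

mse_blurred = float(np.mean((y - x) ** 2))
mse_deblurred = float(np.mean((x1 - x) ** 2))
print(f"MSE blurred: {mse_blurred:.4f}, after inversion: {mse_deblurred:.4f}")
```

Near the zeros of the kernel spectrum the regularizer suppresses rather than restores content, and the remaining noise is frequency-dependent (colored), which is exactly what the second, learned step is trained to clean up.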
  • Mustafa Cavuşoğlu · Rolf Pohmann · Harold Christopher Burger · Kâmil Uludağ
    ABSTRACT: Most arterial spin labeling experiments assume a global transit delay time, with blood flowing from the tagging region to the imaging slice in plug flow, without any dispersion of the magnetization. However, because of cardiac pulsation, the nonuniform cross-sectional flow profile, and complex vessel networks, the transit delay time is not a single value but follows a distribution. In this study, we explored the regional effects of magnetization dispersion on quantitative perfusion imaging for transit times varying over a very large interval, based on a direct comparison of pulsed, pseudo-continuous, and dual-coil continuous arterial spin labeling encoding schemes. The longer distances between the tagging and imaging regions typically used for continuous tagging schemes enhance the regional bias in the quantitative cerebral blood flow measurement, causing an underestimation of up to 37% when plug flow is assumed, as in the standard model. Magn Reson Med, 2012. © 2012 Wiley Periodicals, Inc.
    No preview · Article · Feb 2013 · Magnetic Resonance in Medicine
  • Source
    Harold Christopher Burger · Christian J. Schuler · Stefan Harmeling
    ABSTRACT: Image denoising can be described as the problem of mapping from a noisy image to a noise-free image. In another paper, we show that multi-layer perceptrons can achieve outstanding image denoising performance for various types of noise (additive white Gaussian noise, mixed Poisson-Gaussian noise, JPEG artifacts, salt-and-pepper noise, and noise resembling stripes). In this work, we discuss in detail which trade-offs have to be considered during the training procedure. We show how to achieve good results and which pitfalls to avoid. By analysing the activation patterns of the hidden units, we are able to make observations about the working principles of multi-layer perceptrons trained for image denoising.
    Preview · Article · Nov 2012
  • Source
    Harold Christopher Burger · Christian J. Schuler · Stefan Harmeling
    ABSTRACT: Image denoising can be described as the problem of mapping from a noisy image to a noise-free image. The best currently available denoising methods approximate this mapping with cleverly engineered algorithms. In this work we attempt to learn this mapping directly with plain multi-layer perceptrons (MLPs) applied to image patches. We show that by training on large image databases we are able to outperform the current state-of-the-art image denoising methods. In addition, our method achieves results that are superior to one type of theoretical bound and goes a long way toward closing the gap to a second type of theoretical bound. Our approach is easily adapted to less extensively studied types of noise, such as mixed Poisson-Gaussian noise, JPEG artifacts, salt-and-pepper noise, and noise resembling stripes, for which we achieve excellent results as well. We show that combining a block-matching procedure with MLPs can further improve the results on certain images. In a second paper, we detail the training trade-offs and the inner mechanisms of our MLPs.
    Preview · Article · Nov 2012
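The patch-to-patch mapping idea from the abstract can be sketched with a ridge-regression linear map standing in for the paper's multi-layer perceptron. The patch size, noise level, and piecewise-constant toy images below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
P = 5         # patch size (the actual method uses much larger patches)
sigma = 0.3   # additive white Gaussian noise level (assumed)

def patches(img):
    # all overlapping P x P patches, flattened to vectors
    h, w = img.shape
    return np.array([img[i:i + P, j:j + P].ravel()
                     for i in range(h - P + 1) for j in range(w - P + 1)])

def toy_image():
    # piecewise-constant random image: a crude stand-in for natural images
    img = np.zeros((32, 32))
    for _ in range(6):
        i, j = rng.integers(0, 24, 2)
        img[i:i + 8, j:j + 8] = rng.random()
    return img

clean = np.concatenate([patches(toy_image()) for _ in range(20)])
noisy = clean + sigma * rng.normal(size=clean.shape)

# learn a noisy-patch -> clean-patch mapping; ridge regression stands in
# for the multi-layer perceptron of the paper
A = np.hstack([noisy, np.ones((len(noisy), 1))])          # bias column
W = np.linalg.solve(A.T @ A + 1e-2 * np.eye(A.shape[1]), A.T @ clean)

test_clean = patches(toy_image())
test_noisy = test_clean + sigma * rng.normal(size=test_clean.shape)
den = np.hstack([test_noisy, np.ones((len(test_noisy), 1))]) @ W
mse_noisy = float(np.mean((test_noisy - test_clean) ** 2))
mse_den = float(np.mean((den - test_clean) ** 2))
print(f"patch MSE noisy: {mse_noisy:.4f}, denoised: {mse_den:.4f}")
```

Replacing the linear map with an MLP (and training on millions of natural-image patches, as in the paper) is what makes the learned mapping competitive with engineered algorithms.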
  • Source
    H.C. Burger · C.J. Schuler · S. Harmeling
    ABSTRACT: Image denoising can be described as the problem of mapping from a noisy image to a noise-free image. The best currently available denoising methods approximate this mapping with cleverly engineered algorithms. In this work we attempt to learn this mapping directly with a plain multi-layer perceptron (MLP) applied to image patches. While this has been done before, we show that by training on large image databases we are able to compete with the current state-of-the-art image denoising methods. Furthermore, our approach is easily adapted to less extensively studied types of noise (by merely exchanging the training data), for which we achieve excellent results as well.
    Preview · Conference Paper · Jun 2012
  • Harold Christopher Burger · Stefan Harmeling
    ABSTRACT: Many state-of-the-art denoising algorithms focus on recovering high-frequency details in noisy images. However, images corrupted by large amounts of noise are also degraded in the lower frequencies, so properly handling all frequency bands allows better denoising in such regimes. To improve existing denoising algorithms, we propose a meta-procedure that applies them across different scales and combines the resulting images into a single denoised image. In a comprehensive evaluation, we show that the performance of many state-of-the-art denoising algorithms can be improved in this way.
    No preview · Conference Paper · Aug 2011
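The meta-procedure can be sketched as follows: run a base denoiser at full and half resolution and blend the results. The box-filter "denoiser", the two-scale pyramid, and the fixed 50/50 blend below are simplifying assumptions, not the paper's actual combination rule.

```python
import numpy as np

rng = np.random.default_rng(2)

def base_denoiser(img):
    # stand-in base algorithm: 3x3 box filter; any existing denoiser fits here
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def downsample(img):
    # average 2x2 blocks (assumes even dimensions)
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # nearest-neighbour upsampling back to the original grid
    return img.repeat(2, axis=0).repeat(2, axis=1)

def multiscale_denoise(noisy):
    fine = base_denoiser(noisy)                           # full resolution
    coarse = upsample(base_denoiser(downsample(noisy)))   # half resolution
    return 0.5 * fine + 0.5 * coarse                      # fixed blend (a simplification)

xx, yy = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
clean = 0.5 + 0.5 * np.sin(4 * xx + 3 * yy)               # smooth test image
noisy = clean + 0.5 * rng.normal(size=clean.shape)        # heavy noise
den = multiscale_denoise(noisy)
mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((den - clean) ** 2))
print(f"image MSE noisy: {mse_noisy:.4f}, denoised: {mse_denoised:.4f}")
```

The coarse scale is what recovers the low-frequency content that single-scale denoisers neglect in heavy-noise regimes.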
  • Harold Christopher Burger · Bernhard Schölkopf · Stefan Harmeling
    ABSTRACT: For digital photographs of astronomical objects, where exposure times are usually long and ISO settings high, the so-called dark current is a significant source of noise. Dark current refers to thermally generated electrons and is therefore present even in the absence of light. This paper presents a novel approach for denoising astronomical images that have been corrupted by dark-current noise. Our method relies on a probabilistic description of the dark current of each pixel of a given camera. This noise model is then combined with an image prior adapted to astronomical images. In a laboratory environment, we use a cooled black-and-white CCD camera and show that our method is superior to existing methods in terms of root mean squared error. Furthermore, we show that our method is practically relevant by providing visually more appealing results on astronomical photographs taken with a single-lens reflex CMOS camera.
    No preview · Article · Apr 2011
  • H.C. Burger · S. Harmeling

    No preview · Article · Jan 2011