Fig 6 - uploaded by Krzysztof M. Gorski
— Comparison of HEALPix with other tessellations, including the Quadrilateralized Spherical Cube (or QuadCube, used by NASA as the data structure for COBE mission products), an icosahedral tessellation of the sphere, and the Equidistant Coordinate Partition (ECP), or "geographic grid." The shaded areas illustrate the subsets of all pixels on the sky for which the associated Legendre functions have to be computed in order to perform the spherical harmonic transforms. This plot demonstrates why the iso-latitude ECP and HEALPix point sets support faster computation of spherical harmonic transforms than the QuadCube, the icosahedral grid, and any non-iso-latitude construction.


Source publication
Article
Full-text available
HEALPix -- the Hierarchical Equal Area iso-Latitude Pixelization -- is a versatile data structure with an associated library of computational algorithms and visualization software that supports fast scientific applications executable directly on very large volumes of astronomical data and large area surveys in the form of discretized spherical maps...

Context in source publication

Context 1
... other grids, which are not iso-latitude constrained, the extra computing time wasted on non-optimal generation of the associated Legendre functions typically results in computational cost of order ∼ N²_pix. This geometrical aspect of discrete spherical transform computation is illustrated in Fig. 6, which compares HEALPix with other tessellations including the Quadrilateralized Spherical Cube (or QuadCube, used by NASA as the data structure for COBE mission products), an icosahedral tessellation of the sphere, and the Equidistant Coordinate Partition, or the "geographic grid." ...
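The scaling claim can be checked with back-of-envelope arithmetic: on an iso-latitude grid the associated Legendre functions need evaluating only once per ring of pixels (roughly √N_pix rings), giving a spherical harmonic transform cost of order N_pix^(3/2), versus order N_pix² when every pixel needs its own evaluation. A minimal sketch, with constants omitted (the exponents are the standard scaling argument, not measured timings):

```python
# Rough operation counts illustrating why iso-latitude grids (HEALPix, ECP)
# support faster spherical harmonic transforms than general point sets.
# Constants are omitted; only the scaling exponents matter here.
for nside in (64, 512):
    npix = 12 * nside ** 2       # HEALPix pixel count
    iso = npix ** 1.5            # Legendre values shared along each iso-latitude ring
    general = npix ** 2          # every pixel needs its own Legendre evaluation
    speedup = general / iso      # grows like sqrt(N_pix)
    print(f"N_pix = {npix}: iso-latitude advantage ~ {speedup:.0f}x")
```

The advantage grows like √N_pix, which is why the gap widens rapidly at the resolutions used for modern CMB maps.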

Citations

    ... Though this Millipede method was designed as an energy reconstruction, it can be applied repeatedly across a grid of different track hypotheses, from which the track with the largest likelihood can then be selected as the best fit. Scanning is often performed on pre-defined tilings of the sky known as HEALPix grids [165]. Millipede can also be applied iteratively, evaluated on a finer pixelisation grid centred on the best fit of a previous coarser grid, yielding successively finer precision on the track direction. ...
Thesis
    The IceCube Neutrino Observatory, the world's largest neutrino observatory, first discovered a flux of high-energy neutrinos in 2013. These neutrinos must be produced by astrophysical accelerators, but their exact origin is still unknown. Proposed neutrino sources are tidal disruption events (TDEs), events in which stars are torn apart when they come too close to supermassive black holes. In this doctoral thesis, a search for correlations between neutrinos and TDEs was performed for the first time, using a compilation of published TDEs and an IceCube data set of one million muon neutrinos with GeV-PeV energies. No significant correlation was found, so the contribution of TDEs without relativistic jets can be constrained to 0-38.0% of the total astrophysical neutrino flux. The contribution of TDEs with relativistic jets was constrained to 0-3.0% of the total flux. IceCube also publishes high-energy (>100 TeV) muon-neutrino events in the form of automated, public, real-time 'neutrino alerts'. As part of this thesis, the localizations of 22 such neutrino alerts were observed with the optical Zwicky Transient Facility (ZTF) telescope to search for possible electromagnetic counterparts to the neutrinos. With this neutrino follow-up programme, the bright TDE AT2019dsg was identified as a probable neutrino source. The probability of finding such a bright TDE by chance is 0.2%. The association implies that TDEs account for 3-100% of IceCube's astrophysical neutrino alerts. Taken together, these two results indicate that TDEs emit a subdominant fraction of the astrophysical neutrino flux at high energies.
The association of the neutrino alert IC191001A with AT2019dsg is only the second time that a high-energy neutrino could be linked to a probable astrophysical source.
    ... The GWB power can be parameterized using HEALPix sky pixelization (Gorski et al. 2005), where each equal-area pixel is independent of the pixels surrounding it: ...
    ... The number of pixels is set by N_pix = 12 N_side², and N_side defines the tessellation of the HEALPix sky (Gorski et al. 2005). The rule of thumb for PTAs is to have N_pix ≳ N_cc (Cornish & van Haasteren 2014; Romano & Cornish 2017). ...
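The pixel-count relation quoted above is easy to sketch in code. The helper below mirrors the formula N_pix = 12 N_side² (its name matches a convenience function in the healpy library, but this standalone version assumes nothing beyond the formula itself):

```python
def nside2npix(nside: int) -> int:
    """HEALPix pixel count: N_pix = 12 * N_side**2 (N_side a power of two)."""
    if nside < 1 or nside & (nside - 1):
        raise ValueError("N_side must be a positive power of two")
    return 12 * nside ** 2

# common map resolutions, from the coarsest grid upward
for nside in (1, 2, 512, 2048):
    print(nside, nside2npix(nside))
```

For example, N_side = 512 gives 3,145,728 equal-area pixels.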
    Full-text available
    Article
    Statistical anisotropy in the nanohertz-frequency gravitational wave background (GWB) is expected to be detected by pulsar timing arrays (PTAs) in the near future. By developing a frequentist statistical framework that intrinsically restricts the GWB power to be positive, we establish scaling relations for multipole-dependent anisotropy decision thresholds that are a function of the noise properties, timing baselines, and cadences of the pulsars in a PTA. We verify that (i) a larger number of pulsars, and (ii) factors that lead to lower uncertainty on spatial cross-correlation measurements between pulsars, lead to a higher overall GWB signal-to-noise ratio, and lower anisotropy decision thresholds with which to reject the null hypothesis of isotropy. Using conservative simulations of realistic NANOGrav data sets, we predict that an anisotropic GWB with angular power C_{l=1} > 0.3 C_{l=0} may be sufficient to produce tension with isotropy at the p = 3 × 10⁻³ (∼3σ) level in near-future NANOGrav data with a 20 yr baseline. We present ready-to-use scaling relationships that can map these thresholds to any number of pulsars, configuration of pulsar noise properties, or sky coverage. We discuss how PTAs can improve the detection prospects for anisotropy, as well as how our methods can be adapted for more versatile searches.
    ... The sky signal was simulated using the package HEALPix (Hierarchical Equal Area isoLatitude Pixelization of a sphere; Gorski et al. 2005). The left panel of Figure 2 shows the MWA baseline distribution used for the simulations. ...
    Preprint
    Intensity mapping with the redshifted 21-cm line is an emerging tool in cosmology. Drift scan observations, where the antennas are fixed to the ground and the telescope's pointing center (PC) changes continuously on the sky due to Earth's rotation, provide the broad sky coverage and sustained instrumental stability needed for 21-cm intensity mapping. Here we present the Tracking Tapered Gridded Estimator (TTGE) to quantify the power spectrum of the sky signal estimated directly from the visibilities measured in drift scan radio interferometric observations. The TTGE uses the data from the different PC to estimate the power spectrum of the signal from a small angular region located around a fixed tracking center (TC). The size of this angular region is decided by a suitably chosen tapering window function which serves to reduce the foreground contamination from bright sources located at large angles from the TC. It is possible to cover the angular footprint of the drift scan observations using multiple TC, and combine the estimated power spectra to increase the signal to noise ratio. Here we have validated the TTGE using simulations of $154 \, {\rm MHz}$ MWA drift scan observations. We show that the TTGE can recover the input model angular power spectrum $C_{\ell}$ within $20 \%$ accuracy over the $\ell$ range $40 < \ell < 700$.
    ... We started with both Planck 2018 maps. We rotated these maps in harmonic space, using the HEALPix package [44], so that the CS direction points towards the north pole, and computed the average ∆T for 12 rings of pixels with width 2° centered on the CS. For our Gaussian void profile we computed the χ² varying δ₀, r_V and D_V as in equation (4.2). ...
    Full-text available
    Preprint
    The Cosmic Microwave Background (CMB) anisotropies are thought to be statistically isotropic and Gaussian. However, several anomalies are observed, including the CMB Cold Spot, an unexpected cold $\sim 10^{\circ}$ region with $p$-value $\lesssim 0.01$ in standard $\Lambda$CDM. One of the proposed origins of the Cold Spot is an unusually large void on the line of sight, that would generate a cold region through the combination of integrated Sachs-Wolfe and Rees-Sciama effects. In the past decade extensive searches were conducted in large scale structure surveys, both in optical and infrared, in the same area for $z \lesssim 1$ and did find evidence of large voids, but of depth and size able to account for only a fraction of the anomaly. Here we analyze the lensing signal in the Planck CMB data and rule out the hypothesis that the Cold Spot could be due to a large void located anywhere between us and the surface of last scattering. In particular, computing the evidence ratio we find that a model with a large void is disfavored compared to $\Lambda$CDM, with odds 1 : 13 (1 : 20) for SMICA (NILC) maps, compared to the original odds 56 : 1 (21 : 1) using temperature data alone.
    ... The DES collaboration develops spatial templates for different observing conditions and potential contaminants in the survey footprint by creating HEALPix (Gorski et al. 2005) sky maps (at NSIDE = ...).
    [Figure 5 caption: Visualization of color images of random galaxies from each of the three redshift bins defined in §3.3. As apparent from Fig. 3, the first bin is made predominantly of red galaxies, and the selection then moves to bluer and fainter galaxies for the second and third bins.] ...
    Preprint
    The fiducial cosmological analyses of imaging galaxy surveys like the Dark Energy Survey (DES) typically probe the Universe at redshifts $z < 1$. This is mainly because of the limited depth of these surveys, and also because such analyses rely heavily on galaxy lensing, which is more efficient at low redshifts. In this work we present the selection and characterization of high-redshift galaxy samples using DES Year 3 data, and the analysis of their galaxy clustering measurements. In particular, we use galaxies that are fainter than those used in the previous DES Year 3 analyses and a Bayesian redshift scheme to define three tomographic bins with mean redshifts around $z \sim 0.9$, $1.2$ and $1.5$, which significantly extend the redshift coverage of the fiducial DES Year 3 analysis. These samples contain a total of about 9 million galaxies, and their galaxy density is more than 2 times higher than those in the DES Year 3 fiducial case. We characterize the redshift uncertainties of the samples, including the usage of various spectroscopic and high-quality redshift samples, and we develop a machine-learning method to correct for correlations between galaxy density and survey observing conditions. The analysis of galaxy clustering measurements, with a total signal-to-noise $S/N \sim 70$ after scale cuts, yields robust cosmological constraints on a combination of the fraction of matter in the Universe $\Omega_m$ and the Hubble parameter $h$, $\Omega_m = 0.195^{+0.023}_{-0.018}$, and 2-3% measurements of the amplitude of the galaxy clustering signals, probing galaxy bias and the amplitude of matter fluctuations, $b \sigma_8$. A companion paper $\textit{(in preparation)}$ will present the cross-correlations of these high-$z$ samples with CMB lensing from Planck and SPT, and the cosmological analysis of those measurements in combination with the galaxy clustering presented in this work.
    ... We also thank Carmelo Evoli for advice on the HERMES code. We acknowledge use of the HEALPix package (Gorski et al. 2005). G.S. acknowledges membership in the International Max Planck Research School for Astronomy and Cosmic Physics at the University of Heidelberg (IMPRS-HD). ...
    Full-text available
    Preprint
    In the standard picture of galactic cosmic rays, a diffuse flux of high-energy gamma-rays and neutrinos is produced from inelastic collisions of cosmic ray nuclei with the interstellar gas. The neutrino flux is a guaranteed signal for high-energy neutrino observatories such as IceCube, but has not been found yet. Experimental searches for this flux constitute an important test of the standard picture of galactic cosmic rays. Both the observation and non-observation would allow important implications for the physics of cosmic ray acceleration and transport. We present DINECRAFT, a new model of galactic diffuse high-energy gamma-rays and neutrinos, fitted to recent cosmic ray data from AMS-02, DAMPE, IceTop as well as KASCADE. We quantify the uncertainties for the predicted emission from the cosmic ray model, but also from the choice of source distribution, gas maps and cross-sections. We consider the possibility of a contribution from unresolved sources. Our model predictions exhibit significant deviations from older models. Our fiducial model is available at https://doi.org/10.5281/zenodo.7373010 .
    ... massive star are modelled with a black-body assumption given the effective temperature of the star, which is interpolated from the stellar evolution tracks. We inject the ionising photons into the cell where the sink particle sits and propagate the radiation along 48 rays normal to an equal-area iso-latitude pixelation of a sphere calculated with the HEALPix algorithm (Gorski et al. 2005). An important advantage of TreeRay is its use of the oct-tree structure and the backward radiative transfer approach. ...
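As an aside on the numbers above, 48 rays correspond exactly to the coarsest non-trivial HEALPix resolution, N_side = 2, since N_pix = 12 N_side²; each ray then carries an equal share of the sphere. A small sketch of that arithmetic (not the radiative transfer code cited above):

```python
import math

nside = 2
npix = 12 * nside ** 2             # HEALPix pixel count at N_side = 2
print(npix)                        # 48 rays, matching the excerpt above

omega = 4 * math.pi / npix         # equal solid angle carried by each ray (sr)
print(f"{omega:.4f} sr per ray")
```

Equal solid angles per ray are what make a simple sum over rays an unbiased angular average.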
    Preprint
    We present magnetohydrodynamic (MHD) simulations of the star-forming multiphase interstellar medium (ISM) in stratified galactic patches with gas surface densities $\Sigma_\mathrm{gas} =$ 10, 30, 50, and 100 $\mathrm{M_\odot\,pc^{-2}}$. The SILCC project simulation framework accounts for non-equilibrium thermal and chemical processes in the warm and cold ISM. The sink-based star formation and feedback model includes stellar winds, hydrogen-ionising UV radiation, core-collapse supernovae, and cosmic ray (CR) injection and diffusion. The simulations follow the observed relation between $\Sigma_\mathrm{gas}$ and the star formation rate surface density $\Sigma_\mathrm{SFR}$. CRs qualitatively change the outflow phase structure. Without CRs, the outflows transition from a two-phase (warm and hot at 1 kpc) to a single-phase (hot at 2 kpc) structure. With CRs, the outflow always has three phases (cold, warm, and hot), dominated in mass by the warm phase. The impact of CRs on mass loading decreases for higher $\Sigma_\mathrm{gas}$ and the mass loading factors of the CR-supported outflows are of order unity independent of $\Sigma_\mathrm{SFR}$. Similar to observations, vertical velocity dispersions of the warm ionised medium (WIM) and the cold neutral medium (CNM) correlate with the star formation rate as $\sigma_\mathrm{z} \propto \Sigma_\mathrm{SFR}^a$, with $a \sim 0.20$. In the absence of stellar feedback, we find no correlation. The velocity dispersion of the WIM is a factor $\sim 2.2$ higher than that of the CNM, in agreement with local observations. For $\Sigma_\mathrm{SFR} \gtrsim 1.5 \times 10^{-2}\,\mathrm{M}_\odot\,\mathrm{yr}^{-1}\,\mathrm{kpc}^{-2}$ the WIM motions become supersonic.
    ... We obtained those color-correction coefficients from dedicated software routines provided by the Planck collaboration. We transformed all HFI-Planck intensity maps as well as other skymaps used in this work into the HEALPix standard (Górski et al. 2005) with a resolution parameter N_side = 512, corresponding to a mean pixel spacing of 0°.11. ...
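The quoted mean pixel spacing follows from the equal-area property: each of the N_pix = 12 N_side² pixels covers 4π/N_pix steradians, and the square root of that solid angle gives the mean spacing. A quick check of the 0°.11 figure:

```python
import math

nside = 512
npix = 12 * nside ** 2                         # 3,145,728 equal-area pixels
omega = 4 * math.pi / npix                     # solid angle per pixel (sr)
spacing_deg = math.degrees(math.sqrt(omega))   # mean pixel spacing
print(f"N_side = {nside}: {spacing_deg:.2f} deg")  # ≈ 0.11 deg, as quoted
```

Doubling N_side quarters the pixel area and halves this spacing, which is why N_side = 2048 maps resolve roughly 0°.03 structure.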
    Full-text available
    Article
    Where dust and gas are uniformly mixed, atomic hydrogen can be traced through the detection of far-infrared (FIR) or UV emission of dust. We considered, for the origin of discrepancies observed between various direct and indirect tracers of gas outside the Galactic plane, possible corrections to the zero levels of the Planck High Frequency Instrument (HFI) detectors. We set the zero levels of the Planck-HFI skymaps as well as the 100 μm map from COBE/DIRBE and IRAS from the correlation between FIR emission and atomic hydrogen column density, excluding regions of lowest gas column density. A modified blackbody model fit to those new zero-subtracted maps led to significantly different maps of the opacity spectral index β and temperature T, and an overall increase in the optical depth at 353 GHz, τ₃₅₃, of 7.1 × 10⁻⁷ compared to the data release 2 Planck map. When comparing τ₃₅₃ and the H I column density, we observed a uniform spatial distribution of the opacity outside regions with dark neutral gas and CO, except in various large-scale regions of low N_HI that represent 25% of the sky. In those regions, we observed an average dust column density 45% higher than predictions based on N_HI, with a maximum of 250% toward the Lockman Hole region. From the average opacity σ_e353 = (8.9 ± 0.1) × 10⁻²⁷ cm², we deduced a dust-to-gas mass ratio of 0.53 × 10⁻². We did not see evidence of dust associated with a Reynolds layer of ionized hydrogen. We measured a far-ultraviolet isotropic intensity of 137 ± 15 photons s⁻¹ cm⁻² sr⁻¹ Å⁻¹, in agreement with extragalactic flux predictions, and a near-ultraviolet isotropic intensity of 378 ± 45 photons s⁻¹ cm⁻² sr⁻¹ Å⁻¹, corresponding to twice the predicted flux.
    ... While there is only one realization for each of the foreground models, we add these to multiple realizations of CMB and noise. The simulations are done at a HEALPix (Górski et al. 2005) resolution N_side = 512, except when we are doing lensing reconstruction and map-based delensing, for which additional maps are rendered at N_side = 2048; see Section 7. The foreground component uses the same space information at N_side = 2048 as at N_side = 512. ...
    Full-text available
    Preprint
    PICO is a concept for a NASA probe-scale mission aiming to detect or constrain the tensor to scalar ratio $r$, a parameter that quantifies the amplitude of inflationary gravity waves. We carry out map-based component separation on simulations with five foreground models and input $r$ values $r_{in}=0$ and $r_{in} = 0.003$. We forecast $r$ determinations using a Gaussian likelihood assuming either no delensing or a residual lensing factor $A_{\rm lens}$ = 27%. By implementing the first full-sky, post component-separation, map-domain delensing, we show that PICO should be able to achieve $A_{\rm lens}$ = 22% - 24%. For four of the five foreground models we find that PICO would be able to set the constraints $r < 1.3 \times 10^{-4} \,\, \mbox{to} \,\, r <2.7 \times 10^{-4}\, (95\%)$ if $r_{in}=0$, the strongest constraints of any foreseeable instrument. For these models, $r=0.003$ is recovered with confidence levels between $18\sigma$ and $27\sigma$. We find weaker, and in some cases significantly biased, upper limits when removing few low or high frequency bands. The fifth model gives a $3\sigma$ detection when $r_{in}=0$ and a $3\sigma$ bias with $r_{in} = 0.003$. However, by correlating $r$ determinations from many small 2.5% sky areas with the mission's 555 GHz data we identify and mitigate the bias. This analysis underscores the importance of large sky coverage. We show that when only low multipoles $\ell \leq 12$ are used, the non-Gaussian shape of the true likelihood gives uncertainties that are on average 30% larger than a Gaussian approximation.
    ... We transformed all HFI-Planck intensity maps as well as other skymaps used in this work into the HEALPix standard (Górski et al. 2005) with a resolution parameter N_side = 512, corresponding to a mean pixel spacing of 0°.11. ...
    Preprint
    Where dust and gas are uniformly mixed, atomic hydrogen can be traced through the detection of far-infrared (FIR) or UV emission of dust. We considered, for the origin of discrepancies observed between various direct and indirect tracers of gas outside the Galactic plane, possible corrections to the zero levels of the Planck-HFI detectors. We set the zero levels of the Planck High Frequency Instrument (HFI) skymaps as well as the 100 $\mu$m map from COBE/DIRBE and IRAS from the correlation between FIR emission and atomic hydrogen column density, excluding regions of lowest gas column density. A modified blackbody model fit to those new zero-subtracted maps led to significantly different maps of the opacity spectral index $\beta$ and temperature $T$, and an overall increase in the optical depth at 353 GHz, $\tau_{353}$, of 7.1$\times$10$^{-7}$ compared to the data release 2 Planck map. When comparing $\tau_{353}$ and the HI column density, we observed a uniform spatial distribution of the opacity outside regions with dark neutral gas and CO, except in various large-scale regions of low $N_{\rm HI}$ that represent 25% of the sky. In those regions, we observed an average dust column density 45% higher than predictions based on $N_{\rm HI}$, with a maximum of 250% toward the Lockman Hole region. From the average opacity $\sigma_{e 353}$=(8.9$\pm$0.1)$\times$10$^{-27}$ cm$^2$, we deduced a dust-to-gas mass ratio of 0.53$\times$10$^{-2}$. We did not see evidence of dust associated with a Reynolds layer of ionized hydrogen. We measured a far-ultraviolet isotropic intensity of 137$\pm$15 photons s$^{-1}$ cm$^{-2}$ sr$^{-1}$ Å$^{-1}$, in agreement with extragalactic flux predictions, and a near-ultraviolet isotropic intensity of 378$\pm$45 photons s$^{-1}$ cm$^{-2}$ sr$^{-1}$ Å$^{-1}$, corresponding to twice the predicted flux.