November 2024
Astronomy and Astrophysics
Deuterium fractionation is a well-established evolutionary tracer in low-mass star formation, but its applicability to the high-mass regime remains an open question. In this context, the abundances and ratios of different deuterated species have often been proposed as reliable evolutionary indicators for the different stages of the high-mass star formation process. In this study, we investigate the role of N₂H⁺ and of the key deuterated molecules o-H₂D⁺ and N₂D⁺ as tracers of the different stages of high-mass star formation, and we assess whether their abundance ratios can serve as reliable evolutionary indicators. We conducted APEX observations of o-H₂D⁺ and N₂D⁺ in a sample of 40 high-mass clumps at different evolutionary stages, selected from the ATLASGAL survey. Molecular column densities and abundances relative to H₂, X, were derived through spectral line modelling, both under local thermodynamic equilibrium (LTE) and non-LTE conditions. The column densities show the smallest deviation from the LTE values when derived under non-LTE assumptions, whereas the largest discrepancy between the LTE- and non-LTE-derived column densities is found for one of the species. In all the cases discussed, we found that X(o-H₂D⁺) decreases more significantly across the evolutionary stages than X(N₂D⁺), whereas X(N₂H⁺) increases slightly. Therefore, the X(o-H₂D⁺)/X(N₂D⁺) ratio, recently proposed as a promising tracer of the different evolutionary stages, is not confirmed as a reliable evolutionary indicator for this sample. While the deuteration fraction derived from N₂D⁺ and N₂H⁺ clearly decreases with clump evolution, the interpretation of this trend is complex, given the different spatial distributions of the two tracers. Our results suggest that careful consideration of observational biases and beam-dilution effects is crucial for an accurate interpretation of the evolution of the deuteration process during high-mass star formation.
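The LTE side of the column-density analysis described above can be illustrated with a short, standalone sketch. This is a minimal example, not the paper's actual pipeline: it assumes optically thin emission, the Rayleigh-Jeans limit, a negligible background, and a single excitation temperature, and every spectroscopic and observed value in the example call is a hypothetical placeholder (in practice they would be taken from catalogues such as CDMS and from the observed spectra). The derived column densities are then turned into the abundance ratio and deuteration fraction discussed in the text.

```python
# Minimal LTE sketch (not the paper's pipeline): total column density of an
# optically thin line from its integrated intensity, under the Rayleigh-Jeans
# approximation with negligible background, then the abundance ratios above.
import numpy as np

K_B = 1.380649e-16    # Boltzmann constant, erg/K
H_P = 6.62607015e-27  # Planck constant, erg s
C   = 2.99792458e10   # speed of light, cm/s

def lte_column_density(int_TdV_K_kms, freq_GHz, A_ul, g_u, E_u_K, Q_Tex, Tex_K):
    """Total column density (cm^-2) assuming LTE and optically thin emission.

    int_TdV_K_kms : integrated line intensity, K km/s
    freq_GHz      : rest frequency of the transition
    A_ul          : Einstein A coefficient, s^-1
    g_u, E_u_K    : upper-level degeneracy and energy (K)
    Q_Tex, Tex_K  : partition function and excitation temperature (K)
    """
    nu = freq_GHz * 1e9
    int_cgs = int_TdV_K_kms * 1e5  # K km/s -> K cm/s
    # Upper-level column density (Rayleigh-Jeans, optically thin limit).
    N_u = 8.0 * np.pi * K_B * nu**2 / (H_P * C**3 * A_ul) * int_cgs
    # Scale to the total column density with the partition function.
    return N_u * (Q_Tex / g_u) * np.exp(E_u_K / Tex_K)

# Placeholder inputs for illustration only (not values from the paper).
N_oH2Dp = lte_column_density(0.5, 372.4, 1.1e-4,  9.0, 17.9, 10.0, 12.0)
N_N2Dp  = lte_column_density(1.0, 308.4, 7.0e-4, 81.0, 37.0, 40.0, 12.0)
N_N2Hp  = lte_column_density(5.0, 279.5, 1.3e-3, 63.0, 26.8, 35.0, 12.0)
N_H2    = 1.0e23  # H2 column density from dust continuum, cm^-2 (placeholder)

# Abundances relative to H2 and the ratios discussed in the abstract.
X_oH2Dp, X_N2Dp, X_N2Hp = (N / N_H2 for N in (N_oH2Dp, N_N2Dp, N_N2Hp))
print(f"X(o-H2D+)/X(N2D+)         = {X_oH2Dp / X_N2Dp:.2f}")
print(f"D_frac = N(N2D+)/N(N2H+)  = {N_N2Dp / N_N2Hp:.3f}")
```

A non-LTE treatment would instead solve the statistical equilibrium with collisional rate coefficients (e.g. via a radiative-transfer code), which is why the LTE and non-LTE column densities can differ from species to species, as noted above.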