In this paper, a family of support vector novelty detection (SVND) algorithms in hidden space is presented. First, a hidden-space SVND (HSVND) algorithm is proposed: the data in the input space are mapped into a hidden space by a hidden function, and SVND with a linear kernel is then performed in that hidden space. The advantage of HSVND over SVND is that the hidden function can be an arbitrary real-valued function, including Mercer kernels and their combinations, whereas the SVND algorithm admits only Mercer kernels. Unfortunately, the mapped data may be non-spherically distributed, in which case HSVND performs poorly. Second, a whitening HSVND (WHSVND) algorithm is therefore proposed: a whitening transform is applied to the mapped data, making them approximately spherical, and SVND with a linear kernel is then performed. We prove that WHSVND is identical to whitening SVND (WSVND) when a single, common Mercer kernel is taken as the mapping function. In fact, the above algorithms still cannot work well when the data lie in a subspace of the input sample space or are rank deficient; in that case, the mapping function merely maps the data from one subspace to another and cannot make them span the whole sample space. Thus, a noised and whitening HSVND (NWHSVND) algorithm is presented for kernel mapping functions: before the data are mapped, small random noise is added to them; the noised data are mapped into the hidden space; the mapped data are whitened; and SVND with a linear kernel is performed. Experiments are conducted on artificial and real-world data.