Recent decades have witnessed an unprecedented development in Information and Communication Technologies (ICTs) and Artificial Intelligence (AI), which has modified how interpersonal relationships are established and could also modify how human specificity is understood, even challenging the concept of human specialness. These technological advances have led some authors to anticipate that there will soon be machines built in our own image, to which we will relate as equals. We argue, however, that subjectivity is a key element of all expressions of intelligence and emotion and is pivotal to consciousness itself, despite being overlooked in many reductionist arguments. We propose bringing the concept of authenticity into the discussion to distinguish different values in the possible expressions of intelligence, consciousness or emotion. Authenticity in this context is deeply linked to subjective experience: for instance, an emotion is authentic only if it is felt in addition to being expressed. Such subjective experiences, by definition, cannot be measured objectively. To overcome this limitation, we propose using emergence as an objective surrogate for authenticity: authentic behaviour emerges from lower-level domains rather than being imposed by programming or conditioning. In this paper we provide a framework for these concepts, with a double aim: to stress the importance of subjectivity and the difference between simulated processes and real ones, and to propose a filter based on emergence as a first guide for assessing the authenticity of these processes in AI.