... ChatGPT provides superficial knowledge and has been reported to produce well-constructed responses that are nonetheless entirely false and misleading, a phenomenon known as "hallucination" [5], [13], [20]-[22], [29], [42], [63], [67], [68], [76], [77], [80], [89], [93], [97], [99], [102], [104], [110], [135], [138], [140]-[143], [151]-[153]. One particularly alarming concern is that the tool tends to support such falsified responses with citations that are themselves fake and non-existent [19], [20], [22], [24], [28], [29], [34], [39], [41], [43], [45], [52], [53], [59], [63], [64], [69], [72], [77], [81], [84], [93], [102], [103], [110], [117], [135], [137], [138], [141]-[144], [149]-[152], [154]-[157]. For example, in one study, of the 23 references provided by ChatGPT, only 14 were accurate; 6 appeared to have been completely fabricated, and 4 existed but were attributed to the wrong authors [158]. ...