The decisions made by Artificial Intelligence (AI) systems are increasingly consequential due to their growing use in sensitive areas such as recruitment, criminal justice, and healthcare. Detecting and measuring AI bias is therefore essential to mitigate its effects by making these systems more transparent, explainable, and auditable. In this study, we focus on gender bias and investigate the effect of gender imbalance in medical imaging datasets when applying AI models to detect COVID-19. We analyze gender bias in the deep learning-based diagnosis of medical imaging data, first examining the distribution of samples with respect to metadata and target labels. In the training phase, we conduct experiments showing that gender imbalance produces a biased model: we train models on both a fully gender-balanced dataset and a severely imbalanced dataset restricted to a single gender. To demonstrate that these findings generalize, we apply several deep learning-based solutions, including pre-trained models, and compare their performance with respect to gender bias. We observe a significant difference in classification performance between models trained on the gender-imbalanced dataset and those trained on the gender-balanced dataset, and we confirm the same tendency across different deep learning methods. Consequently, our experimental results show that gender imbalance in medical imaging data produces biased decisions in deep learning-aided COVID-19 detection.