Nursing is a profession in which women make up the large majority of the workforce: according to the WHO's 2020 State of the World's Nursing report, men account for about 10% of nurses globally. The most significant exception is Francophone Africa, where men form the majority of nurses in several countries, and there have been other exceptions historically. Since the 1960s, nursing has become increasingly gender-inclusive. Men remain underrepresented for a variety of reasons, including stereotypes about nursing, a lack of male interest in the profession, low pay, gendered job titles such as Sister and Matron, and the perception that male nurses will have difficulty performing their duties in the workplace. The field of nursing benefits from diversity: men can offer a different perspective and help some patients feel more at ease. Nursing care is needed by people from all walks of life and backgrounds, and the professionals who provide it should ideally reflect the people they serve.