Gender Inequality and Artificial Intelligence

Yash Pathak
3 min read · Nov 30, 2021


Gender inequality among humans was perhaps inevitable, given our ability to form opinions and become attached to them. When complex machine learning models are developed, the data collected reflects the population at that point in time. The gender balance of that population varies with many parameters, including culture, the dominant contributors to Gross Domestic Product, profession, sex ratio, and so on.

Turning to the real world, the tech industry is male dominated. According to the World Economic Forum, only 22% of developers working in the field of Artificial Intelligence globally are women. Tech organizations not only employ fewer women than men but also lose them at a faster rate. With this kind of bias in developers' workplaces, its presence in the data is quite discernible. If sexism is rooted in the data, the machine learning models trained on it will pick up that pattern and reproduce the same sexist behaviour in their output.

Frequently, a person's professional occupation becomes the primary parameter for inferring gender, in a manner discriminatory toward women. A classic example, given by Dr. Munnera Banno, is as simple as Google Translate: take sentences that associate a profession with a gender, translate them into a gender-neutral language, and then translate them back to the original language, and you will get different results. The illustration below demonstrates this point.
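The round trip described above can be sketched as a toy simulation. This is not a real translation model: the choice of Hungarian's gender-neutral pronoun "ő" and the profession-to-pronoun preference table are illustrative assumptions made up for this sketch.

```python
# Toy simulation of the translation round trip, assuming a
# gender-neutral language such as Hungarian, where "ő" covers
# both "he" and "she".

def to_neutral(sentence: str) -> str:
    # Forward translation: both gendered pronouns collapse into "ő".
    return sentence.replace("He", "ő").replace("She", "ő")

# Hypothetical learned association: most likely pronoun per profession,
# standing in for the statistical preferences a real model learns.
BIAS = {"president": "He", "cooking": "She"}

def from_neutral(sentence: str) -> str:
    # Back-translation must pick a gendered pronoun again; a biased
    # model resolves "ő" from the profession mentioned in the sentence.
    for profession, pronoun in BIAS.items():
        if profession in sentence:
            return sentence.replace("ő", pronoun)
    return sentence

print(from_neutral(to_neutral("She is a president")))  # He is a president
print(from_neutral(to_neutral("He is cooking")))       # She is cooking
```

The information lost in the forward pass (the original gendered pronoun) is filled back in from statistics rather than from the source sentence, which is exactly why the round trip flips the pronouns.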

The translator will choose neither a man for both nor a woman for both. Based on the statistical probabilities in the training data, it is most probably "he" who is going to be the president and "she" who is going to be cooking. Beyond reflecting a real-world problem, these statistics also reinforce bias in machine learning algorithms. Instead of helping humans make decisions, the algorithms have simply become a mirror of society. Bias in machines can prove far more harmful than the bias previously seen in societies.
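The "statistical probability" at work here can be made concrete with a minimal sketch: a tiny, deliberately skewed toy corpus (entirely invented for illustration) and a model that simply picks the pronoun it has most often seen alongside each profession.

```python
from collections import Counter

# A made-up "training corpus" of (pronoun, profession) co-occurrences,
# skewed the way real-world text often is.
corpus = [
    ("he", "president"), ("he", "president"), ("he", "president"),
    ("she", "president"),
    ("she", "cook"), ("she", "cook"), ("she", "cook"),
    ("he", "cook"),
]

def most_likely_pronoun(profession: str) -> str:
    """Return the pronoun that co-occurs most often with the profession."""
    counts = Counter(p for p, prof in corpus if prof == profession)
    return counts.most_common(1)[0][0]

print(most_likely_pronoun("president"))  # he
print(most_likely_pronoun("cook"))       # she
```

Nothing in the code mentions gender roles; the skew comes entirely from the frequencies in the data, which is the article's point about algorithms mirroring society.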

In an abstract sense, it is like poisoning non-living things with filthy ideologies. Beyond the AI itself, the auxiliary technology in devices known primarily for their AI-based functions also implements features that encourage discrimination against women. For example, Amazon Alexa, Apple Siri, and Google Assistant all have a female voice by default. This shapes the perception of the female voice as belonging to someone submissive, someone who must follow orders.

Artificial Intelligence is not a threat in itself; it is made into one by the unconscious decisions of the developers who design the algorithms and supply the data. Data and development have surged to such an extent that auditing the data is no longer feasible. Today, monitoring the data fed to these machines matters a lot more than developing the algorithms themselves. Just as fully auditing the quality of all manufactured food and drink became impossible once the food industry grew faster than its auditors could keep pace with, auditing data for authenticity and freedom from bias is a challenge for mankind.

The young generation today operates on technology from morning light to darkest night. The change in mentality will be an imperceptible yet horrific process: once seeded, the idea of gender inequality will keep growing like a virus. Think of the movie 'Inception' to understand this process. A simple idea, yet drastically effective.

The challenge is to keep improving the quality of data and to take small steps every day to ensure equality among all humans. This will not be an event but a process, and it needs to begin. Small changes today will be part of a bigger change tomorrow. Mankind needs to understand the importance of gender equality; otherwise, the patriarchy embedded in machines will shape the mindset of future generations into accepting the portrayal of females as submissive.

As a new era begins, the challenge of fixing these long-growing problems has also grown exponentially. Such imperceptible changes will never be noticed unless one chooses to look. But the light of hope remains visible. Governments, too, should take the necessary steps to ensure that such issues are fixed at the microscopic level.


An Intermediate Django developer exploring Python frameworks, with side interests in Data Structures and Algorithms.