[Ed Note: The following post is part of the TLF Editorial Board Test 2019-20. It has been authored by Shrudula Murthy, a second year student of NALSAR University of Law.]
“My father goes to office and my mother cooks food for us”
“My sister is not allowed to play with my car; she is supposed to play with her Barbie”
Years of demarcating, categorizing and classifying humans on the basis of gender remain ingrained in our lives even at the peak of development. Mankind has reached remarkable heights in every sphere. It is assumed that in a few years many jobs will be taken over by technology, and inventions once confined to our imagination are taking shape: refrigerators will soon order our groceries, we may commute in driverless flying cars, and time travel could become reality. However, these developments share one glaring problem: they amplify gender biases and revive the stereotypes from which those biases stem.
Women have struggled for years to break free of the shackles of oppression and categorization. They have transgressed boundaries and are now almost on par with men in their fields. However, it is not only society that regards women as the inferior gender; women themselves have been indoctrinated into believing that certain jobs are not meant for them. Prevailing notions and ideologies about their role in society and their capabilities have been ingrained into women. It is therefore crucial for business leaders to implement programmes that help overcome bias in the workplace.
Technology and artificial intelligence are carrying forward centuries of misconceptions that prevailed at different times in society. Thus, even though development is moving at a tremendous pace, it drags a huge social problem along with it. These problems will eventually get baked into the future, and the biases they carry will become normalised. This post seeks to highlight a few such biases prevalent in technology and the possible way ahead.
Artificial intelligence is considered one of the greatest achievements of mankind so far. It comes remarkably close to recreating an actual human: a system with the capability to think, solve logical problems and, nowadays, even emote like a human.
“AI is defined as a branch of computer science concerned with modelling intelligent human behaviour on a computer”.
Data-driven models appear increasingly in everyday life and affect people and their ability to progress. And because AI comes so close to human capability, it carries with it the innumerable prejudices and stereotypes that humans have carried with them for centuries.
An experiment with a recent Google algorithm that completes word analogies showed the following result:
Man : Doctor
Woman : Nurse
This algorithm was built by scouring millions of books and gathering patterns from them, and it automatically equated occupations with gender. There have long been notions about which jobs are apt for men and which for women. Today, however, people are becoming increasingly aware of the growing gender diversity in every field, and women are proving themselves in jobs previously reserved for men. In this emerging scenario, artificial intelligence has an extremely detrimental effect: it inherits these prevailing reservations and ends up propagating them to others. As a result, women tend to become hesitant to enter certain fields.
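To see how such an association arises mechanically, consider the minimal sketch below. It is a toy Python example with hand-invented word vectors (neither the numbers nor the word list come from Google's actual model); it only illustrates the standard analogy arithmetic used in word embeddings, where the answer to "man is to doctor as woman is to ?" is the word whose vector lies closest to doctor − man + woman. If the training text links "doctor" with male contexts and "nurse" with female contexts, the arithmetic dutifully returns "nurse".

```python
# Toy illustration of analogy bias in word embeddings.
# These vectors are invented for illustration only; real systems learn
# them from huge text corpora (e.g. millions of books).
import numpy as np

# Two made-up dimensions: [gender association, medical association]
vectors = {
    "man":      np.array([ 1.0, 0.0]),
    "woman":    np.array([-1.0, 0.0]),
    "doctor":   np.array([ 0.8, 1.0]),   # corpus links "doctor" with male contexts
    "nurse":    np.array([-0.8, 1.0]),   # corpus links "nurse" with female contexts
    "engineer": np.array([ 0.7, 0.0]),
    "teacher":  np.array([-0.2, 0.0]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' by vector arithmetic."""
    target = vectors[b] - vectors[a] + vectors[c]
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "doctor", "woman"))   # -> nurse
print(analogy("woman", "nurse", "man"))    # -> doctor
```

Note that the code contains no explicit rule about gender; the stereotype enters entirely through the geometry of the vectors, which is exactly where the texts the system learns from put it.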
Many of these data-driven algorithms are used by multinational companies to shortlist suitable employees. The machine is fed data collected from previously successful candidates: their qualifications, the skill sets required and so on. It may well be that, in the company's past records, those qualifications and posts were held largely by one group of people, say men. When women's names appear in the list of potential candidates, the system tends to overlook them, because it has learnt to associate male candidates with successful recruitment. In this manner women are further discouraged from maximising their potential and stepping out of the bubble of stereotypes into which they have been indoctrinated. Emerging scenarios in which women are equally successful tend to be overshadowed by the prevailing notion that men are the best fit for the job.
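A rough sketch of how this plays out: the code below trains a simple classifier on synthetic "historical" hiring records (invented for this post, not drawn from any real company's data) in which men were hired far more often than equally skilled women. The model then scores two candidates who differ only in gender, and the gap in its output is the bias of the past, faithfully reproduced.

```python
# Minimal sketch of how a hiring model trained on biased history reproduces that bias.
# The "historical" records below are entirely synthetic and for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

gender = rng.integers(0, 2, n)               # 1 = male, 0 = female
skill = rng.uniform(0, 1, n)                 # genuine qualification score
# Biased past decisions: at the same skill level, men were hired far more often.
hired = (skill + 0.8 * gender + rng.normal(0, 0.1, n)) > 1.0

model = LogisticRegression(max_iter=1000).fit(np.column_stack([gender, skill]), hired)

# Two equally qualified candidates who differ only in gender:
candidates = np.array([[1, 0.7],    # male,   skill 0.7
                       [0, 0.7]])   # female, skill 0.7
print(model.predict_proba(candidates)[:, 1])  # the male candidate scores far higher
```

In practice the gender column is rarely fed in so explicitly; as noted above, names and other details on a CV act as proxies for it, so simply deleting the column does not remove the signal.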
Face recognition, a trending feature on smartphones, is also setting out norms and ideal standards that each gender must fit into. In several instances, leading icons such as Oprah Winfrey and Michelle Obama were categorised as “male” by the system because they did not fit its learnt definition of what a woman must look like. At a time when people are fighting for the inclusion of gender-fluid people, it is a shame that technology cannot keep up.
Siri, Alexa, Google Home and almost every other automated assistant is, by default, a woman. Most people are unaware that a male voice for Siri even exists as an option, and even the few who have tried it did not react positively to it; they preferred the female version. This is another clear example of AI propagating a sexist bias. It is a continuation of the image of the female receptionist: the male receptionist was never the norm, and one almost naturally expects a ‘good-looking woman’ to be sitting behind the reception desk. In the same manner, AI assistants are women because they are assumed to be subservient to everyone's needs; a woman, it is implied, can never say ‘no’, so a female voice in such machines is expected. On the other hand, machines designed to help with accounting, business and other work-related tasks are usually portrayed as men. This projects an image of the nature and ability of men and women, who continue to fulfil their designated roles even as machines. Humans have reached a point where even the machines they invent hold the same notions and misconceptions as they do.
The bitten apple that forms the famous logo of the Apple company has its story in a historical setting. It is supposed to represent the bite of the apple that Eve took at the beginning of civilization. The female cyborg replaces Eve in this scenario, and the primary association of the symbol is with mankind's most vulnerable point: man is assumed to be at his weakest around a woman. It portrays the deceptive appearance of a woman being used to entice men. This is yet another way in which the technology sector boosts its marketing by propagating and advertising female traits to appease the mass population.
The reason these ideologies appear in artificial intelligence is that an algorithm reflects the opinions of those who programme it; it is essentially an encoding of its makers' thoughts. Even though coding was initially dominated by women, because the men were away at war, today it is quite the opposite: the entire technological sphere is dominated by men, with a minuscule portion held by women. It is no surprise that the data fed into AI systems stems from the thoughts and ideologies prevalent among their makers.
If the diversity of people working in technology increases, AI systems will become more inclusive and less biased. Women working in technology could help ensure that the algorithms fed into these machines are neutral and can thus work effectively. Neutral AI would go a long way towards actually reducing gender bias in the world. Women would enter many more fields because hiring would no longer be done by humans, who inevitably have certain ideologies ingrained in them, but by machines with a far more neutral point of view. This would mark the beginning of a new era in which technology is controlled not by a handful of people but by a diverse plurality of them, and a first step towards a future in which talent and calibre are recognised as they are, without any other factor influencing them.