We need to talk about algorithmic bias. With every year that passes, it gets more serious.

I have written about this before, on several occasions, and I gave a talk about it in Maputo in December 2017. More recently, Anjana Susarla wrote about how it affects poorer people, especially in Africa, more than it does people with higher incomes.

Given how Internet network effects concentrate power in the hands of a few dominant digital platforms, it is no surprise that Big Tech companies like Google and Facebook keep popping up when we talk about algorithmic bias (and, by extension, Artificial Intelligence bias). Some people argue that it is “much ado about nothing” since we are (so far) only talking about news feeds and search results. But it is getting more serious than “just” Google and Facebook, and, if we are not careful, Africa will suffer the negative effects of this bias.

Brisha Borden was rated as a higher risk, with a higher probability of committing future crimes, after a petty theft of items worth $80. Vernon Prater was rated as a lower risk, with a lower probability of committing future crimes, after shoplifting items worth $86.35. Both were assessed by a risk-scoring algorithm from Northpointe, a private American company, which is used to allocate risk scores to prisoners. Despite Prater being a hardened criminal and Borden not, she was considered the higher criminal risk. The algorithm has been shown, on numerous occasions, to allocate higher risk scores to black prisoners than to their white counterparts, irrespective of the offenses committed. Source: ProPublica
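
Northpointe’s actual model is proprietary, so the sketch below is purely illustrative, with hypothetical features and weights of my own. It shows how a score that never sees race can still skew along racial lines: if an input such as the number of prior arrests is itself inflated by heavier policing of black neighborhoods, that bias flows straight through to the output.

```python
# Hypothetical risk-score sketch -- the features and weights here are
# invented for illustration and do NOT reflect Northpointe's model.

def risk_score(prior_arrests: int, age: int, employed: bool) -> float:
    """Toy linear risk score; higher means 'riskier'."""
    score = 0.0
    score += 1.5 * prior_arrests   # proxy input: arrest counts reflect
                                   # policing intensity, not just behavior
    score += 2.0 if age < 25 else 0.0
    score -= 1.0 if employed else 0.0
    return score

# Two people with identical behavior: the one from a heavily policed
# neighborhood accumulates more recorded arrests, and the "race-blind"
# model rates them as higher risk.
print(risk_score(prior_arrests=4, age=22, employed=True))  # 7.0
print(risk_score(prior_arrests=1, age=22, employed=True))  # 2.5
```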

Shaping narratives on the web

The problem is getting bigger and more serious. More importantly, it is steadily accelerating in how it shapes narratives on the web.

One of the more recent examples of this is research revealing that Google’s search algorithm allegedly, and consistently, discriminates against women and black people. In this specific example, when you search for “woman” or “girl” on Google, the results may have you picking your jaw up from the floor. As if that is not enough, and to prove the point from the opposite direction, search for “unprofessional hair.”

Now, imagine, like me, your children using Google to do their research and homework.

What narrative are they being sold?

What picture of the world they live in is being painted?

Algorithms as a reflection of society

Google has argued many times, and I understand their argument, that their algorithm merely reflects society. However, it goes deeper than that. As Jonathan Cohn puts it:

"To make matters worse, Google suggests that I narrow down my search results with adjectives ranging from “attractive” to “skinny” to “pregnant.” In contrast, when searching for “men” (a category that also over represents whiteness), the first three adjectives are “cartoon,” “hair style” and “old.” These adjectives may be descriptive, but they also replicate the stereotype that women are primarily valued for their beauty and reproductive organs and men are important for their personality and wisdom."

There’s a flip side to this, and this is where I partially agree with Google; it is also something we need to talk about.

According to Google, their Image Search “analyzes the text on the page adjacent to the image, the image caption and dozens of other factors to determine the image content,” and that’s where we, Africans, have to play our part.
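
To see why that matters, here is a minimal sketch of text-adjacency indexing; the function names and data are my own assumptions, not Google’s actual pipeline. If an indexer associates each image with whatever words appear around it on the page, then the words we publish become the labels the algorithm learns from.

```python
from collections import defaultdict

# Minimal sketch of text-adjacency image indexing -- an assumption-laden
# toy, not Google's actual Image Search pipeline.

index: dict[str, list[str]] = defaultdict(list)

def index_image(image_url: str, caption: str, surrounding_text: str) -> None:
    """Associate an image with every word found near it on the page."""
    for word in (caption + " " + surrounding_text).lower().split():
        index[word].append(image_url)

def search_images(query: str) -> list[str]:
    """Return images whose nearby text contained the query word."""
    return index[query.lower()]

# The results can only reflect the text published alongside the images,
# which is why who writes the web matters.
index_image("maputo-skyline.jpg", "Maputo at dusk",
            "the growing tech scene in Mozambique")
print(search_images("maputo"))  # ['maputo-skyline.jpg']
```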

African digital content platforms

We need to write. We need to podcast. We need to vlog.

We need to tell our stories on the Internet, specifically on our own platforms (not behind the walled gardens of social media platforms), on our own websites out on the open web, where no Jack, Mark, or Larry can censor us or suspend our accounts.

We must tell these stories ourselves, so that, when technology such as Artificial Intelligence reaches critical mass, it can find our data, our content, as told by us.
