Africa has been exploited since the European
scramble carved it up along the lines of a draughtsman’s crude
design. Its resources have been pilfered; its peoples
enslaved for trade and profit; its political
conditions manipulated to favour predatory
companies.
A similar pattern is detectable in the
digital world. The slavers have replaced their human product
with data and information. The ubiquitous sharing of
information on social media platforms has brought with it a
fair share of dangerous ills. A $2 billion lawsuit against
Facebook’s parent company Meta, filed in
Kenya’s High Court this month, is a case in
point.
The petitioners, the Kenyan rights group Katiba
Institute and Ethiopian researchers Fisseha Tekle and
Abrham Meareg, argue that Meta failed to employ sufficient
safety measures on the Facebook platform that would have
prevented the incitement of lethal conflict. Most notable
were the deaths of Ethiopians arising from the Tigray War, a
conflict that has claimed tens of thousands of lives and
displaced 2.1 million people.
Abrham Meareg’s case is particularly
harrowing. His father, the chemistry professor Meareg Amare
Abrha, an ethnic Tigrayan, was singled out and harassed
in a number of violent and ethnically inflammatory Facebook
posts. Two posts screeching
with slander (complicity in massacres; aiding military
raids, corruption and theft) and death threats found their
way onto a page named “BDU STAFF”, which sported over
50,000 followers at the time.
The posts also included
the professor’s picture and home locality. Complaints to
the platform by his son received no response, and the posts
remained up for four weeks. Meareg Amare was subsequently
assassinated after leaving work at Bahir Dar University.
According to his son, the killing “was orchestrated by
both state and non-state actors.”
Rosa Curling,
director of Foxglove, a non-profit campaign outfit
supporting the petitioners, is convinced that
the professor would still be alive had the posts been
removed. She also makes
a salient point: “Sadly, ‘engaging’ posts are
often violent or shocking, because people react to them,
share them, comment on them. All those reactions mean the
Facebook algorithm promotes the post more, and can make hate
posts and violence go viral, and spread even
further.”
Meta, in response, has trotted out the
standard, disingenuous deflection, giving us an insight into
a parallel universe of compliance. “We have strict rules
about what is and isn’t allowed on Facebook and
Instagram,” declared
Meta spokesperson Mike DelMoro. “Feedback from local civil
society organizations and international institutions guides
our security and integrity work in
Ethiopia.”
Meta’s content moderation hub for
Eastern and Southern Africa is located in Nairobi. But
questions have been raised about the adequacy of its
staffing and resources. DelMoro claims there is
nothing of interest on that score. “We employ staff with
local knowledge and expertise, and continue to develop our
skills to detect harmful content in the country’s most
commonly spoken languages, including Amharic, Oromo, Somali
and Tigrinya.”
The treatment of staff at Sama, Meta’s
main subcontractor for content moderation in Africa,
is also the subject of a separate
lawsuit. That action alleges forced labour
and human trafficking, unequal labour relations, attacks on
unions, and a failure to provide sufficient mental health and
psychosocial support to hired moderators.
Abrham
Meareg and his fellow petitioners are demanding that
Facebook halt the viral spread of hate, demote content
that incites violence, and employ greater numbers of
content moderators versed in a range of languages. The legal
filing also demands that Meta issue an apology for the
professor’s death and establish a restitution fund for
victims of hate speech or misinformation posted on the
company’s platforms, including Facebook and
Instagram.
Such actions are becoming regular fare, and all
tend to follow a similar blueprint. In December last year, a
class action complaint was lodged with the Northern District
Court in San Francisco claiming that Facebook was “willing
to trade the lives of the Rohingya people for better market
penetration in a small country in south-east Asia.” The
language proved instructive: a company operating in the
traditional mercantilist mould, a plunderer of
resources, its gold the product of surveillance
capitalism.
Lawyers representing the petitioners also
submitted a letter to Facebook’s UK office stating that
their clients had been subjected to acts of “serious
violence, murder and/or other grave human rights abuses”
as part of a genocidal campaign waged by the military regime
and aligned extremists in Myanmar.
As with the case
lodged in Kenya’s High Court, the grounds against Facebook
were that its algorithms amplified hate speech against the
Rohingya populace; it failed to adequately invest in local
moderators and diligent fact-checkers; it failed to remove
posts inciting violence against the Rohingya; and it did not
shut down or delete specific accounts, groups and pages that
encouraged ethnic violence.
Despite such actions,
there is nothing in the way Meta operates to suggest a
change in approach. As long as the wallets stretch,
platforms such as Facebook will continue to use devilish
algorithms to boost bad behaviour. In the scheme of things,
such behaviour, however hateful or misinformed, sells. The
dragon of surveillance capitalism continues to thrive with
fire-breathing menace.
Dr. Binoy Kampmark was a
Commonwealth Scholar at Selwyn College, Cambridge. He
currently lectures at RMIT University. Email: bkampmark@gmail.com
© Scoop Media