
Megha Mohan, BBC World Service gender and identity correspondent
A gruesome murder in her own family inspired South African Leonora Tima to create a digital platform where people, mostly women, can speak out and track violence.
Leonora’s relative was just 19 and nine months pregnant when she was killed, her body dumped on the side of a highway near Cape Town in 2020.
“I work in the development sector, so I’ve seen violence,” says Leonora. “But what stood out to me was that the violent death of a family member is seen as so normal in South African society.
“Her death was not covered by any news bulletin, because the sheer volume of these cases in our country means it cannot qualify as news.”
The killer was never caught, and what Leonora saw as a silent acceptance of a woman’s violent death became the catalyst for her app Gender Rights in Tech (Grit), which features a chatbot called Zuzi.
It is one of the first free AI tools built by African developers to address gender-based violence.
“This is an African solution developed together with African communities,” says Leonora.
The aim is to offer support and help gather evidence that can later be used in legal cases against abusers.
The initiative is gaining interest among international women’s rights activists, although some warn that chatbots should not be used to replace human support, stressing that survivors need the empathy, understanding and emotional connection that only a trained professional can provide.
Leonora and her small team visited communities in the neighborhoods around her home in Cape Town, talking to residents about their experiences with violence and the ways technology fits into their lives.
They asked more than 800 people how they use their phones and social media to talk about abuse, and what stops them from seeking help.
Leonora found that people want to talk about their abuse but “stay away from traditional routes like the police”.
“Some women would post about it on Facebook and even tag their abuser, only to be served with defamation papers,” she says.
She felt that existing systems failed victims twice, first by failing to prevent the abuse itself, and then again when victims tried to speak out.
With financial and technical support from Mozilla, the Gates Foundation, and the Patrick McGovern Foundation, Leonora and her team began developing Grit, a mobile app that can help people record, report, and respond to abuse as it happens.
The app is free to use, although downloading it requires mobile data. Leonora’s team says it has 13,000 users and received about 10,000 requests for help in September.
At its core, Grit is built around three key features.
There is a large, round help button on the home screen. When pressed, it automatically starts recording 20 seconds of audio, capturing what’s happening around the user. At the same time, it triggers an alert to a private rapid response center – professional response companies are common in South Africa – where a trained operator calls the user.
If the caller needs immediate help, the response team either sends someone to the scene or contacts a local victim support organization that can.
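The flow described above (press, record, alert, operator call-back) could be sketched roughly as follows. Everything here, from the function names to the `Alert` structure, is a hypothetical illustration of the described behavior, not Grit's actual code:

```python
import time
from dataclasses import dataclass

RECORD_SECONDS = 20  # the article says 20 seconds of audio are captured


@dataclass
class Alert:
    user_id: str
    timestamp: float
    audio: bytes


def record_audio(seconds: int) -> bytes:
    """Stand-in for the device microphone; returns captured audio bytes."""
    return b"\x00" * seconds  # placeholder: one byte per second of 'audio'


def notify_response_center(alert: Alert) -> str:
    """Stand-in for the call to the private rapid response center."""
    return f"operator will call {alert.user_id}"


def press_help_button(user_id: str) -> str:
    # Both steps happen together: capture ambient audio and raise the alert,
    # after which a trained operator calls the user back.
    audio = record_audio(RECORD_SECONDS)
    alert = Alert(user_id=user_id, timestamp=time.time(), audio=audio)
    return notify_response_center(alert)
```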
The app was created with the needs of victims of abuse at its heart, says Leonora: “We need to earn people’s trust. These are communities that are often overlooked. We ask a lot from people when it comes to sharing data.”
When asked if the help feature has been misused, she admits there have been a few curious users testing whether it really works, but nothing she would call abuse of the system.
“People are wary. They’re testing us as much as we’re testing the technology,” she says.
The second element of Grit is the “vault,” which Leonora says is a secure digital space where users can store evidence of abuse, dated and encrypted, for possible use later in legal proceedings.
Photos, screenshots and voice recordings can be uploaded and saved privately, protecting important evidence from deletion or tampering.
“Sometimes women take pictures of injuries or save threatening messages, but they can be lost or deleted,” says Leonora. “The vault means the evidence isn’t just in a phone that can be taken away or destroyed.”
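The vault's core idea, timestamped storage that makes deletion or tampering detectable, can be illustrated with a short sketch. This uses an HMAC digest for tamper evidence rather than full encryption, and the key and function names are hypothetical, not Grit's implementation:

```python
import hashlib
import hmac
import time

SECRET_KEY = b"demo-key"  # hypothetical; a real vault would use per-user keys


def seal_evidence(item: bytes, key: bytes = SECRET_KEY) -> dict:
    """Store an item with a timestamp and an HMAC tag so tampering is detectable."""
    ts = time.time()
    tag = hmac.new(key, item + str(ts).encode(), hashlib.sha256).hexdigest()
    return {"data": item, "timestamp": ts, "tag": tag}


def verify_evidence(record: dict, key: bytes = SECRET_KEY) -> bool:
    """Recompute the tag; any change to the data or timestamp fails the check."""
    expected = hmac.new(
        key, record["data"] + str(record["timestamp"]).encode(), hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, record["tag"])
```

A photo or screenshot sealed this way carries its own date and integrity check, which is the property that matters if it is later produced in legal proceedings.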
This month, Grit will expand again with the launch of its third feature, Zuzi, an AI-based chatbot designed to listen, advise and guide users to local community support.
“We asked people, ‘Should it be a woman? Should it be a man? Should it be a robot? Should it sound like a lawyer, a social worker, a journalist or some other authority figure?'” explains Leonora.
People told the team they wanted Zuzi to be an “auntie figure”, someone warm and trustworthy they can confide in without fear of judgment.
Although it was created primarily for abused women, during the testing phase Zuzi was also used by men seeking help.
“Some of the calls are from perpetrators, men, who ask Zuzi to teach them how to get help with their anger issues, which they often direct at their partners,” explains Leonora. “There are also men who are victims of violence and have used Zuzi to talk more openly about what they experienced.”
“People like talking to AI because they don’t feel judged by it,” she adds. “It’s not human.”
UN Women reports that South Africa experiences some of the highest rates of gender-based violence (GBV) in the world, with a female homicide rate that is five times the global average. Between 2015 and 2020, an average of seven women were killed every day, according to South African police.
Many, including Lisa Vetten, an expert on gender-based violence in South Africa, agree that it is inevitable that technology will play a role in addressing it.
But she also cautions against using AI in trauma-focused care.
“I call them large language models, not artificial intelligence, because they do linguistic analysis and prediction—nothing more,” she says.
She can see how AI systems can help, but she knows of examples where other AI chatbots have given incorrect advice to women.
“I worry when they give women very confident answers to their legal problems,” she says. “Chatbots can provide useful information, but they are unable to deal with complex, multifaceted difficulties. Most importantly, they are not a substitute for human counseling. People who have been hurt need to be helped to trust and feel safe with other human beings.”
Grit’s approach has attracted international attention.
In October, Leonora and her team presented their app at the Feminist Foreign Policy Conference organized by the French government in Paris, where world leaders met to discuss how technology and politics can be used to build a more gender-equal world. At the conference, 31 countries signed a pledge to make addressing gender-based violence a key political priority.
Conversations are buzzing around the use of AI, says Lyric Thompson, founder and head of the Feminist Foreign Policy Collaborative, “but the moment you try to bring gender into the conversation, to bring up the danger of bringing in racist, sexist, xenophobic biases, eyes glaze over and the conversation moves — probably to a back hallway where there aren’t pesky women to bring it up.”
Heather Hurlburt, an associate fellow at Chatham House who specializes in AI and its use in technology, agrees that AI “has huge potential to either help identify and address gender-based discrimination and gender-based violence, or to validate misogyny and inequality,” but adds that which path we take depends “a lot on us.”
Leonora is clear that the success of AI to address gender-based violence depends not only on the engineering, but also on who can design the technology in the first place.
A 2018 World Economic Forum report found that only 22% of AI professionals globally are women, a statistic that is still often cited.
“AI as we know it now was built with historical data that centered the voices of men, and white men in particular,” says Leonora.
“The answer isn’t just about having more women developers. We also need developers who are women of color, more from the Global South, and more from less privileged socioeconomic backgrounds.”
Only then, Leonora Tima concludes, can technology begin to represent the realities of those who use it.