Sooraj Shah

Contributing Editor

Sooraj Shah is Contributing Editor of New Statesman Tech with a focus on C-level IT leader interviews. He is also a freelance technology journalist.

Amnesty International CIO John Gillespie on the use of AI to detect human rights abuses

Amnesty International is a London-based organisation focused on protecting human rights. It claims to have over seven million members, and has operations in more than 50 countries.

One of the key challenges the organisation faces is getting its local operations to work together effectively, and technology is a key enabler of this. Managing the technology side of the organisation is international CIO John Gillespie, who says that getting people all over the globe, in different organisations and with varying IT competence and resources, to work together is one of the most interesting and challenging initiatives the organisation has.

Amnesty decided to use Microsoft Office 365 to bring as many of its employees across the globe into a single place where they could all work together as one organisation. The choice of Microsoft wasn’t difficult for Gillespie, who says that aside from Google, there are not many other feasible products available.

“You can hypothesise about open source solutions and other types of collaborations but the more you look into this you’re down to Microsoft and Google,” he says.

Information and the impact on the world

Amnesty has been working on using the information it has to make better management decisions.

“A lot of that is business intelligence – so you get an understanding of what is going on in the organisation, producing research and also, more interestingly, what impact that has on the world. The question is about using technology to not just find out what’s going on inside the organisation but what impact it’s having outside – this is crucial for a human rights organisation,” he says.

This is where artificial intelligence (AI) can play a part in helping the organisation to find evidence of human rights abuses from the streams of data available to it, including social media sites such as Twitter, other more traditional media sources and possibly remote sensing satellite imagery. That evidence can then be sent on to its researchers, and then after clarification can be brought to the attention of other organisations or the general public.

“This is about augmenting professionals by providing them with information on human rights abuses – and it’s a powerful avenue for us at the moment,” says Gillespie.

But unlike other industry sectors where one AI tool may be useful for the majority of businesses, charities all face different challenges and have completely different missions.

“If you’re Oxfam running a number of shops then you look more like a retailer, if you’re Red Cross then you look more like a logistics company and if you’re like us you look more like a research and campaign organisation – so the challenge is to look at similar industries and types of organisations and what they’re doing with AI,” Gillespie states.

But the technology is only one side of the equation, according to Gillespie. What is holding back the use of AI is “inspiration or ideas about how these technologies could change what we do and how we do it”.

“Organisations that find an idea and find uses for AI are going to capitalise on it. We’ve had success with the tools that are available when we have found a need for it and a data set that satisfies it. I don’t think it is the tools that are holding us back,” he says.

The most challenging part is finding the right data.

“If you can find data streams that hold valuable information, and ensure the data is clean, that is more critical to success than the tools available,” Gillespie explains.

“We’re doing some work at the moment to analyse Twitter for evidence of abuse against particular female politicians and using machine learning processes to help us,” he states.
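Gillespie doesn’t detail Amnesty’s actual pipeline, but the kind of machine learning he describes – classifying tweets as abusive or not from labelled examples – can be illustrated with a minimal sketch. The example below uses a toy Naive Bayes text classifier written in plain Python; the training tweets, labels, and word-based features are all hypothetical stand-ins, not Amnesty’s data or method.

```python
import math
from collections import Counter


def tokenize(text):
    """Crude tokenizer: lowercase and split on whitespace."""
    return text.lower().split()


class NaiveBayesClassifier:
    """Toy bag-of-words Naive Bayes with Laplace smoothing."""

    def __init__(self):
        self.word_counts = {}   # label -> Counter of word frequencies
        self.class_counts = Counter()

    def train(self, examples):
        # examples: list of (text, label) pairs
        for text, label in examples:
            self.class_counts[label] += 1
            self.word_counts.setdefault(label, Counter()).update(tokenize(text))

    def predict(self, text):
        vocab = set()
        for counts in self.word_counts.values():
            vocab.update(counts)
        total_docs = sum(self.class_counts.values())
        scores = {}
        for label in self.class_counts:
            # Log prior: how common this class is in the training data.
            score = math.log(self.class_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for word in tokenize(text):
                # Laplace (+1) smoothing so unseen words don't zero out the score.
                count = self.word_counts[label][word]
                score += math.log((count + 1) / (total_words + len(vocab)))
            scores[label] = score
        return max(scores, key=scores.get)


# Hypothetical labelled tweets (illustrative only).
training_data = [
    ("you are a disgrace and should shut up", "abusive"),
    ("shut up you stupid woman", "abusive"),
    ("great speech in parliament today", "ok"),
    ("thank you for your great work", "ok"),
]

clf = NaiveBayesClassifier()
clf.train(training_data)
print(clf.predict("shut up you disgrace"))  # classified as "abusive"
```

A production system would of course use far larger labelled corpora and a proper NLP toolkit rather than hand-rolled word counts, but the pipeline shape – collect tweets, label examples, train a classifier, flag new messages for human researchers to review – is the same.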

While data on Twitter is easier to obtain and analyse, the same cannot be said for other social media websites; Gillespie suggests Facebook falls into this category. Amnesty International isn’t sticking to text data either – the organisation is exploring the use of AI to gain insight into YouTube videos and find out if they contain evidence of human rights abuses too.