Surveillance has become so ubiquitous that Russia appears to have been caught in the act of conspiring to fix the 2016 United States presidential election, and at least one of Donald Trump’s campaign staffers was apparently overheard conspiring with them.
Politicians aren’t the only ones being watched. Edward Snowden’s 2013 revelations detailing the US National Security Agency’s widespread surveillance have made clear that, these days, everyone should be thinking about privacy and security.
That includes academics, some of whom are undertaking sensitive, even dangerous, research. How can we work safely and ethically in an era of internet spying and wiretapping?
Weaponising your own research
This question is particularly salient for scholars who work on peace and justice organising: recent leaks confirm that the military (or the police) may not only be reading your published work – they could also be tracking your online activity, monitoring your whereabouts and even listening in on your conversations.
Leaked files from the IT security firm Hacking Team confirm that its software is widely used around the world to listen to ambient conversations held in a room with a cell phone, even when the phone is off.
That opens up the ethically distressing possibility that your research can be weaponised – used by armed actors to do harm.
Geographers are particularly vulnerable to this threat. In 2007, the American Anthropological Association denounced the US Army’s Human Terrain System, which embedded social scientists in military teams in Iraq and Afghanistan, as “an unacceptable application of anthropological expertise”. Since then, the US military’s attempts to know (and control) the so-called human terrain have shifted to geography.
As a result, we see a fast-growing trend of geographers being offered military funding for research, often through front organisations such as the US Department of Defense’s Minerva Research Initiative.
The army’s new favour for geographers went largely unchallenged: for years, the American Association of Geographers (AAG) refused to take action on a military-related scandal. Researchers led by Peter Herlihy at the University of Kansas, who were doing participatory mapping with indigenous groups in Oaxaca, Mexico, failed to disclose both their US military funding and the fact that they were thus sharing research findings with their donors.
That’s unethical anywhere, but it’s particularly problematic in Oaxaca: the US military likely shared that detailed GIS information about Zapoteco communities with the Mexican military, which has long repressed those indigenous communities.
In early April 2017, the AAG finally agreed to form a study group to examine the issue of ties between the discipline and the US, UK and NATO armed forces.
Even if you’re an academic who doesn’t accept military funding, your findings may already have been added to the military’s huge databases without your knowledge (the citation is unlikely to surface in a Google Scholar search).
Karen Morin of Bucknell University, for example, discovered that her chapter on interpreting landscape had been cited in a Marines operational guide. Its subject: reading the cultural landscape correctly can enable troops to immediately control a population upon arrival.
It is very hard to track down this sort of misappropriation of your work. But you can keep it in mind when publishing. Ask yourself: who might want this information, and could it in any way be used to do harm?
Academics should also be aware that unpublished research data can be hacked. I found this out the hard way, when the email account of the Fellowship of Reconciliation, a group I was doing research with and regularly emailing, was hacked by Colombian intelligence, and the emails were used to prosecute a human rights activist on trumped-up charges.
Protect yourself (and your sources)
These basic steps can prevent your data being similarly hacked and misused.
1) Add two-step verification to your email. For Gmail, enable this option in the security section of your account settings; then, when you log in from a new computer, you will be asked to enter a code texted to your phone. You can also download a list of ten backup codes to use when you are out of cell coverage.
2) Encrypt your computer. Or, more realistically, encrypt one folder on it, which is where you will store those backup codes and other secure information. Beware that encryption will slow older computers down. Also encrypt all data every time you do a backup, and set up two-step verification on backups.
3) Put away your phone. You can now record long interviews on most phones. But if you at all suspect that the content of that interview could be misused in any way, by anyone, and particularly by armed actors, use a small digital recorder instead.
4) Get away from your phone. Simply turning off your phone is not enough; hackers can still record ambient conversations. A safer bet is to keep the phone outside of the room. (Remember to also take along another timepiece if you usually depend on your phone for that.)
5) Destroy the evidence. When you write field notes by hand, snap a photo of them and save the images behind encryption, then destroy the paper copy.
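One way to put the encryption advice in steps 2 and 5 into practice from the command line is with OpenSSL, which ships with most Unix-like systems. This is only a minimal sketch: the filenames and passphrase below are placeholders, and for real use you would let the tool prompt for the passphrase (or use a dedicated tool such as GnuPG or your operating system’s full-disk encryption) rather than typing it on the command line, where it can end up in your shell history.

```shell
# Hypothetical field-note file for illustration.
echo "interview notes" > fieldnotes.txt

# Encrypt with AES-256 and a salted, PBKDF2-derived key.
# (Passphrase shown inline only for the example.)
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in fieldnotes.txt -out fieldnotes.txt.enc \
  -pass pass:correct-horse-battery

# Destroy the plaintext copy; shred overwrites before deleting
# where available, otherwise fall back to plain removal.
shred -u fieldnotes.txt 2>/dev/null || rm -f fieldnotes.txt

# Later, decrypt to read your notes.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in fieldnotes.txt.enc \
  -pass pass:correct-horse-battery
# prints: interview notes
```

The same pattern works for the photographed notes in step 5 and for backup archives in step 2: encrypt first, verify you can decrypt, and only then delete the unencrypted original.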
Do I sound paranoid? Most researchers, after all, are hardly embarking on James Bond-like missions.
Think what you like, but recent revelations have shown that governments around the world have purchased software that can listen to conversations in a room through your smartphone.
The community organisers, political activists, rogue scientists, indigenous rights defenders and environmentalists we routinely talk to as part of our research can become targets of government retaliation.
Given the high levels of surveillance and the growing weaponisation of research, caution is warranted. What it means to do ethical research has changed, and that should be reflected in both our own research methods and our methods classes.