Centre for Internet & Society

At the start of his presentation, Sunil Abraham pointed to two overhead drawings of cybercafes: one where each computer was part of a private booth, and one where the computers were in the open so the screens would be visible to anyone. Which layout would be friendlier to women, and why, Abraham wanted to know. Some participants selected the first option, liking the idea of privacy, while others preferred the second, so that the cybercafe owner would be able to monitor users’ activities.

Abraham said he was surprised no one said option one looked like masturbation booths, adding that in May, India passed rules prohibiting the first design to avoid just such an issue. This is despite a survey of female college students who said they liked the idea of privacy in cybercafés, which typically are male-dominated spaces.

Cybercafes are just one of the areas impacted by India’s plan for collecting and using biometrics to create unique individual identification cards.

Abraham focused his presentation on activists’ efforts to counter the government’s myths about a unique identification (UID) program.

One campaign image showed two soldiers at the border asking an East Asian-looking person for identification. The way to balance, or rectify, the scenario, Abraham said, would be to allow citizens to ask the soldiers for their identification information in return.

The campaign, “Rethink UID Project,” included several images illustrating various problems with the plan. For example, one said: “Central storage of keys is a bad idea, so is central storage of our biometrics.” As Abraham explained, if storing a copy of your house key at the police station would not make you feel more secure, why should storing your biometrics with the government be any less worrying?

In the Indian scheme, Abraham said, the government says biometrics will be used as an authentication factor to prove your identity, but from a computer science perspective it is a bad idea because biometrics are so easy to steal. And, as Abraham pointed out, if your biometrics are stolen, you cannot re-secure them; it is not like getting a new ATM card and password, he said.

If this system of national UID was designed using digital keys instead of biometrics, then we would have a completely different configuration, Abraham said.

With biometrics, centralized storage is non-negotiable, and therefore authentication must go through a centralized database. With digital keys or digital signatures, authentication could instead be done on a peer basis, so a citizen could authenticate border guards and vice versa.
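The contrast can be made concrete with a small sketch of challenge-response authentication using digital signatures. The numbers below are toy textbook RSA values chosen for readability, not a real key pair, and the “border guard” scenario is my illustration of the peer model Abraham describes:

```python
import hashlib

# Toy textbook RSA parameters -- illustration only, far too small to be secure.
p, q = 61, 53
n = p * q            # 3233
e = 17               # public exponent, known to everyone
d = 2753             # private exponent, held by the party being authenticated

def sign(message: bytes) -> int:
    """The border guard signs a challenge with the private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Any citizen can check the signature with only the public key (e, n);
    no lookup against a central database is needed."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

challenge = b"prove you are border guard #42"
sig = sign(challenge)
assert verify(challenge, sig)                # genuine guard passes
assert not verify(challenge, (sig + 1) % n)  # tampered signature fails
```

Because verification needs only the public key, it can happen offline and in either direction; the secret never leaves its holder’s possession, unlike a biometric, which must be surrendered at every check.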

Another image from the “Rethink UID Project” campaign pointed out that “Technology cannot solve corruption.” As Abraham said, problems of corruption in the subsidy system (food, loans, education, the rural employment guarantee act, etc.) won’t be fixed with biometrics. For example, if biometric equipment is installed at fair-price shops, before the shop owner gives the grain, the citizen would have to present biometrics, which would be authenticated against a centralized server; then the citizen would get the grain, and ultimately there would be a record saying this particular citizen collected this amount of subsidized grain at this particular time.

But there are a whole range of ways shop owners can compromise the system, Abraham said.

The first way: 30 to 50 percent of India is illiterate, so the shop owner can claim the biometrics were rejected by the server and the citizen would not know better. Or the owner can say there was no connectivity so authentication did not go through, or that there was no electricity so the system would not work, or the owner could hand over only part of the grain the citizen is due.
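The trust gap Abraham describes can be sketched in a few lines: the central server’s verdict reaches the citizen only through the shop owner, so a dishonest owner can report a rejection regardless of what the server said. All names here are hypothetical; this is a sketch of the described flow, not the real system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TransactionRecord:
    uid: str
    grain_kg: float
    timestamp: datetime

def fair_price_shop(uid: str, entitlement_kg: float,
                    server_accepts: bool, owner_is_honest: bool):
    """The citizen presents biometrics; the server authenticates them;
    but the owner relays the verdict, and an illiterate citizen cannot
    tell a genuine rejection from a faked one."""
    owner_reports_accepted = server_accepts and owner_is_honest
    if not owner_reports_accepted:
        return None  # "rejected" -- no grain, and no record of the denial
    return TransactionRecord(uid, entitlement_kg, datetime.now(timezone.utc))
```

Note that a faked rejection leaves no trace at all, which is part of why Abraham’s alternative makes the government transparent to the citizen instead of the reverse.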

Corruption innovates, just as terrorism innovates: if technology innovates, so does corruption, because corruption is not a static phenomenon, Abraham said. You cannot wish human beings away from technological configurations.

Under the current plan, a single village would end up with multiple biometric readers.

Abraham said they have proposed an alternative schema: remove readers from the shop, school, hospital, bank, etc., and have only one scanner at the local governance hall. Instead of the citizen becoming transparent to the government, the government should become transparent to the citizen. Shop owners should make transparent which IDs they have given how much grain to, and only if they dispute the ID of a citizen would they go to the local government administrative office to verify it.

Another image from the “Rethink UID Project” campaign said, “The poor and the rich: who do we track first?”

Abraham explained that one problem in India is “black money,” or untaxed money stored in accounts under fake names. Just as fake bank accounts are created, he said, it would be easy to create fake biometrics, combining the handprints and irises of multiple people to produce a second, fake ID. The system could also be hacked into and iris images Photoshopped. Ghost IDs could likewise be created and then sold off. Because the rich will get their IDs behind closed doors, Abraham said, it will be easy for them to obtain multiple IDs, but the poor will not be able to.

Referring to “tailgating,” when one ID card is swiped to gain entrance for multiple people, such as one metro card being swiped and two people walking through, Abraham noted that tailgating is only treated as a problem at the bottom of the pyramid: for example, one woman goes to the fair-price shop to collect grain for five or six families so that only one person loses a day’s wage instead of all five or six. Tailgating at the bottom of the pyramid is usually a question of survival, he said.

Thus, another image from the campaign showed a pyramid and said, “Transparency at the top first…before transparency at the bottom.”

The first principle is that expectations of privacy should be inversely proportional to power: people who are really powerful, such as politicians or heads of NGOs and corporations, should have less privacy, and people who have very little power should have more, Abraham said.

Also, from a business perspective, the nation gets greater return on its investment if surveillance equipment is trained on people at the top of the pyramid to catch big-time corruption, he said.

Most of the panic around the UID is over the transaction database. Beyond a database storing everyone’s biometrics, another database will track transactions: every time you buy a mobile phone, purchase a ticket, access a cybercafé, or collect subsidies, thanks to UID a record will be made in the transaction database, Abraham said.

Abraham said it is important to note that surveillance is not an intrinsic part of information systems, but once surveillance is engineered into information systems, both those with good intentions and those with bad intentions can take advantage of that surveillance capability.

The UID means there will be 22 databases available to 12 intelligence agencies, he said.

So when a girl enters a cybercafé, first she will have to provide her UID; then the café owner will photocopy the card; then the owner has the right to take a photo of her with his own camera; and then the owner is supposed to maintain browser logs from her computer for a period of one year.

The question, then, is how to ensure accountability without surveillance.

The first possibility, Abraham said, is partial storage. The transaction database could store half the data and the central database the other half, so the full 360-degree view of the data would not be available without a court order.
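One way to read partial storage is as a two-way secret split, where neither database alone can reconstruct a record. Below is a minimal sketch using an XOR split; this is my illustration of the general idea, not the scheme’s actual design:

```python
import secrets

def split(record: bytes) -> tuple[bytes, bytes]:
    """Split a record into two shares; each share alone is uniform noise."""
    share_a = secrets.token_bytes(len(record))               # goes to database A
    share_b = bytes(x ^ y for x, y in zip(record, share_a))  # goes to database B
    return share_a, share_b

def reconstruct(share_a: bytes, share_b: bytes) -> bytes:
    """Only with both shares, e.g. under a court order, does the record reappear."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

record = b"UID 1234 bought a SIM card at 10:02"
a, b = split(record)
assert reconstruct(a, b) == record
```

Because each share is statistically independent of the record, a breach of either database alone reveals nothing.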

The second solution is a transaction escrow: every time a record is put into the main database, it would be encrypted using two or three keys, and only if all the key-holding agencies cooperate can the information be decrypted. Thus it becomes targeted surveillance, not blanket surveillance.
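Such an escrow could be sketched as layered encryption: each record is sealed under keys held by separate agencies, and removing the layers requires every key. A toy XOR-based illustration of the idea, not the actual proposal:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def escrow(data: bytes, keys: list[bytes]) -> bytes:
    """Layer the data under every agency's key. XOR layering is symmetric,
    so the same function decrypts -- but only if *all* keys are supplied;
    with any key missing, the output stays indistinguishable from noise."""
    for k in keys:
        data = xor_bytes(data, k)
    return data

# Three agencies each hold one independent key for this record.
keys = [secrets.token_bytes(32) for _ in range(3)]
record = b"citizen X entered cybercafe Y".ljust(32)

ciphertext = escrow(record, keys)
assert escrow(ciphertext, keys) == record      # all three agencies cooperate
assert escrow(ciphertext, keys[:2]) != record  # two agencies are not enough
```

This is what makes the surveillance targeted rather than blanket: each decryption requires a deliberate multi-party act, which can itself be logged.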

To conclude his presentation, Abraham divided participants into four groups in order to design surveillance systems for internet surveillance, mobile technologies, CCTVs, and border control.

Sharon Strover spoke on behalf of the CCTV group, saying they ended up with more questions than anything else. They agreed there should be notices when cameras are in use, there should be public knowledge of who is doing surveillance and who has access to the footage, and the data shouldn’t be sold. But the group couldn’t decide which spaces warranted CCTVs and which not.

Abraham then pointed out that the next generation of CCTVs can read everybody’s irises as they pass the cameras; it is in the lab now and two to three years from market, he said.

Next, Andy Carvin spoke on behalf of the mobile technologies surveillance group. Whether it is metadata or content, the mobile phone company can collect it, but it should not be able to keep any identifiable information about the person; it should only be able to look at information in the aggregate. The rest of the information should be shipped to a non-governmental organization or a government agency specialized in privacy, and two keys would be required to access it: one from the judiciary and one from the NGO or governmental agency.

Smári McCarthy reported back for the Internet surveillance group, pointing out that in one study data retention was useful in criminal cases less than 0.2 percent of the time, and another study showed no statistically significant increase in the number of criminal cases solved because of data retention. So, he said, the group concluded there should be no blanket surveillance, only court orders in certain criminal cases that define who will be under surveillance and for how long. They also wanted a public transparency register so people could learn how many individuals are under surveillance, currently and throughout the year, along with other general information, such as the success rate: how many of these surveillance operations have led to criminal convictions or similar.

Finally, Summer Harlow spoke on behalf of the border control group, which said scanning of checked- and carry-on luggage is acceptable, but there should be no luggage searches without specific probable cause from intelligence agencies or if the scans pick up weapons or other contraband. Similarly, people could be subject to spectrum scans and drug/bomb sniffing dogs for weapons and contraband, but again they would not be physically searched by border agents without probable cause. Also, people and luggage could not randomly be searched based on the country of their passport or their flight destination or origin.

In summary, Abraham said, surveillance is like salt in food: it is essential in small amounts, but completely counter-productive if even slightly excessive. 

  • Download Sunil's presentation here [PDF, 1389 kb]
  • Sunil Abraham made the presentation at the Gary Chapman International School on Digital Transformation on 21 July 2011. The original news published by International School on Digital Transformation can be read here