Privacy and Black Tech
- May 31, 2018
- Clayton Rice, K.C.
On May 25, 2018, the General Data Protection Regulation of the European Union came into effect, aiming to standardize the protection of personal data across Europe. But it is not a panacea. The new GDPR permits member states to enact derogations in accordance with national needs, and no data protection law can protect us from ourselves or from black tech.
However, the new law allows EU member states to enact modifying legislation in key data privacy areas such as the age of child consent and the processing of biometric data and national identification numbers. As of May 25, 2018, Austria, Germany, Belgium and Slovakia had passed supplemental data privacy laws, while draft bills are being considered in France, Ireland, Spain and the United Kingdom. And the European Union is negotiating the ePrivacy Regulation, which will complement the GDPR “by setting forth rules regarding subject matters that are not within” its scope. (See: Ali Cooper-Ponte. GDPR Derogations, ePrivacy, and the Evolving European Privacy Landscape. Lawfare. May 25, 2018; and, Natasha Singer. The Next Privacy Battle in Europe Is Over This New Law. The New York Times. May 27, 2018)
And, as Ms Cooper-Ponte emphasized in her discussion of derogations in her Lawfare piece, the GDPR codifies a broader right to be forgotten than that established by the Court of Justice of the European Union in Google Spain v Gonzalez, C-131/12, which I reviewed previously on this blog. In Google Spain, the court held that Google is a data controller and could be required to remove information from its search results to protect the right to private life under Articles 7 and 8 of the Charter of Fundamental Rights of the European Union where the results were “inadequate, irrelevant…or excessive in relation to the purposes for which they were processed and in light of the time that has elapsed”. The GDPR now allows data subjects to request removal of personal data where the subject withdraws consent. (See also: On The Wire. The Right to be Forgotten. December 16, 2014)
The GDPR should be of interest not only to Europeans but to Canadians as well. Companies can no longer bury customer data deep in the well of terms-of-service agreements. They must be explicit about the personal data collected and may only use it for the primary purpose intended. “If they want to use it for something else later on,” said Ann Cavoukian, founder of Privacy by Design and former privacy commissioner of Ontario, “they have to come back to you and get positive consent.” According to Robin Mansell of the London School of Economics and Political Science, any company that collects data from European citizens or trades with European companies must comply – including major tech companies, digital media platforms and Canadian companies doing business in the EU. (See: Ramona Pringle. Europe’s tough new data privacy laws will benefit Canadians, too. CBC News. May 24, 2018)
But, as Julia Powles said in her article for The New Yorker, the new law promises “information hygiene” from large data operations but delivers less at the individual level. “Data protection is sold to Europeans as a tool for balance, equality, and autonomy in the digital world,” she wrote. “But it is also a highly individualized regime; the actions of any one person are unlikely to effect change, and so it is comparatively easy for us, as a collective, to yield certain concessions out of convenience, ignorance, or resignation.” There is a “circumspect note to the law’s aspirations”. European parliamentarian and GDPR rapporteur Jan Philipp Albrecht put it this way: “No data protection law protects us from ourselves.”
The defence of privacy requires more than a strong data protection law. Frederike Kaltheuner of London-based Privacy International has identified a “troubling pattern in how police adapt new technologies”. Every time there is a new technology, there is a tendency to use it without necessary safeguards – from surveillance devices targeting mobile phones to facial recognition cameras to mobile phone extraction. Emerging privacy threats, such as the deployment of emotion detection technology in public places, don’t necessarily involve personal data and sometimes the harm is done to entire segments of society. “Data protection organized around the individual is not always able to effectively protect us all from these more collective harms,” Ms Kaltheuner wrote. “Ultimately, it’s important to remember that data protection is about power.” (See: Frederike Kaltheuner. Privacy is power. Politico.EU. May 27, 2018)
Five days after the General Data Protection Regulation became law, the China International Exhibition on Police Equipment was underway in Beijing. Reporting for Reuters from the world’s largest surveillance state, Pei Li and Cate Cadell said this about the latest in black tech: “It can crack your smartphone password in seconds, rip personal data from call and messaging apps, and peruse your contact book.” The XDH-CF-5600 scanner, or mobile phone sleuth, is manufactured by Xiamen Meiya Pico Information Co Ltd, a Chinese provider of security products and devices. (See: Pei Li and Cate Cadell. At Beijing security fair, an arms race for surveillance tech. Reuters. May 30, 2018)
The mobile phone sleuth wasn’t the only scanner on display at the Beijing fair. Hisign Technology apparently has a desktop and portable phone scanner that can retrieve “deleted data from over 90 mobile applications on smartphones, including overseas platforms like Facebook and Twitter”. Hisign also claimed the ability to obtain data from Apple’s iOS operating system. Several other firms told Reuters they could “crack 4-digit passwords on platforms ranging from iOS 6 to iOS 8, and were working to break through security of the latest iOS 10 platform”. Meiya Pico, a Hisign rival, offers the DC-8811 Magic Cube which it markets as “the Swiss Army Knife of forensics” and the larger FL-2000 as the “forensic aircraft carrier”. According to Li and Cadell, the vendors didn’t demonstrate the capability of cracking other Apple systems that use a stronger 6-digit password.
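The vendors’ focus on 4-digit codes, and their silence on Apple’s stronger 6-digit passcodes, reflects simple keyspace arithmetic. A minimal sketch of the numbers involved (illustrative figures only – real-world attack times also depend on hardware-enforced delays such as iOS’s escalating lockouts, which the Reuters report does not quantify):

```python
# Keyspace sizes for numeric passcodes: each added digit multiplies
# the number of possible codes by 10.
four_digit = 10 ** 4   # 10,000 possible 4-digit codes
six_digit = 10 ** 6    # 1,000,000 possible 6-digit codes

# A brute-force attack tries, on average, half the keyspace.
avg_guesses_4 = four_digit // 2
avg_guesses_6 = six_digit // 2

print(four_digit)                    # 10000
print(six_digit)                     # 1000000
print(six_digit // four_digit)       # 100 -> a 6-digit code is 100x harder
print(avg_guesses_4, avg_guesses_6)  # 5000 500000
```

The hundredfold jump in keyspace is why a scanner that breaks 4-digit codes “in seconds” does not automatically scale to 6-digit codes, quite apart from any additional OS-level protections.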
Liu Haifeng, Vice General Manager of Xindehui, a Meiya Pico subsidiary, told a roomful of police officers at the Beijing event that he sees surveillance technology as a positive. “It is impossible for people, especially the younger generations, to live without electronics,” he said. Therefore, suspects trying to escape “can never get away”. But Mr. Liu is wrong. A society under relentless surveillance may be superbly armed to fight crime, but it would be one in which there is no residuum of privacy. The problem with this kind of technology is that it often captures data indiscriminately – from targets and from innocent bystanders who are not suspected of any crime. The forbidding reality that one may “never get away” was at the core of an article titled Does China’s digital police state have echoes in the West? published in the Leaders section of the May 31, 2018, edition of The Economist that began:
“When you walk to work, CCTV cameras film you and, increasingly, recognize your face. Drive out of town, and number-plate-reading cameras capture your journey. The smartphone in your pocket leaves a constant digital trail. Browse the web in the privacy of your home, and your actions are logged and analyzed. The resulting data can be crunched to create a minute-by-minute record of your life.
Between freedom and oppression stands a system to seek the consent of citizens, maintain checks and balances on governments and, when it comes to surveillance, set rules to restrain those who collect and process information. But with data so plentiful and easy to gather, these protections are being eroded. Privacy rules designed for the landline phone, postbox and filing cabinet urgently need to be strengthened for the age of the smartphone, e-mail and cloud computing.”
How, then, to balance freedom and safety? The Leaders piece made three suggestions. First, start by ensuring that the digital world, like the real one, has places where law-abiding people can enjoy privacy. Encryption should not be curtailed for mobile phones. Second, limit how long information on citizens is kept, constrain who has access to it and penalize misuse fittingly. Third, monitor the use of artificial intelligence. Predictive policing systems are imperfect. AI trained with biased data will produce biased results. Some sentencing algorithms are more likely to label black defendants than white ones as being at high risk of reoffending. Such algorithms must be open to scrutiny, not protected as trade secrets.
Vigilance and transparency are the road signs. But it’s a long road.