UN wants to improve privacy-enhancing technologies (PETs)
The UN PETs Lab, a project of the United Nations Statistics Division, is developing new techniques aimed at improving data privacy. Officially inaugurated on 25 January, the lab will work with publicly available data from UN Comtrade and cooperate with national statistical offices from the United States, Britain, Canada, Italy, and the Netherlands, as well as academic researchers and the private sector, to test so-called privacy-enhancing technologies (PETs).
According to The Economist’s website, the lab has already tested several categories of PETs. One test involved an Oxford-based charity and a technique called secure multi-party computation (SMPC). In this approach, the data to be analyzed is encrypted and remains on the premises of its owner. The organization doing the analysis – in this case, OpenMined – sends its algorithm to the data’s gatekeeper, who runs it on the encrypted data.
In another test, Dublin-based Oblivious Software tried out “trusted execution environments”. Here, the data sets are encrypted by their owner and then sent to a highly secure server, where processing can be audited and the data is completely removed when the job finishes.
“Senior leaders are currently talking about privacy-enhancing technologies aimed at cross-border and cross-sector collaboration to solve shared challenges. With PETs, it will be possible to protect shared values such as privacy, accountability, and transparency. They will be essential to help improve official statistics and support democratic societies, honoring citizens’ right to reliable public information,” said Stefan Schweinfest, Director of the UN Statistics Division.
The UN PETs Lab plans to deepen testing with more data and recruit more agencies to join its list. The US and Britain plan to launch a “grand challenge” for PET systems, rewarding the best entries.
PETs aim to enable secure data sharing by using encryption and other protocols to process data without having to decrypt it. They also ensure that data is protected throughout its entire life cycle.
Decisions on key issues such as the economy, the environment, and health can benefit from data provided by other countries, hence the involvement of entities like the United Nations in this field. For example, when PET techniques are applied, data used to train artificial intelligence systems and statistical models – whether to improve medical diagnoses or to compute key indicators of national economic performance – can be shared securely and privately. Research indicates that up to US$ 3 trillion of global GDP could be unlocked by greater international data sharing.
However, the absence of privacy-enhancing technologies coupled with strict privacy regulations limits the ability of governments and institutions to share valuable information. As a result, very little of the data generated globally is being used for analytical and collaborative purposes.
“An accurate view of the global picture remains vital to provide statistics on the health and performance of economies, for example. Current methods of collecting and disseminating statistical information are not sustainable, so privacy-enhancing technologies are important. We are investing today in practical ways to test these emerging technologies because we believe they will provide a vision of how we can continue to serve the population with reliable data while maintaining privacy and security,” said Ron Jarmin, deputy director of the US Census Bureau.
PETs can encompass various techniques to extract value from data without compromising privacy or security. Below are some of them:
1) Homomorphic Encryption: Allows data to be processed without having to be decrypted. Encrypted data can be transferred and analyzed as-is, and only the results of the analysis are then shared. Its main application: confidential information that today can hardly be handed to third parties for analysis can be shared safely.
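As an illustration of "processing without decrypting", here is a minimal sketch of an additively homomorphic scheme using textbook Paillier encryption. The tiny key size and the specific values are chosen for readability only; real deployments use keys of 2048 bits or more.

```python
import math
import random

def L(x, n):
    return (x - 1) // n

def keygen(p, q):
    """Textbook Paillier key generation with g = n + 1."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)        # modular inverse of lambda mod n
    return n, lam, mu

def encrypt(m, n):
    n2 = n * n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c, n, lam, mu):
    n2 = n * n
    return (L(pow(c, lam, n2), n) * mu) % n

# toy primes (illustrative only -- far too small for real security)
n, lam, mu = keygen(104729, 104723)
c1, c2 = encrypt(12345, n), encrypt(67890, n)
# multiplying ciphertexts adds the plaintexts -- no decryption of the inputs
c_sum = (c1 * c2) % (n * n)
assert decrypt(c_sum, n, lam, mu) == 12345 + 67890
```

The key property is on the last two lines: whoever holds only `c1` and `c2` can compute an encryption of their sum without ever seeing the underlying values.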
2) Secure Multi-party Computation (SMPC): A related family of techniques that allows multiple parties to jointly compute over their combined data without any party revealing its inputs to the others. For machine learning models this is especially welcome, because the larger the volume of data, the better the results.
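A minimal sketch of one classic SMPC building block, additive secret sharing: each party splits its value into random shares, so the joint sum can be computed even though no party ever sees another's raw input. The party names and values are hypothetical.

```python
import random

Q = 2**61 - 1  # large prime modulus; all share arithmetic is done mod Q

def share(secret, n_parties):
    """Split a value into n random additive shares that sum to it mod Q."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

# e.g. three hospitals' private patient counts (illustrative values)
secrets = [120, 345, 78]
n = len(secrets)

# each party splits its secret and distributes one share to each party
all_shares = [share(s, n) for s in secrets]

# each party sums the shares it received; individual shares look random
partial_sums = [sum(all_shares[p][i] for p in range(n)) % Q for i in range(n)]

# combining the partial sums reveals only the total, not any input
total = sum(partial_sums) % Q
assert total == sum(secrets)
```

Each individual share is uniformly random, so it carries no information about the secret on its own; only the final recombination discloses the aggregate.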
3) Differential Privacy: Guarantees that the output of an analysis does not leak information about any individual. It allows patterns to be described at the level of groups within a data set, typically by adding calibrated statistical noise, while preserving personal privacy.
4) Zero-Knowledge Proofs (ZKP): Allow one party to prove that a statement is true without revealing the underlying information.
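A minimal sketch of differential privacy via the Laplace mechanism: a counting query has sensitivity 1, so adding Laplace noise with scale 1/epsilon makes the released count epsilon-differentially private. The data set and query are hypothetical.

```python
import random

def laplace_noise(scale):
    # the difference of two exponentials is Laplace(0, scale)
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(records, predicate, epsilon):
    """Counting query with Laplace noise; a count has sensitivity 1."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# hypothetical record set: ages of survey respondents
ages = [23, 35, 41, 29, 67, 52, 31, 44]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
# noisy is close to the true count (4) but randomized, so no single
# individual's presence can be confidently inferred from the output
```

Smaller values of `epsilon` mean more noise and stronger privacy; the analyst trades accuracy for a provable bound on what any one record can reveal.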
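A minimal sketch of a zero-knowledge proof, using one round of the classic Schnorr identification protocol: the prover convinces the verifier it knows the secret exponent `x` behind a public value `y`, without transmitting `x`. The parameters are toy-sized for readability; real systems use much larger groups.

```python
import random

# public parameters (illustrative only -- far too small for real security)
p = 2_147_483_647        # prime modulus (2^31 - 1)
g = 7                    # public base
q = p - 1                # exponent arithmetic is done mod p - 1

# prover's secret x; everyone knows y = g^x mod p
x = 123456789
y = pow(g, x, p)

# one round of the interactive protocol
r = random.randrange(1, q)   # prover picks a random nonce
t = pow(g, r, p)             # commitment sent to the verifier
c = random.randrange(1, q)   # verifier replies with a random challenge
s = (r + c * x) % q          # prover's response; x itself is never sent

# verifier's check: g^s == t * y^c (mod p); passing it proves knowledge
# of x, yet the transcript (t, c, s) reveals nothing about x
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

Repeating such rounds drives the chance of a cheating prover passing toward zero, while the verifier still learns nothing beyond "the statement is true".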