
Microsoft And Harvard University Team Up To Launch Open Source Platform For Differential Privacy

  • The tech giant said that a royalty-free license under Microsoft’s own differential privacy patents will be granted to the world through OpenDP
  • As the platform is open source, experts can directly validate the implementation

Microsoft began developing a differential privacy platform in collaboration with Harvard University’s Institute for Quantitative Social Science last year. Working through the OpenDP Initiative, the partners set out to create an open solution that keeps individual data private while still giving researchers insights drawn from large amounts of data. Microsoft has now announced that the platform has launched and is available on GitHub.

Julie Brill, CVP, deputy general counsel, and chief privacy officer at Microsoft said, “We need privacy enhancing technologies to earn and maintain trust as we use data. Creating an open source platform for differential privacy, with contributions from developers and researchers from organizations around the world, will be essential in maturing this important technology and enabling its widespread use.”

Differential privacy patents

The tech giant said that a royalty-free license under Microsoft’s own differential privacy patents will be granted to the world through OpenDP, allowing anyone using the platform to make their datasets broadly available to others around the globe. Differential privacy achieves this through a mathematical framework that relies on two mechanisms to protect personally identifiable or confidential information within datasets, Microsoft said.

It added that a small amount of statistical “noise” is added to each result to mask the contribution of individual data points. This noise protects the privacy of an individual without significantly affecting the accuracy of the answers extracted by analysts and researchers, according to the tech giant. It added that the amount of information revealed by each query is calculated and deducted from an overall privacy budget, so that additional queries are halted when personal privacy may be compromised.
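As a rough illustration of the first mechanism, the sketch below adds Laplace noise to a simple count query. The noisy_count function, the epsilon values and the dataset are hypothetical, and are not part of the OpenDP platform’s actual API.

```python
import numpy as np

def noisy_count(records, epsilon):
    """Count the records, then add Laplace noise scaled to the query.

    A count changes by at most 1 when any single person's record is
    added or removed, so noise drawn with scale 1/epsilon is enough to
    mask that individual's contribution.
    """
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return len(records) + noise

# One individual's presence barely shifts the noisy answer, while the
# aggregate stays accurate enough to be useful to analysts.
population = list(range(100_000))
print(noisy_count(population, epsilon=0.5))        # close to 100000
print(noisy_count(population[:-1], epsilon=0.5))   # close to 99999
```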

Harness and share massive quantities of data

It further said, “Through these mechanisms, differential privacy protects personally identifiable information by preventing it from appearing in data analysis altogether. It further masks the contribution of an individual, essentially rendering it impossible to infer any information specific to any particular person, including whether the dataset utilized that individual’s information at all. As a result, outputs from data computations, including analytics and machine learning, do not reveal private information from the underlying data, which opens the door for researchers to harness and share massive quantities of data in a manner and scale never seen before.”

As the platform is open source, experts can directly validate the implementation, and researchers and others working in the same area can collaborate on projects and co-develop simultaneously. This helps users iterate more rapidly to mature the technology.

Gary King, Weatherhead University Professor and Director of the Institute for Quantitative Social Science at Harvard University, said, “Our partnership with Microsoft – in developing open source software and in spanning the industry-academia divide – has been tremendously productive. The software for differential privacy we are developing together will enable governments, private companies and other organizations to safely share data with academics seeking to create public good, protect individual privacy and ensure statistical validity.”

For those unfamiliar with how differential privacy works, it essentially involves adding statistical noise to results drawn from datasets in order to mask individual contributions and protect the privacy of individuals, while keeping the useful information extracted from the data accurate. When further querying could bring personal privacy close to being compromised, additional queries against the data are halted.
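A minimal sketch of that second safeguard, assuming a fixed overall budget and a simple per-query epsilon cost (the PrivacyBudget class and the numbers are illustrative, not taken from the platform):

```python
class PrivacyBudget:
    """Toy accountant: deduct each query's epsilon and refuse queries
    once the overall budget is spent."""

    def __init__(self, total_epsilon):
        self.remaining = total_epsilon

    def charge(self, epsilon):
        if epsilon > self.remaining:
            raise RuntimeError("Privacy budget exhausted; query halted.")
        self.remaining -= epsilon

budget = PrivacyBudget(total_epsilon=1.0)
for i in range(5):
    try:
        budget.charge(0.3)   # each query is assumed to cost epsilon = 0.3
        print(f"Query {i + 1} answered; {budget.remaining:.1f} budget left")
    except RuntimeError as err:
        print(err)
        break
```

In this toy run the first three queries are answered and the fourth is refused, mirroring the idea that querying stops before personal privacy can be compromised.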

The open source nature of the platform means that not only can the implementation be validated, but researchers can also collaborate to help improve the technology in use. Microsoft believes that the insights achieved as a result will lead to “an enormous and lasting impact”, and will help in the development of solutions to a variety of problems.
