Differential Privacy Explained

What is Differential Privacy?

At a time when the risks and costs associated with data privacy are on the rise, differential privacy offers a solution.

Differential privacy is a mathematical definition of the privacy loss that individuals incur when their private information is used to create an AI product.

Specifically, differential privacy measures how effective a particular privacy technique — such as inserting random noise into a dataset — is at protecting the privacy of individual data records within that dataset.
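
To make that concrete, here is a minimal sketch of one classic noise-insertion technique, the Laplace mechanism, applied to a counting query. The function names, dataset, and parameters are illustrative only, not drawn from any particular library.

    import numpy as np

    def laplace_count(records, predicate, epsilon):
        """Answer a counting query with epsilon-differential privacy.

        A count changes by at most 1 when any single record is added or
        removed (sensitivity = 1), so Laplace noise with scale 1/epsilon
        is enough to mask each individual's contribution.
        """
        true_count = sum(1 for r in records if predicate(r))
        return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

    # Illustrative query: how many people in the dataset are over 40?
    ages = [23, 45, 31, 67, 52, 29, 41]
    print(laplace_count(ages, lambda age: age > 40, epsilon=0.5))

Smaller values of epsilon mean more noise and a stronger guarantee; this epsilon is precisely the privacy loss that differential privacy quantifies.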

Differential privacy provides a solution to some core machine learning challenges. It can help to:

Protect personally identifiable information from adversarial attacks
Overcome the cold-start problem
Build trust with customers
Comply with privacy regulations

Why Differential Privacy Matters For Businesses

At least half of the companies we talk to say that even though they may have varying degrees of freedom to use the data that they collect, they hesitate to aggregate data across customers.

Either there’s some doubt about whether aggregation is contractually allowed, or they’re afraid that their customers won’t be happy if they find out about it.

To get around these issues, some companies try to anonymize customer data by removing personally identifiable information.

However, de-anonymizing data is easier than you might think. High-profile examples of de-anonymization have included the Netflix Prize fiasco and the Massachusetts Group Insurance Commission disclosures in the mid-1990s. The latter resulted in the medical records of the then-Governor of Massachusetts being identified. More recently, the NYC taxi cab disclosures resulted in the release of detailed location information for 173 million taxi trips.

Differential privacy is an answer to this issue of de-anonymization. To understand how, take a look through the presentation What is Differential Privacy and Why Does it Matter?

Want to go deeper on the business case for differential privacy?

Download our CEO's guide for a concise explanation of:

  • What differential privacy is and how it works
  • How you can use it to help your company improve your machine learning models
  • How differential privacy can help you overcome the cold-start problem

Read >

Differential Privacy at Georgian Partners

Differential privacy has been a focus of our applied research practice within the Georgian Impact team for several years.

We've gone deep into the research to help growing software companies leverage a technology that was previously only used at tech giants.

The output of our work on differential privacy includes:

  • publishing research papers at leading machine learning conferences,
  • working closely with our portfolio companies to research and implement differential privacy in their solutions,
  • building software products with Google.

Boosting Model Performance through Differentially Private Model Aggregation

Presented at the 13th Women in Machine Learning Workshop (WiML 2018), co-located with NeurIPS.

Read more

Applying Differential Privacy to Solve the Cold-Start Problem

Delivering value from machine learning products to new customers can be difficult because those customers have not yet amassed enough data. This is called the cold-start problem.

You can solve this, improve onboarding times, and reduce time-to-value for new customers by using aggregate data and machine learning models from existing customers. But to convince customers to share data or models, you need to guarantee the privacy of their information and gain their trust.
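
As a rough illustration of how such a cross-customer aggregate can be protected, here is a hypothetical Python sketch of differentially private model averaging. It is not the algorithm from our paper, just the standard clip-then-add-noise pattern applied to per-customer weight vectors.

    import numpy as np

    def private_model_average(weight_vectors, epsilon, clip=1.0):
        """Hypothetical sketch: aggregate per-customer model weights with
        an epsilon-differential-privacy guarantee.

        Each customer's weight vector is clipped to an L1 norm of `clip`,
        so swapping out any single customer changes the average by at most
        2 * clip / n. Laplace noise calibrated to that sensitivity then
        hides any individual customer's contribution.
        """
        n = len(weight_vectors)
        clipped = [w * (clip / max(np.abs(w).sum(), clip))
                   for w in weight_vectors]
        avg = np.mean(clipped, axis=0)
        sensitivity = 2.0 * clip / n
        return avg + np.random.laplace(0.0, sensitivity / epsilon,
                                       size=avg.shape)

    # Three customers' (toy) model weights, combined into one shared model.
    models = [np.array([0.2, -0.5, 0.1]), np.array([0.3, -0.4, 0.0]),
              np.array([0.1, -0.6, 0.2])]
    shared_model = private_model_average(models, epsilon=1.0)

The clipping step is what makes the noise calibration possible: without a bound on each contribution, a single customer could shift the aggregate arbitrarily, and no finite amount of noise would hide them.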

We set out to solve this problem together with a few of our portfolio companies, taking the concept of differential privacy from academia to production.

“We’ve been able to go from an onboarding process that took up to six months to one that can be completed almost instantaneously, and with reasonable performance that’s under our typical tolerance rate.”

Andrew Volkov, CTO and Co-founder, Workfusion

Differential Privacy at Workfusion

The Georgian Impact team collaborated with Workfusion, a robotic process automation company, to speed up the onboarding time for new customers.

By adopting Georgian Partners’ differentially private products, Workfusion has been able to transfer key learnings across customers for a large number of processes, while providing provable privacy guarantees. As a result, the company is now able to onboard new customers in just a matter of days.

Watch How We Collaborated with Bluecore on Differential Privacy

Differential Privacy Software


At Georgian Partners, we’re building software to help our portfolio companies accelerate the adoption of key technologies directly tied to applied research in our investment thesis areas. We developed our differential privacy software in collaboration with some of our portfolio companies to solve an important business challenge: cross-customer data aggregation.

When we saw Google’s release of TensorFlow Privacy, we immediately saw the opportunity to collaborate and make our differential privacy library publicly available through TensorFlow, the industry’s leading open-source machine learning framework.

To get started with TensorFlow Privacy, check out the examples and tutorials in the GitHub repository. To learn more about differential privacy, check out the CEO’s Guide to Differential Privacy. Finally, to learn how privacy is integral to building and leveraging customer trust, check out the CEO’s Guide to Trust.
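
As a taste of what training with the library looks like, here is a minimal DP-SGD sketch using TensorFlow Privacy’s DPKerasSGDOptimizer. The model, data, and hyperparameters are placeholders chosen only to make the example self-contained, not a recommendation.

    import numpy as np
    import tensorflow as tf
    from tensorflow_privacy.privacy.optimizers.dp_optimizer_keras import (
        DPKerasSGDOptimizer)

    # Toy stand-in data; in practice this would be your training set.
    x_train = np.random.normal(size=(320, 20)).astype("float32")
    y_train = np.random.randint(0, 2, size=(320,))

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(2),
    ])

    optimizer = DPKerasSGDOptimizer(
        l2_norm_clip=1.0,      # bound each example's gradient contribution
        noise_multiplier=1.1,  # Gaussian noise added to clipped gradients
        num_microbatches=32,   # must evenly divide the batch size
        learning_rate=0.1,
    )

    # DP-SGD clips and noises per-example gradients, so the loss must be
    # reported per example rather than averaged over the batch.
    loss = tf.keras.losses.SparseCategoricalCrossentropy(
        from_logits=True, reduction=tf.keras.losses.Reduction.NONE)

    model.compile(optimizer=optimizer, loss=loss, metrics=["accuracy"])
    model.fit(x_train, y_train, batch_size=32, epochs=1)

The library also includes privacy accounting utilities for translating hyperparameters like the noise multiplier and number of training steps into an overall epsilon, which is how the resulting guarantee is stated.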

Want to learn more?

Listen to this episode of the Georgian Impact Podcast with Applied Research Scientist Chang Liu to hear about the limits of data anonymization, how you can protect your data with differential privacy, and differential privacy’s potential to solve the cold-start problem.

Differential privacy is a technology that’s quickly moving from academia into business. And it’s not just the big companies that are using it. With the…

Read More