July 16, 2019
Most of us are familiar with the basic mode of cyberattacks intended to gather our personal information. A database at a bank, credit monitoring company, hospital, social media network or similar organization is hacked, and data such as account information, Social Security numbers, addresses, medical records and passwords ends up in the hands of people we'd rather not have access to it.
But what about when the data is secure, tucked away comfortably from those types of attacks? There are other potential avenues through which malicious actors could obtain our personal information. And a Mizzou Engineering researcher is working to ensure that doesn't happen.
Rohit Chadha, associate professor of electrical engineering and computer science, is working with researchers at the University of Illinois at Chicago and the University of Illinois at Urbana-Champaign under a $1.2 million grant from the National Science Foundation to verify differential privacy mechanisms.
While hacking databases is the main way for interested parties to obtain users' personal information, it's not the only possibility. Intrepid attackers can use perfectly benign means to do so. How? By taking readily available aggregate data, such as census figures, medical statistics on how many people in an area suffer from a specific illness, or consumer trend data, and using it to zero in on specific individuals.
The goal of differential privacy is to make sure that data can be used for research without leaking any individual's specific information. For example, you can find out how many people in a given city earn more than $100,000 per year, but not whether a specific individual is among them.
Without such protection, asking enough valid, benign questions of these databases can eventually triangulate an individual, letting an attacker figure out how much money you make, what medical issues you have and other personal information.
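To make the risk concrete, here is a minimal sketch of a "differencing attack," one way such triangulation can work. The dataset, names and salary figures are invented for illustration and do not come from the research described here:

```python
# Hypothetical toy dataset; all names and salaries are invented.
salaries = {
    "alice": 120_000,
    "bob": 95_000,
    "carol": 130_000,
}

def count_high_earners(records, exclude=None):
    """Aggregate query: how many people earn more than $100,000?"""
    return sum(1 for name, salary in records.items()
               if salary > 100_000 and name != exclude)

# Two individually harmless aggregate queries...
with_alice = count_high_earners(salaries)                       # 2
without_alice = count_high_earners(salaries, exclude="alice")   # 1

# ...whose difference exposes one person's private data.
print("Alice earns over $100k:", with_alice - without_alice == 1)
```

Each query returns only a count, yet comparing the two answers reveals whether one specific person is a high earner.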
“Differential privacy is a technique invented about a decade and a half ago that basically tries to ensure that even when you ask legitimate databases questions about aggregates and other things, the privacy of individual records remains intact,” Chadha said.
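As a rough illustration of the idea, and not the team's actual mechanisms, the classic Laplace mechanism answers a counting query with calibrated random noise, so the two queries from the sketch above no longer reliably reveal any one person's data:

```python
import random

# Same hypothetical dataset as in the previous sketch.
salaries = {"alice": 120_000, "bob": 95_000, "carol": 130_000}

def noisy_count_high_earners(records, exclude=None, epsilon=0.5):
    """Answer the counting query with Laplace noise added.

    Adding or removing one person changes the true count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for this query.
    """
    true_count = sum(1 for name, salary in records.items()
                     if salary > 100_000 and name != exclude)
    # The difference of two Exp(epsilon) draws is a Laplace(0, 1/epsilon) sample.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# The differencing attack now sees mostly noise: the gap between the two
# answers no longer reliably reveals Alice's salary bracket.
print(noisy_count_high_earners(salaries) -
      noisy_count_high_earners(salaries, exclude="alice"))
```

Averaged over many queries the noisy answers remain useful for research, while any single answer tells an attacker almost nothing about one individual. Verifying that real-world mechanisms actually deliver this guarantee is the subject of the grant.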
Researchers have developed methods to prevent this sort of identification. Chadha, along with longtime collaborators Aravinda Prasad Sistla of UIC and Mahesh Viswanathan of UIUC, will spend the duration of the grant verifying the correctness of these methods and improving those that need tweaking.
“You want to still answer those questions because they are important for researchers, for businesses and governments. The point is that you still want to protect individual privacy even when you have these kinds of queries,” Chadha explained.