To avoid big data privacy issues, user empowerment is a must

The use of big data analytics continues to grow -- and so does the list of consumer privacy risks associated with it. At a recent forum, privacy experts called for equipping users with greater understanding and control over their data.

Aside from individual laws passed by some state legislatures, the collection and analysis of consumer data in the U.S. is largely unregulated. In an age when the potential for exploiting big data has given rise to a host of consumer-related big data privacy issues, experts are urging lawmakers and the technology sector to work together to address this glaring lack of oversight and transparency -- or risk negative consequences.

"Invariably, abuse will happen; invariably, people will find out about it," Carol Rose, executive director at the Massachusetts branch of the American Civil Liberties Union (ACLU), said at a recent consumer data privacy forum at MIT. "The loss of consumer trust, and the damage to the high-tech community, and the potential promise of all these innovations could really be lost if we go down that road."

This warning was the theme at the forum, which was hosted by the Massachusetts Attorney General's Office in partnership with MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Internet Policy Research Initiative, and the Berkman Center for Internet and Society at Harvard University. The panel of experts spotlighted big data privacy issues and the dangers of data analytics, as well as the importance of collaboration when creating consumer privacy policies and standards.

The 'privacy paradox'

Developing such policies and regulations, however, is fraught with challenges. One major hurdle is what Catherine Tucker, Sloan distinguished professor of management at MIT, called the "privacy paradox": the disconnect between how much data consumers share through their mobile apps or online and how much they profess to care about big data privacy issues.

What Tucker and Ilaria Liccardi, a research scientist at CSAIL, found in their respective research is that this inconsistency depends on various factors. Tucker, who studies electronic privacy, described one aspect of a bitcoin experiment a team of MIT researchers conducted two years ago, in which every MIT undergraduate was given $100 in bitcoin to see whether any of them would use the technology.

As part of the experiment, Tucker and her team asked the students who their friends were, because they were interested in learning with whom the students would exchange bitcoin.

"Friendship data is one of the most privacy-intrusive things you can ask for next to a social security number; people are very reluctant to give it out," Tucker said. Her team received a lot of false data, such as fake email addresses.

But when the team took the experiment one step further by offering half of the undergraduates a large cheese pizza and then requesting the friendship information, the results were drastically different.

"That half ended up giving up real emails straight away, no fakery, no insults -- even those who really cared about privacy," Tucker said. 

Liccardi's research at CSAIL added another layer to the privacy paradox. She found that the more information participants were given about an app's permissions -- even non-explanatory information, such as a notice that the app might access their data for an unspecified reason -- the less willing they were to share. When no information was provided at all, they shared more.

There was also a tradeoff: users were more willing to share with an app they already trusted, often because it already held much of their data.

"They ask, 'Do we actually care if the app has previous information?" [For example], people tend to share with Facebook, because Facebook already has a lot of information about us. So they say, 'Yes, I'm willing to give information; I trust them,'" Liccardi said.

The privacy paradox could prove problematic down the road, particularly for certain types of data, Tucker said. She offered the example of data indicating that someone is at high risk of developing specific diseases; that information could make finding employment or health insurance difficult.

"This kind of data is very dangerous because of the discrimination and potential consequences," Tucker said.

Despite these challenges, Liccardi said, it's crucial that researchers continue to study user behavior.

"What we want to do is understand these kinds of tradeoffs so we can actually empower users to give them control," she said.

An environment of trust

Facebook prides itself on providing this type of empowerment, which it fosters by creating an "environment of trust" with its open privacy policy, said Dipayan Ghosh, privacy and public policy advisor for the technology firm.

"The best way [to gain consumer trust] is to give users absolute transparency and control over the type of data we collect, the type of the data that [they] share with us, and ... in how it's shared and how it's used; so we have an open platform for users to be able to tweak their policies," Ghosh explained.

Facebook continues to develop these standards and policies by working with academics from MIT and other institutions to make its emerging technology products efficient and secure for users, Ghosh said. The company also has an internal privacy team that collaborates with product, engineering, marketing and advertising teams to understand the privacy implications of these products. Moreover, it works with the legal and policy departments to address consumer privacy and prevent consumer harm.

Even in fields like healthcare, where there are greater restrictions on the use of consumer data, organizations grapple with fostering trust. Take the Health Insurance Portability and Accountability Act (HIPAA), for example. The law requires healthcare organizations to safeguard patient privacy while also making that data accessible to patients so they can communicate with these organizations. However, many organizations use HIPAA as an excuse to withhold data from patients, according to John Moore, CEO at Twine Health.

"The forces that be, the healthcare institutions, have a vested interest in hoarding that data ... to retain patients in their network in order to continue to churn office visits; that is the way the system currently works," he said.

It's dangerous for this reality to persist in a world where, perhaps as soon as five years from now, the vast majority of health data will be produced and transmitted by patients via mobile devices and applications.

In other words, healthcare is happening in the fast-paced real world, not just in a hospital or an office. It would be in patients' best interest not merely to receive a bulk export of their data every year or two, but to access it continuously so they can communicate with healthcare providers in real time, Moore said.

But patients must first trust their providers enough to share this real-time data, and healthcare organizations need to start instilling that sense of trust now.

"As soon as you provide [patients] an experience that helps them solve problems in their health and in their life, they exhibit drastically different behaviors. They will choose to share things above and beyond your expectations," Moore said.

