FTC: Analyzing big data creates discrimination risk

Big data analytics offers significant business benefits, but it can also discriminate against certain individuals and violate consumer data protection rules.

Big data presents irresistible business opportunities for companies, but it also creates potential risks, such as inadvertent discrimination against groups and individuals. Consider Hurricane Sandy in 2012: The storm prompted more than 20 million Twitter messages between Oct. 27 and Nov. 1 of that year, creating a wealth of data about the storm and the people it affected. Big data analytics could have been employed to help determine where to send rescue teams, cleanup crews and other services. Doing so, however, likely would have caused confusion and delay, because few of the tweets came from the hardest-hit areas: As the disaster progressed and power outages dragged on, people in the most severely affected places sent fewer and fewer Twitter messages.

The massive data set created by Hurricane Sandy-related tweets thus painted an increasingly inaccurate picture of the disaster's impact and the places in greatest need of help, according to a recent report from the Federal Trade Commission (FTC). The federal government is now warning that companies that compile, sell or use big data consisting of consumer information must make sure their practices and policies comply with laws designed to protect groups from discrimination or exclusion. In the report, titled "Big Data: A Tool for Inclusion or Exclusion?", the FTC lays out questions businesses should ask to ensure that analyzing big data does not violate consumer data protection or equal opportunity laws.

This FAQ is part of SearchCompliance's IT Compliance FAQ series.

How could analyzing big data result in discrimination or exclusionary practices that violate federal law?

Analyzing big data comes with unique risks, including the possibility of incorrect predictions. If a company's policies or practices mistakenly rule out particular consumers when determining eligibility for credit, employment or educational opportunities, they could violate federal laws designed to protect certain groups from discrimination.

The Federal Trade Commission cautions that when big data analytics informs decisions about one consumer based on the actions of other consumers with shared characteristics, companies could mistakenly deny opportunities to people. The potential harm is that existing disparities among groups are reinforced and new disparities are created. Traditional rating models determine risk based on credit-related variables, but big data analytics can produce determinations that have little to do with actual credit risk. For example, if a company uses predictive analytics to determine a consumer's creditworthiness, the model may weigh characteristics such as ZIP code or social media usage that serve as proxies for protected attributes rather than indicators of credit risk.
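To make the proxy problem concrete, here is a minimal sketch of the kind of outcome audit an analytics team might run on a credit model. It is not drawn from the FTC report: the applicant records and group labels are invented, and the 80% threshold borrows the EEOC's informal "four-fifths rule" for flagging possible disparate impact.

```python
from collections import defaultdict

# Toy applicant records: (zip_code, protected_group, approved_by_model).
# The model never sees the group directly, but ZIP code carries it in.
applicants = [
    ("10001", "A", True), ("10001", "A", True),
    ("10001", "A", False), ("10001", "A", True),
    ("10002", "B", False), ("10002", "B", True),
    ("10002", "B", False), ("10002", "B", False),
]

totals, approvals = defaultdict(int), defaultdict(int)
for _zip, group, approved in applicants:
    totals[group] += 1
    approvals[group] += approved  # bool counts as 0 or 1

rates = {g: approvals[g] / totals[g] for g in totals}

# Four-fifths rule: a group whose selection rate falls below 80% of the
# highest group's rate may warrant a disparate-impact review.
best = max(rates.values())
for group, rate in sorted(rates.items()):
    ratio = rate / best
    flag = "possible disparate impact" if ratio < 0.8 else "ok"
    print(f"group {group}: approval rate {rate:.2f}, "
          f"ratio vs. top group {ratio:.2f} -> {flag}")
```

An audit like this only surfaces outcome gaps; deciding whether a flagged gap amounts to unlawful discrimination is a legal question, which is why the FTC pairs its analytics advice with the laws discussed below.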

What laws should businesses bear in mind when employing big data practices?

The Fair Credit Reporting Act, the Federal Trade Commission Act and several federal equal opportunity laws may apply to how a company analyzes big data.

The Fair Credit Reporting Act (FCRA): Consumer reporting agencies that collect and sell reports containing consumer data used to make decisions about credit, employment, insurance, housing or benefits eligibility are subject to the FCRA. These agencies are required to maintain reasonable procedures to make sure their reports are as accurate as possible, and they must give consumers access to their own data and the ability to correct mistakes.

Federal Trade Commission Act (FTCA): Section 5 of the Federal Trade Commission Act presents an area of potential non-compliance for businesses using big data analytics: A big data practice that causes or is likely to cause substantial injury to consumers is considered unfair if the injury is not reasonably avoidable by consumers and is not outweighed by countervailing benefits to consumers or competition. Businesses must also ensure that they keep any promises they make about sharing consumer data and that they do not omit material information about their data practices.

Equal opportunity laws: There are several federal equal opportunity laws that companies should review when employing big data practices. These include the Equal Credit Opportunity Act (ECOA), the Americans with Disabilities Act, the Age Discrimination in Employment Act, the Fair Housing Act, the Genetic Information Nondiscrimination Act and Title VII of the Civil Rights Act of 1964. Under these laws, companies may not discriminate based on race, color, sex, religion, age, disability, national origin, marital status or genetic information.

Under the ECOA, a credit company may not treat a consumer differently based on these "protected characteristics." Also, creditors that use big data in transactions must comply with ECOA requirements regarding data retention and information requests.

What questions should companies using big data consider to avoid discriminating against or excluding certain groups?

Companies that employ big data analytics should ask several questions: Do their policies and practices treat people differently based on protected characteristics such as race or religion? Do those policies or practices have an adverse impact on people in a protected category? Are the company's consumer data protection promises being honored? Are consumers given material information about those practices? Is consumer data reasonably secured?

Big data related to consumers should be secured in a manner commensurate with its sensitivity and volume, the size and complexity of the company handling it, and the cost of securing the information. If a company keeps medical data or Social Security numbers on individuals, for example, it should have considerably more robust security measures in place than a company that keeps only names.
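As a rough illustration of tiering protections by sensitivity, the sketch below pseudonymizes high-sensitivity fields before a record enters wider analytic use. It is not from the FTC report: the field names, sensitivity labels and keyed-hash approach are assumptions for this example, and a production system would add access controls, encryption at rest and retention limits.

```python
import hashlib
import hmac

# Hypothetical sensitivity tiers for consumer record fields.
FIELD_SENSITIVITY = {
    "name": "low",
    "zip_code": "low",
    "ssn": "high",
    "medical_condition": "high",
}

# In practice this key would live in a secrets manager, not in code.
SECRET_KEY = b"replace-with-managed-secret"

def pseudonymize(value: str) -> str:
    # Keyed hash (HMAC) rather than a plain hash: values such as Social
    # Security numbers have low entropy, so an unkeyed hash could be
    # reversed by brute force.
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def protect(record: dict) -> dict:
    """Return a copy of the record with high-sensitivity fields
    pseudonymized before wider analytic use."""
    return {
        field: pseudonymize(value)
        if FIELD_SENSITIVITY.get(field) == "high" else value
        for field, value in record.items()
    }

print(protect({"name": "Pat Doe", "zip_code": "10001",
               "ssn": "123-45-6789", "medical_condition": "asthma"}))
```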

The FTC advises companies to make sure data sets are representative before applying analytics. Data about certain populations -- such as people who aren't tech-savvy, people who aren't deeply involved in the formal economy and people who don't like to share information -- may be missing. Companies should be wary of data models that hide biases, and they should not mistake statistical correlations for causation, according to the FTC. Finally, the FTC suggests that businesses should review the factors that go into their analytics model and weigh the model's predictive value against its fairness to consumers.
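The representativeness check lends itself to a simple illustration. The sketch below is invented for this FAQ rather than taken from the FTC report: the group names, benchmark shares and the one-third underrepresentation tolerance are all assumptions, and real benchmarks would come from census or customer-base figures.

```python
# Benchmark share of each group in the target population vs. how often
# the group actually appears in the collected data set.
population_share = {"group_A": 0.60, "group_B": 0.30, "group_C": 0.10}
sample_counts    = {"group_A": 720,  "group_B": 250,  "group_C": 30}

total = sum(sample_counts.values())
for group, bench in population_share.items():
    observed = sample_counts[group] / total
    # Flag groups underrepresented by more than a third relative to the
    # benchmark; the tolerance is arbitrary and set per use case.
    if observed < bench * (2 / 3):
        print(f"{group}: {observed:.1%} observed vs. {bench:.0%} benchmark "
              "-> underrepresented; conclusions may not generalize")
    else:
        print(f"{group}: {observed:.1%} observed vs. {bench:.0%} benchmark -> ok")
```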

What questions should companies that sell, buy or obtain big data ask to ensure compliance with federal laws?

Companies that compile big data for others to use in deciding whether a consumer is eligible for credit, employment, insurance, housing, government benefits and the like need to ask whether they comply with FCRA's privacy and accuracy provisions. They must have procedures in place to make sure their data is as accurate as possible, allow consumers access to their own data and let them correct errors. Additionally, they must ensure that the big data they sell is not being used for fraud or discriminatory purposes.

Companies that obtain big data from other companies to decide consumer eligibility must also make sure they comply with FCRA provisions. If a company uses big data for employment decisions, for example, it must certify that it has a permissible purpose to receive the data, that the data won't be used in violation of equal opportunity laws and that all relevant notices are provided to consumers.

Next Steps

Check out previous SearchCompliance FAQs to learn how companies' internal controls have become essential to remain FCPA compliant, and why Honda revamped its compliance reporting processes after IT failures led to TREAD Act violations.

This was last published in February 2016
