Consumer Data Issues

Background

In recent years, the amount of personal information that is collected, used, shared, and sold has skyrocketed. Virtually all companies now collect some personally identifiable information (PII). Moreover, data-driven companies collect unprecedented amounts of PII. This trend is only expected to accelerate. With the proper safeguards in place, the proliferation of personal information brings the promise of significant innovations that will benefit individuals, groups, and the broader society. But it also brings challenges. These include ensuring that all populations can benefit from the innovations that result from the proliferation of data, preventing the misuse of data, and safeguarding data privacy and security.

Ensuring all populations benefit:

The proliferation of data provides benefits to individuals and society. Individuals are able to enjoy a much more personalized and customized consumer experience. They receive uniquely tailored goods and services. For example, websites and apps provide personalized recommendations of which books and articles to read, movies and television shows to watch, music and podcasts to listen to, clothes to try on, workouts to try out, recipes to cook, and many other possibilities.

At the societal level, the increase in data brings many benefits. For example, data-driven policymaking can lead to better government programs that more effectively address key societal problems. It can also help distribute resources more equitably and efficiently. Analysis of data sets can identify misallocations of resources or discriminatory action. Likewise, companies can analyze data to create products and services that bring enormous consumer benefits.

In particular, the application of artificial intelligence (AI) is leading to new innovative products and services. For instance, AI is powering automated vehicle technology that will eventually lead to fully self-driving cars. This can expand access and reduce isolation for people who do not drive, including some older adults (see also Vehicle Automation and Fully Self-Driving Cars). AI also powers digital assistants like Siri and Alexa. And it is transforming healthcare, financial services, and other sectors of the economy (see also Data Privacy, Security, and Use for more information on healthcare privacy issues).

However, if not deployed intentionally, some populations will not benefit from the positive uses of data. In addition, if certain populations are less willing to share their data with governments and the private sector for research purposes, their perspectives are not likely to be reflected in the design of government programs and private-sector goods and services. This distrust can be addressed by explaining the benefits of data-sharing, the privacy protections that will apply (such as restrictions on sharing and selling data), and how the data will be secured against external hacks.

Another challenge is ensuring that all populations are able to use the latest innovations that come from data access. For example, fully self-driving cars could leave people with low and moderate incomes behind if they are priced beyond their reach and no lower-cost rental options are available. Likewise, people with disabilities who use mobility devices, such as wheelchairs, might not be able to use self-driving cars if universal design principles (features that allow products to be used by people of many ages and abilities) are not incorporated into the design process.

Preventing discriminatory outcomes and data misuse: Another concern is that the algorithms (the sets of instructions used to analyze data, reveal patterns, trends, and associations, and make inferences and predictions) that power AI may be infused with bias, which could lead to discriminatory outcomes. For example, an algorithm that used per-capita income to decide where to build grocery stores could leave low-income communities without fresh food, unintentionally creating food deserts: communities with limited access to affordable and nutritious food. Intentional action can ensure fairness, transparency, and accountability in algorithmic decision-making.
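The grocery-store example can be sketched as a toy site-selection rule. All data, names, and the income threshold below are hypothetical, chosen only to illustrate how a facially neutral rule produces a biased outcome:

```python
# Toy illustration (hypothetical data): a site-selection rule that ranks
# neighborhoods purely by per-capita income will systematically exclude
# low-income communities, even though nothing in the code mentions them.
neighborhoods = [
    {"name": "A", "per_capita_income": 78_000, "population": 12_000},
    {"name": "B", "per_capita_income": 31_000, "population": 45_000},
    {"name": "C", "per_capita_income": 24_000, "population": 38_000},
    {"name": "D", "per_capita_income": 92_000, "population": 8_000},
]

INCOME_THRESHOLD = 50_000  # hypothetical cutoff set by the algorithm's designers

def pick_store_sites(areas):
    """Select sites by income alone -- the source of the discriminatory outcome."""
    return [a["name"] for a in areas if a["per_capita_income"] >= INCOME_THRESHOLD]

print(pick_store_sites(neighborhoods))  # ['A', 'D']
```

Note that the two excluded neighborhoods are the most populous: the rule denies fresh-food access to the majority of residents while serving the two smallest, wealthiest areas.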

Policymakers and the private sector also have a key role to play in preventing the misuse of data. For example, the proliferation of data might allow predatory lenders, such as those offering payday loan products, to better target consumers who are having difficulty making ends meet (see also Alternative Financial Services). The high fees and annual percentage rates of these loans could make consumers worse off. Similarly, increased surveillance, combined with advancements in facial-recognition technology, can create consumer privacy risks.

Moreover, consumers may not be aware that their behavioral data, such as online tracking information, can be combined with other data to create detailed profiles. Those profiles are then sold to companies that want to target individual consumers. This could result in price discrimination. Customers could be charged different prices based on their willingness to pay for a good or service. Some might pay substantially more because their browsing habits indicated a higher willingness to pay.
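The price-discrimination mechanism described above can be sketched in a few lines. The willingness-to-pay score, the base price, and the pricing formula here are all hypothetical stand-ins for what a real profiling system might infer:

```python
# Toy illustration (hypothetical data): how an inferred willingness-to-pay
# score from a behavioral profile could change the price quoted to a consumer.
BASE_PRICE = 100.0

def quoted_price(profile):
    """Scale the base price by a willingness-to-pay score between 0.0 and 1.0.

    In practice the score would be inferred from combined data sources such as
    browsing history, device type, location, and purchase records.
    """
    return round(BASE_PRICE * (1.0 + 0.5 * profile["willingness_to_pay"]), 2)

frugal_shopper = {"willingness_to_pay": 0.1}   # e.g., frequent coupon searches
premium_shopper = {"willingness_to_pay": 0.9}  # e.g., luxury-brand browsing

print(quoted_price(frugal_shopper))   # 105.0
print(quoted_price(premium_shopper))  # 145.0
```

Both shoppers see the same product, but the profiled premium shopper is quoted roughly 38 percent more for it, with no indication that a different price exists.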

Safeguarding data privacy and security: Another challenge is ensuring data privacy and security protections while continuing to allow positive uses of data that spur innovation and bring lasting consumer benefits. Privacy refers to the rights that consumers have (or do not have) to control their PII. It addresses who or what entity can collect, store, share, sell, and analyze a particular consumer’s PII. The strongest privacy protections allow consumers to control their personal information and decide how it may be used. The proliferation of data can erode privacy protections, particularly when free products require users to give up their privacy in exchange for use. This may limit options for people with low and moderate incomes who cannot pay for more expensive products and services that protect privacy. This pay-for-privacy model treats privacy as a commodity that can be bought and sold, rather than as a fundamental right.

Security refers to the need to safeguard data from unauthorized access. The information age has made data more valuable to fraudsters, and the need to protect the security of PII has never been greater.
