Sara Quach, Park Thaichon, Kelly D. Martin, Scott Weaven, Robert W. Palmatier.
Abstract
Driven by data proliferation, digital technologies have transformed the marketing landscape. In parallel, significant privacy concerns have shaken consumer-firm relationships, prompting changes in both regulatory interventions and people's own privacy-protective behaviors. With a comprehensive analysis of digital technologies and data strategy informed by structuration theory and privacy literature, the authors consider privacy tensions as the product of firm-consumer interactions, facilitated by digital technologies. This perspective in turn implies distinct consumer, regulatory, and firm responses related to data protection. By consolidating various perspectives, the authors propose three tenets and seven propositions, supported by interview insights from senior managers and consumer informants, that create a foundation for understanding the digital technology implications for firm performance in contexts marked by growing privacy worries and legal ramifications. On the basis of this conceptual framework, they also propose a data strategy typology across two main strategic functions of digital technologies: data monetization and data sharing. The result is four distinct types of firms, which engage in disparate behaviors in the broader ecosystem pertaining to privacy issues. This article also provides directions for research, according to a synthesis of findings from both academic and practical perspectives.
Keywords: Artificial intelligence; Big data; Data monetization; Data sharing; Digital technology; Internet of things; Privacy; Privacy regulation; Social media; Structuration theory
Year: 2022 PMID: 35281634 PMCID: PMC8897618 DOI: 10.1007/s11747-022-00845-y
Source DB: PubMed Journal: J Acad Mark Sci ISSN: 0092-0703
Digital technology tensions and consumer privacy risks
| Technology | Data Sharing | Data Monetization | Information Privacy Risks | Individual Privacy Risks | Communication Privacy Risks |
|---|---|---|---|---|---|
| Social media | • Social media rely on user-generated content, and consumers voluntarily share substantial personal information and other useful insights through these technology platforms. Data collected from social media might be shared with partners, such as members of the business network, for better market insights and data-based innovation. | • Data might be sold to third parties, such as advertisers. | • Being unable to control the flow of information. • Third parties’ access to profile information and user-generated content from well-developed application programming interfaces. | • Organizations might be able to reach consumers through location disclosures, such as tagging a venue in their posts on social media. | • Risk of exposing information of close ties; firms might intercept and exploit the exchange between two connected contacts. |
| Location-based technologies | • Data and location insights might be shared with partners, such as members of the business network, for better market insights and data-based innovation. | • Data might be sold to third parties, such as advertisers. | • Confidentiality of accumulated location data, disclosing both the travel history and real-time position of an individual. | • Organizations are able to pinpoint the exact locations of users and reach them. • Signaling surveillance. | |
| Biometric technologies | • Data might be shared with partners, such as members of the business network, for better market insights and data-based innovation. | • Biometric data might be sold to third parties that use them for various purposes, such as product development. | • Lack of control over the use of highly sensitive and immutable information, which can reveal a person’s identity. • Objectification of emotions and manipulation. | • Biometric data are vulnerable to hacking and coveted by cybercriminals, which increases the potential for identity theft, stalking, and disruption to personal lives. | |
| Web tracking technologies | • Data might be shared with partners, such as members of the business network, for better market insights and data-based innovation. | • Information may be readily sold, so external firms can exploit deep knowledge of consumer browsing behavior. | • An extensive profile of customers can be built by tracking their visits to multiple websites, which defies anonymity. • Information might be shared with third parties. • These technologies are often hidden and hard to detect or delete. | • Individuals can be followed by using their digital footprints. | |
| Internet of things | • Access to real-time data through connected devices. | • Data might be sold to third parties. | • Sensitive information may be collected and shared in real time among different IoT-enabled systems and devices. • Lack of control over data access and exchange, especially in machine-to-machine interactions. • Smart devices are very vulnerable to cyberattacks. | • Firms or third parties might reach customers using IoT-enabled devices and systems without being noticed, such as with CCTV cameras that track people using facial recognition technology. | • IoT-enabled devices and systems can capture and transmit communications between users, such as when integrated microphones capture conversations. • IoT devices seize data from not just users but also proximal others. |
| Big data analytics | • Insights and analytics might be shared with partners, such as members of a business network. | • Insights and analytics might be sold to third parties. | • Identifiable information and highly sensitive personal attributes such as sexual orientation, age, and political views may be collected. • Algorithmic profiling and aggregation lead to a comprehensive picture of an individual. • Unauthorized access and lack of control over the accumulated information. | • Risk of stolen identity, violation of personal spaces, and loss of intellectual property. • Being subject to sophisticated manipulation using predictive analytics. • Potential discrimination from customer profiling, which increases individual vulnerability. | • Private communications might be captured from different data points using data mining tools. |
| Cloud computing | • Access to data, applications, and services by multiple users in real time; data storage at reduced technology costs. | • Data and analytics might be sold to third parties. | • High risk of unauthorized access due to virtualization and remote processing and storage, especially during the transmission of data across different platforms. • Data leakage often results in significant data losses. • Risk of information exposure to external groups such as fourth parties. • Cloud service providers are often private firms, raising questions about data access, control, availability, and backup. | • Firms or third parties might be able to track customers using real-time data stored in cloud services. | • Private communications in cloud storage might be intercepted. |
| Artificial intelligence | • Enabling automated sharing of real-time data. | • Insights and analytics might be sold to third parties. | • It has become very easy and inexpensive to identify, profile, and manipulate consumers without their consent. • Enormous amounts of data are required to train AI, often unnoticed by customers. • AI has the ability to predict sensitive data based on seemingly harmless pieces of information. | • Information may be used to produce fake content (e.g., deepfakes) to manipulate customers or reach them instantly. | • Advanced AI agents can interact with users and make sense of the conversations between them. |
| Robots | • Enabling automated sharing of real-time data, some of which might be from physical interactions. | • Data and analytics might be sold to third parties. | • Robots’ autonomy means humans have less control over their data. • Third parties’ management and usage of personal information may change after multiple iterations of data. | • Potential intrusion into physical and emotional space due to physical and personal contact with robots. | • Robots equipped with computer vision and machine learning see and sense the environment; can analyze human characteristics (e.g., age, gender, emotions); and can make sense of humans’ conversations. |
| Augmented and mixed reality | • Access to data through connected realities; visualizations and data storytelling can be shared quickly and seamlessly across groups of users. | • Insights and analytics might be sold to third parties. | • Sensitive, real-time information and private communication can be captured by input devices. • Both output and input devices can communicate wirelessly, resulting in a lack of control over the collected information. | • Physical space might be captured, such as by spatial mapping of information when people engage in mixed or augmented reality, including bystanders. For example, social AR in public spaces likely captures passers-by’s facial and behavioral data without them noticing. • Output data might be exposed to other parties and manipulated to deceive users, such as in clickjacking practices. | • Personal communications can be captured by devices such as cameras and microphones. |
Privacy responses among consumers, regulators, and firms
| | Reactive | Proactive |
|---|---|---|
| Consumer data protection behavior | | |
| Data privacy regulation | | |
| Firm responses | | |
Fig. 1 Integrated framework of privacy structuration
Integrated data strategy framework and case studies
| Company and Sources | Tenets and themes | Data Strategy | Effect on Firm Performance |
|---|---|---|---|
Facebook; Patterson ( | Data monetization; Data sharing; Privacy regulation; Privacy risks; Customer privacy protection behavior; Firm performance | Facebook extensively monetizes user data through extended data wrapping, such that it provides data analytics-based features to its clients (e.g., advertisers), for example, targeted advertising based on users’ activity and measuring ad effectiveness by tracking users’ digital footprints. Facebook shares substantial data with partners such as app developers; for years it allowed third-party apps to access data on Facebook users’ friends, which led to the infamous 2018 scandal in which Cambridge Analytica acquired data on millions of customers to build comprehensive personality profiles without their knowledge. | Data monetization fuels Facebook’s profitability. In 2018, the value of Facebook users’ personal information was equal to $35.2 billion, or 63% of Facebook’s revenues. However, Facebook has come under scrutiny due to its data practices. After the Cambridge Analytica scandal, Facebook was fined US$5 billion by the Federal Trade Commission and £500,000 by the UK’s Information Commissioner’s Office for its role in the scandal. The event sparked heated debates about consumers’ privacy rights, prompting policy makers to increase the stringency of data regulations. The privacy scandal resulted in a decrease in overall trust in the company, falling daily active user counts in Europe, and stagnating growth in the US and Canada. |
Apple ( | Data monetization; Proactive privacy responses; Privacy risks; Firm performance | Apple uses digital technologies to gather and make sense of data for internal monetization purposes, such as optimizing marketing and business performance, developing predictive analytics to improve user experiences, and innovating new products and services. Apple also engages in data wrapping, such as through the Apple Health app, which tracks users’ physical activities and biometrics and creates alerts if health issues arise. Apple shares data with partners such as suppliers and app developers. Apple has adopted privacy-by-design principles and used enormous digital resources to develop privacy innovations, such as Intelligent Tracking Prevention in Safari, Privacy Labels on the App Store, and App Tracking Transparency. | Apple performs exceptionally; its revenues soared by 54% to $89.6 billion in the first quarter of 2021. Even as Apple engages in monetization practices, its privacy initiatives have reduced the perceived risks of using Apple products and positively influenced customer responses. More than two-thirds of Apple customers agree with its privacy policies, and 92.6% of Apple users state they would never switch to an Android device. The App Tracking Transparency privacy innovation encourages advertisers to use Apple’s own Search Ads in the App Store, further strengthening the impact of data monetization on firm performance. This data privacy innovation thus is changing industry norms, shaping new customer privacy behaviors, and reinforcing existing data regulations. |
BMW ( | Data sharing; network effectiveness; firm performance; data privacy regulation; proactive privacy responses | BMW has engaged extensively in data sharing but imposed strict limits on how those data can be used for monetization. To detect and rectify product defects, it is essential for its partners and suppliers to obtain data assigned to a specific vehicle, on a case-by-case basis. BMW has adopted innovative privacy approaches, including pseudonymization to encode personal information, which establish smooth procedures while preventing other parties from tracking customers. In 2020, BMW joined automotive manufacturers, suppliers, dealer associations, and equipment suppliers in a data-sharing alliance to build a cloud-based data exchange platform. | Data sharing enhances the effectiveness of the business network, which improves BMW’s performance. It can proactively monitor product functions, increase value chain efficiency, and enhance customer experiences. Data sharing enables BMW and its suppliers to pinpoint production bottlenecks or parts shortages, which can boost in-network effectiveness and the performance of all firms involved. The new cloud technology is designed with privacy and security in mind, allowing European car manufacturers to maintain control over their own data. This initiative helps them formulate effective responses to potential scenarios, such as the coronavirus lockdown that imposed serious pressures on the supply chain. |
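The pseudonymization technique named in the BMW case study can be illustrated with a minimal sketch: a keyed hash replaces a direct identifier with a stable pseudonym, so partner records remain linkable case by case while parties without the key cannot recover or track the original value. This is a generic illustration of the technique, not BMW's actual scheme; the identifier format and key handling here are assumptions.

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier (e.g., a vehicle ID) with a keyed pseudonym.

    The same identifier always maps to the same pseudonym, so records can
    still be joined across partners, but without the key the mapping
    cannot be reversed or used to track the individual.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical key and vehicle identifier, for illustration only;
# in practice the key stays inside the data controller.
key = b"example-secret-held-by-data-controller"
p1 = pseudonymize("WBA1234567890", key)
p2 = pseudonymize("WBA1234567890", key)

assert p1 == p2          # deterministic: shared records stay linkable
assert "WBA" not in p1   # the raw identifier never leaves the firm
```

A design note: a keyed construction (HMAC) rather than a plain hash matters here, because identifiers such as vehicle IDs have low entropy and an unkeyed hash could be reversed by brute force.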
Research agenda for data strategies
| Theme | Brief Description | Research Questions |
|---|---|---|
| As data become the new currency in the digital era, the firms that can create unique and sought-after data and business intelligence from data-generating technologies wield increasing power. Firms need to maximize value from data by creating balanced, responsible data monetization and data sharing. | • What data valuation models can motivate responsible data monetization and risk minimization? • How do changing work cultures, such as increasing use of home networks, personal and shared computers, and access to a wide range of systems from outside the office, increase the threat of cyberattacks or data breaches? • Can data sharing be improved via data interoperability? | |
| The benefits of data privacy innovations require further investigation. In particular, additional research is needed to identify effective data privacy innovations that might enhance the outcomes of data sharing or data monetization, as well as the challenges to the adoption and implementation of privacy innovations. | • Will firms’ enhanced data privacy practices become the new standard, such that innovating firms need to be even more novel in their privacy practices? • How is the relationship between innovative firms and digital technology providers likely to evolve? • What is the sustainable level of investment in privacy innovation for data harvesters, data informants, data patrons, and data experts? • Do the benefits of privacy innovation justify the costs of adoption and implementation? | |
| According to structuration theory, structure is both the context and product of actor behavior. In other words, firm responses to privacy regulations can shape the regulatory framework, which in turn constrains or promotes subsequent firm actions. The impact of regulations may vary among firms with different characteristics, such as data strategy and firmographic variables. | • What are the risks of firms failing to adhere to reactive and proactive privacy requirements? How do differences in regulations moderate these effects? • To what extent do firm responses to regulatory frameworks change the regulation itself, which in turn might trigger a different set of firm actions? • How does the data strategy type moderate the effect of privacy regulation on firm performance? • How do firmographic variables moderate the effect of privacy regulation (stringency) on firm performance? | |
| Privacy regulations may be increasingly necessary, but also threaten the benefits that firms and consumers receive from data monetizing and sharing. That is, greater limits clearly may be warranted, but more stringent regulations can create unintended obstacles to value co-creation in a firm’s network or broader ecosystem, especially when there may be fragmentation in legislative regimes. | • What is the impact of regulation stringency on network performance? Does this effect vary with industry settings? • How do policy makers balance consumer privacy protection with industry innovation? • Do the effects of privacy regulation on firms that control flows of data, such as data experts and data patrons, spill over onto firms that are reliant on external data, such as data harvesters? • How do interactions of sector-specific regimes influence ecosystem performance? | |
| As digital technologies penetrate consumers’ lives and new technologies become powerful means for firms’ data collection and use, it is important to understand consumers’ attitudes toward privacy in response to firm and regulatory actions. | • What are consumers’ attitudes toward firms implementing privacy innovations and firms using a specific data strategy (i.e., data harvesters, informants, patrons, and experts)? • Do new technologies, such as super AI, threaten consumer information, individual, and communication privacy? • How do consumers respond to regulation stringency? • What can be done to make data trade-offs more acceptable for customers? | |
| In response to emerging threats to privacy, consumers employ various forms of protection, both reactive and proactive. Understanding when these responses are triggered can help firms devise effective strategies. | • What are the outcomes of consumer privacy responses, and how do they translate into financial impacts on firms? • When do consumers activate reactive and proactive responses? • What role do privacy-enhancing technologies play in consumers’ privacy responses? | |