Policy changes that accommodate shifts in existing users' preferences are gaining momentum and are expected to continue at a rapid pace.
TL;DR → Digital platforms often change their features and policies to accommodate shifts in the preferences (privacy preferences, for instance) of their users. This post documents how changing preferences shape the goals and policies of platforms by focusing on one type of platform: social media.
Table of Contents
Page 1: Change is Right in Front of You (thanks, digital technologies)
Page 2: Change in Preference about Data Collection → Change in Data Collection?
Page 3: Change in consumer preferences → Change in monetization?
Page 4: Consumers do not always WANT what they ask for
Page 5: What to expect in the next few years?
[Images: Cardale Jones' tweets from 2012 and 2015. Photo courtesy: Twitter profile of Cardale Jones]
Page 1: Change is Right in Front of You (thanks, digital technologies)
“The measure of intelligence is the ability to change” -Albert Einstein
Cardale Jones’ 140-character rant about his studies is a textbook example of bad social media practice. At the time, Jones was suspended for one game and apologized on television. On Twitter itself, though, he apologized only three years later, while preparing to enter the NFL Draft.
Perhaps the greatest transformation we’ve seen in social media over the last decade is its evolution from a quaint place to interact with your close friends into a cosmopolitan planet of its own, where you could conceivably interact with anyone, anywhere on Earth. In 2010, when social media was still shiny and new, users were all about efficiency and convenience. Pew Research Center highlighted exactly this at the time: users valued how quickly they could message their friends, how well a search engine predicted the content they wanted to see, and how instantly information was delivered. Convenience was king, and everything else took a back seat.
At the time, consumers were unaware that tech-savvy engineers and scientists could assemble a digital persona from their seemingly scattered online activities. They believed that accessing their “sensitive” information without consent was simply not possible. A 2010 survey by Pew Research Center found that only 40% believed the internet had access to their employer’s name and a photo of them, and the figures for other categories fell far below even that.
Fast forward to today, and it seems clear that most users never imagined their posts, their public spats, and their reckless comments coming back to haunt them within the same decade (just as Kevin Hart discovered in 2018).
Page 2: Change in Preference about Data Collection → Change in Data Collection?
"Slowness to change usually means fear of the new." — Philip Crosby
In reality, data mining has existed for as long as these social media sites have, and it has only grown stronger with the advent of smartphones. As most of us already know, apps and websites gather information such as our browsing history, financial details, and shopping preferences through web cookies, GPS locations, and device identifiers (read more in this blog).
The original data collection policies were designed with consumer convenience in mind. Few consumers wanted to opt out, and even those who did found it difficult (read: nearly impossible). Add to this a rich dimension of location data made possible by GPS chips in smartphones. This flood of “cheap” personal data fueled an explosion in the prevalence of data brokers. Consumers generally agreed that the benefits of sharing their data far outweighed the costs. But things have changed since the Cambridge Analytica scandal.
Consumers are now aware of intrusive data tracking, with 90% of people saying it is important to control who gets their information. Many harbor a deep-rooted distrust of the corporations they interact with: 76% of adults do not believe that corporations will keep their information confidential (survey results here). As a consequence, digital platforms are increasingly taking on the role of designing policy changes to satisfy consumers’ desires. For example, Apple (with the launch of iOS 6 in 2012) and Google (with the launch of Android 6.0 in 2015) moved app privacy permissions from “download time” to “runtime,” meaning that consumers choose which permissions to grant an app while using it instead of granting all permissions at download.
Policy changes driven by this shift in preference from convenience to control, without a corresponding change in monetization opportunities for platform complementors, may have unintended consequences (as explored in one of my working papers here).
Page 3: Change in consumer preferences → Change in monetization?
“People don't resist change. They resist being changed!” - Peter Senge
This change in consumers’ preferences raises an important issue for suppliers of digital products and services: how can they make money? Platforms, having imposed new restrictions on data collection, are aggressively seeking alternative monetization methods for the suppliers on their platforms. Since consumers are used to getting products and services for free, one might wonder whether platforms will struggle to monetize. As it turns out, plenty of research now shows that consumers may be more willing to pay for services than previously thought.
For instance, 44% of people globally pay for entertainment, and a whopping 70% are willing to do so in the future. Another study measured how much participants would pay for apps that are currently free: 89% of participants were willing to pay for WhatsApp, at an average of $2.38/month, and 72% for YouTube, at an average of $4.20/month. Adopting a subscription model would see apps’ subscription revenue rise by over 100% (with extreme outliers of 10,771% for Reddit and 1,928% for YouTube). A clever study by researchers at MIT measured this by asking how much consumers would need to be compensated to stay off a platform (read more about their research here). All of this suggests that paid options are a legitimate alternative to digital advertising.
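As a rough back-of-the-envelope sketch (using only the willingness-to-pay figures cited above, and assuming, hypothetically, that exactly the willing share of users would subscribe at the average stated price), the implied subscription revenue per user works out as:

```python
# Back-of-the-envelope: implied subscription revenue per user per month,
# assuming (hypothetically) that only the willing share subscribes,
# each at the average stated price.
def revenue_per_user(willing_share: float, avg_price: float) -> float:
    return willing_share * avg_price

# Figures cited above: WhatsApp, 89% willing at $2.38/mo; YouTube, 72% at $4.20/mo.
whatsapp = revenue_per_user(0.89, 2.38)  # ≈ $2.12 per user per month
youtube = revenue_per_user(0.72, 4.20)   # ≈ $3.02 per user per month
print(f"WhatsApp: ${whatsapp:.2f}/user/month")
print(f"YouTube:  ${youtube:.2f}/user/month")
```

This is only an illustration; actual subscription uptake would depend on pricing, churn, and how stated willingness to pay translates into real behavior.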
Consumer support for paid apps is not just hypothetical: research on existing monetization models upholds the idea. Companies in the video game industry have successfully moved away from one-time disc purchases toward subscription models, expansion packs, and downloadable content, generating recurring payments from loyal customers. Someone who plays 20+ hours a week spends an average of $112.80 on a game over 3 months (link here). Additionally, a study from Forbes found that users no longer spend much time deciding whether to pay for an app if the cost is medium or lower (where low is $6.99 or less, medium is $7-$20, and high is $20-$50).
The transformation appears to be a logical outcome: free products and services attracted users who benefited from them; the influx of consumers gave these digital offerings credibility; and now that they have become legitimate, consumers are ready to pay for them, especially once they realize that paying with their data can be costly in the long run.
Page 4: Consumers do not always WANT what they ask for
“If you try to fail, and succeed, which have you done?” ― George Carlin
Platforms are expected to play a balancing game: how much to listen to their demand side before criticisms start flowing from the supply side. For example, consumers seem torn between their desire for a personalized online experience and their need for increased privacy. For this reason, making changes to platform policy becomes an incredibly arduous process.
Consumers’ desire for a more personalized online experience directly contradicts their desire for increased privacy. According to AdWeek, 87% of global consumers believe it is crucial to restrict the flow of their information, yet 48% want personalized experiences that are only possible through extensive data tracking. Consumers enjoy the personalized recommendations that retail marketers deliver by using AI to track everything from on-site behavior to location and demographics, but the same consumers find it creepy that AI makes preference-based decisions. For this reason, Forbes points out that consumers can be hypocritical: they dislike companies for invading their privacy but are frustrated when their online experience is impersonal. It is easy to see why consumers are conflicted: personalization used to be less invasive, so they want control over today’s more invasive practices. It is also possible that their demand for control is driven by the belief that “every click and scroll” will be used to price-discriminate.
Moreover, some of these policy changes may have negative consequences for the very consumers who demanded them. A recent decision by Meta to fully integrate message encryption by 2023 has many concerned about consumers’ safety online. The National Society for the Prevention of Cruelty to Children (NSPCC) explains that direct messaging on social media is the number one avenue for the online sexual abuse of children, and encryption prevents both the platform and law enforcement from seeing messages.
This conflict is further complicated by the privacy paradox: consumers say they value privacy, but their actions often indicate otherwise. This creates a challenging situation for platforms. They must balance, on the one hand, consumers’ “stated” needs along with their safety and security online, and on the other, firms’ ability to continue data-driven innovation while ensuring that sensitive data is not misused.
Page 5: What to expect in the next few years?
Many platforms will continue to actively integrate new policies to cater to their existing audiences. Facebook founder and CEO Mark Zuckerberg has said he would consider a version of the site in which users could pay to remove ads. While Facebook has only proposed this idea, Google has put it into action with its “Google Contributor” program, where users pay Google to remove ads on websites. This not only counters the ad blockers that harm Google’s revenue but also satisfies users who prefer an ad-free experience.
However, not all platform policy changes go smoothly. Twitter’s initial attempt to democratize the “blue tick mark,” for instance, was a disaster. Similarly, when WhatsApp announced a new policy stating that users could no longer opt out of sharing their data with Facebook, WhatsApp’s parent company, many lost trust in WhatsApp and left the app. Eyebrows were also raised when Instagram recently changed its daily time-limit reminder to a minimum of 30 minutes, taking away user choice.
The outcomes of these policy changes highlight the need to measure their impact thoroughly in order to find approaches that best benefit both the consumer and the company. Without recognizing the impact of policy changes, platforms will remain stuck wrestling with the contradictory nature of user preferences, and consumers will not hesitate to abandon sites in favor of those they can trust to accommodate them.
Conclusion
Jones’ tweet might really have been better suited to a text message in a group chat of friends than to the entire Twitterverse – but in the early days of social media, there wasn’t supposed to be much of a distinction between the two. If you might meet your friends, your family, and your future employer all over the course of one digital day, though, how might you change your behavior? How might your preferences change? Such shifts in consumer behavior and preferences are already underway across most digital platforms. Consequently, platform policies are changing too.
I am deeply grateful for the research and writing contributions from Amanda Mooney and Chitra Marti for this post.