Personalization vs privacy – Where do we draw the line?

Creating a personalized experience for users is a worthwhile but extremely delicate undertaking for news publishers. Users will tolerate the necessary invasion of privacy only if they perceive the experience as a true enhancement, argues guest poster Nikolay Malyarov.


by WAN-IFRA Staff | September 13, 2017

Data, big or small, has been a hot topic in publishing and marketing circles for a number of years now. It is often heralded as the secret to what separates the wheat from the chaff in the race to revenue. But like all instruments of power, data must be handled with tender loving care when it involves human beings.

Recently I started thinking about how data is being used (and abused) by publishers attempting to create more personalized experiences for consumers – the Holy Grail for differentiating brands.

But as much as most people like the idea of having personally relevant information (editorial and advertorial) at their fingertips, when that data actually starts showing up in all the wrong places, many start to feel that Big Brother is watching them.

The slippery slope of our perishing privacy

John Perry Barlow, former lyricist for the Grateful Dead and co-founder of the Electronic Frontier Foundation, said, “Relying on the government to protect your privacy is like asking a peeping Tom to install your window blinds.”

When I first read that, I immediately thought, “There goes another conspiracy theorist,” but Barlow wasn’t far from the truth. Privacy today seems to be as fleeting as a Snapchat photo.

Look at the recent case of smart meters being installed in US homes. Being able to analyse smart meter data has many potential benefits, such as giving consumers insights into their energy use. Not only does this help them understand the optimal times to use energy throughout the day; minimizing use during peak hours also improves grid reliability and can save them money.

But, while this is uncharted territory, smart meters can also reveal private details about what's happening inside the house. The meter data shows when you're at home, away, sleeping or even showering – patterns that would be highly valued by nefarious home invaders.

A lawsuit was filed arguing that smart meter data should fall within the bounds of the Fourth Amendment to the Constitution of the United States, which protects people from unreasonable searches and seizures.

A federal district court ruled against the motion, stating that Americans can't reasonably expect any privacy when it comes to the data collected by these devices. The case is under appeal, but the issue has far-reaching implications.

The rapid growth of the Internet of [hackable] Things (IoT) is making our privacy and security even more susceptible to abuse, at least at this time.

For example, last October vulnerabilities in connected devices around the world were exploited to launch a massive distributed denial of service (DDoS) attack on New Hampshire-based DNS provider Dyn, bringing down some 1,000 websites including Twitter, Spotify, Netflix, Amazon, Tumblr, Reddit and PayPal.

In February 2017, a smart doll called Cayla was banned in Germany by its Federal Network Agency (Bundesnetzagentur) because the doll can record and transmit people's conversations without their knowledge, enabling that data to be used to advertise directly to the child or parents. Does the toy maker actually abuse the power of that data? I don't know, but similar concerns certainly exist with voice-controlled assistants such as Apple's Siri, Microsoft's Cortana, Amazon's Echo and Google Home. Just imagine what hackers could do if they got hold of all that intel. It's not just creepy – it's downright scary.

And let's not forget Samsung and its allegedly hackable smart TVs.

Hitting too close to home

Too many times lately, I’ve been talking to friends about a specific subject/product only to open my Facebook account to see ads for it in my newsfeed. I know, I know, there’s no empirical evidence that proves that the social titan is guilty of unscrupulous surveillance, but with a history full of duplicitous behaviour, one can’t help but wonder…

  • Remember when, back in 2009, Facebook promised that our personal information was private when, in fact, it was sharing it with others – an abuse that resulted in charges from the Federal Trade Commission?
  • Then there was the time that the Electronic Privacy Information Center (EPIC) filed a motion with the FTC accusing Facebook of deceptive trade practices and of violating a 2012 Consent Order when it was caught manipulating the newsfeeds of some 700,000 users.
  • Just last summer, Facebook flip-flopped on whether it was truly generating friend recommendations based purely on the location of people's smartphones. Then it was found mining people's cell phone numbers – an abuse of privacy that ended up with patients of a psychiatrist seeing each other in their "People You May Know" boxes on Facebook.
  • Then to top it all off, in October 2016, Facebook made it impossible for us to hide our profiles from complete strangers. Facebook also agreed late last year to settle a class-action lawsuit over allegations that it scanned private messages between users.

I could keep going on, but I’m not here to bash Facebook (although they do leave themselves wide open to it); search engines and other social sites are also working behind the scenes with our data.

The balance of power between personalization and privacy is a tenuous one; and if the taped-over camera and microphone on Mark Zuckerberg's Mac are any indication, he's not taking any chances with prying digital stalkers either.

The two sides of personalization

It feels like we are living in an era of digital hyper-personalization. According to Accenture’s 2016 Personalization Pulse Check, 70% of consumers are generally comfortable with news sites collecting personal data if the publisher is transparent about how it uses it; 75% were comfortable if they could personally control how it was being used.

What’s interesting is that 68% of respondents said they were highly satisfied with the use of their personal data by Netflix and Hulu because it helps them find shows that they like, despite the fact that there is little transparency or user control of the data.

This apparent inconsistency suggests that consumers draw the line not at data collection itself, but between invasive personalization tactics and helpful ones.

Invasive personalization

We are bombarded with content every day, most of which is a pure waste of bits and bytes. From alternative facts and propaganda to intrusive, bandwidth-hungry advertising, it's no wonder that publishers' attempts at personalization have become a major contributor to the rise in ad-blocker use.

Dwelling on this internet infestation here isn't going to solve the problem, but it's worth pointing out again that it's not users who are to blame for the US$41.4 billion loss in revenues. The blame lies with publishers who continue to subject visitors to an abysmal advertising experience.

But this dead horse is not worth kicking anymore. Instead, let’s talk about those businesses that understand what quality personalization is all about.

Helpful personalization

If one were to stop a stranger on the street and ask who they think offers the best online personalized service, I would bet that Amazon, Netflix and Spotify would be near the top of the list. All have found ways to delight customers through creative use of their personal data, without users really understanding what's happening under the hood.

For Netflix, it all comes down to its highly intelligent recommendation system – a combination of algorithms focused on engaging and retaining the interest of viewers. It collects vast amounts of data describing what each member watches and how (device type, time of day and week, intensity of watching, etc.), where each video was discovered, and even which recommendations were suggested but not viewed.
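Netflix's actual algorithms are proprietary, but the core idea behind this kind of recommender can be sketched as a toy item-based collaborative filter: titles watched by the same members are treated as similar, and a member is then offered unseen titles that resemble what they already watched. Everything below – the members, the titles, the viewing data – is invented purely for illustration:

```python
from collections import defaultdict
from math import sqrt

# Toy watch history: member -> set of titles watched (made-up data).
history = {
    "ana":  {"Stranger Things", "Dark", "Black Mirror"},
    "ben":  {"Stranger Things", "Dark"},
    "cara": {"Black Mirror", "The Crown"},
}

def item_similarity(history):
    """Cosine similarity between titles, based on who co-watched them."""
    watchers = defaultdict(set)
    for member, titles in history.items():
        for title in titles:
            watchers[title].add(member)
    sims = {}
    titles = list(watchers)
    for i, a in enumerate(titles):
        for b in titles[i + 1:]:
            overlap = len(watchers[a] & watchers[b])
            if overlap:
                score = overlap / sqrt(len(watchers[a]) * len(watchers[b]))
                sims[(a, b)] = sims[(b, a)] = score
    return sims

def recommend(member, history, sims, n=2):
    """Rank unseen titles by their similarity to what the member watched."""
    seen = history[member]
    scores = defaultdict(float)
    for watched in seen:
        for (a, b), sim in sims.items():
            if a == watched and b not in seen:
                scores[b] += sim
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:n]]

sims = item_similarity(history)
print(recommend("ben", history, sims))  # → ['Black Mirror']
```

A production system would fold in the richer signals the article mentions – device type, time of day, viewing intensity, ignored recommendations – as features in a far larger model, but the ranking-by-inferred-taste principle is the same.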

Spotify adopted a newsfeed-like approach with Discover Weekly – a frequently updated personalized playlist whose algorithm analyses a person's listening history, combines it with what's new and hot on Spotify, and serves up 30 new songs each week that it thinks the user will like. Last year it took personalization to another level with Release Radar – an algorithmically personalized weekly playlist of newly released songs from artists each user already likes.

This under-the-radar (excuse the pun) data crunching flies in the face of transparency, but the fact is that both companies’ use of personal data is in the best interest of the users. It’s not out to try and sell them something, but rather to increase engagement by improving their viewing and listening experience. That experience is worth the price of admission – i.e. access to their personal viewing/listening preferences.

Customer first – quality always

If there is one thing you should take away from this article, it's that customer-first innovation must be your number one priority, regardless of your industry or target audience. Because, at the end of the day, all that matters is the consumer and their experience with your brand – an experience that must get better and smarter the more they interact with it.

Sign photo by Sashataylor (Own work) [CC BY-SA 3.0], via Wikimedia Commons.

Nikolay Malyarov is Chief Content Officer and General Counsel of PressReader – an all-you-can-read digital content platform with a growing list of over 5,000 newspapers and magazines. Known for his thought leadership and vision, Nikolay offers provocative commentary on future trends in publishing at major international events. He frequently contributes to prominent global publications, exposing what’s between the lines on topics such as audience development and user experience, content distribution and monetisation strategies. He holds a Master of Arts and Juris Doctor degrees from the University of British Columbia. His views are his own and do not necessarily reflect the opinion of WAN-IFRA.
