By Dr. Ricardo Baeza-Yates, NTENT
In a world where 4 billion users across the globe access the Internet through some type of device, the issue of user privacy, meaning the extent to which our user data is being shared, has become a prominent topic.
As much as people enjoy the ease of ordering products online or the joy of sharing photos and voicing opinions on social media, they also expect a degree of autonomy throughout the process. Unfortunately, users don’t always understand the connection between a personalized Internet experience and the information required to make that happen. A Deloitte survey of 2,000 U.S. consumers revealed that 91% of respondents consented to Terms & Conditions disclosures without actually reading them.
As the web continuously evolves, so do the expectations of its users. People may want intuitive products capable of offering personalized experiences, but that comes at a certain cost to user privacy. In the end, people still get to decide how much of themselves they choose to expose to the Internet. This article helps users explore ways they can maximize their web experience without throwing caution to the wind.
As magical as the Internet feels when it appears to sense your thoughts before you have them, it’s not magic at all. It’s the result of perpetual data collection and analysis happening with every keystroke. If you appreciate seeing ads on the sites you visit tailored to meet your needs, or you light up when your newsfeed shows updates on your topics of interest, then be advised: a certain amount of your data is necessary to make that happen.
As with any other personal relationship, the more digitally intimate you wish to get, the more risk ensues. However, most of the time personalization can happen at the device or provider level to ensure a positive user experience while maintaining an adequate level of security. Understanding the risk-to-reward ratio can help you decide how you wish to proceed.
In the following four scenarios, we correlate user experience to the extent it endangers privacy, and identify whether privacy guidelines are set at the provider and/or device level. The term “provider” refers to any Internet or mobile service provider where software runs in a data center. “Device” refers to software on the device itself, where the level of privacy is determined through a combination of user settings and the device Operating System (OS).
When you access the Internet using your smartphone, your overall privacy level depends on the default or custom privacy settings of your phone’s OS, plus the default or custom settings of the app or service you’re using at any given moment. For example, when you open the Facebook app on your Apple iPhone, your privacy level is determined by the iOS settings of the actual handset, the privacy settings configured within the Facebook app, including those you set for your specific profile, and those set up by the company on the back end. Similarly, even a simpler feature such as GPS requires both the OS and the app to be configured to share location data before the app can access it.
For each scenario below, assume the privacy settings at the OS and app level in your device are at the same level or less restrictive than the ones needed to allow the service to track you.
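The interaction described above can be sketched in a few lines: the tracking level a service actually receives is capped by the most restrictive of the OS and app settings. The level names and their ordering below are illustrative assumptions, not any real platform’s API.

```python
# Hypothetical privacy levels, ordered from most to least restrictive.
# These names are illustrative; real platforms expose different controls.
LEVELS = ["none", "context_only", "persona", "full_profile"]

def effective_tracking(os_setting: str, app_setting: str) -> str:
    """Return the most restrictive of the OS and app settings,
    since a service can never see more than either layer allows."""
    return LEVELS[min(LEVELS.index(os_setting), LEVELS.index(app_setting))]

# Even if the app asks for full profiling, a restrictive OS setting wins:
print(effective_tracking("context_only", "full_profile"))  # context_only
```

This is why the scenarios that follow assume your device settings are at least as permissive as the service requires: otherwise the device, not the provider, sets the ceiling.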
1- Zero privacy risk
The DuckDuckGo search engine is an example of zero privacy risk at the provider level. The device settings are irrelevant since DuckDuckGo makes no effort to track your behavior. In exchange for complete privacy, the ranking of search results is not personalized, due to the lack of user profiling.
2- Contextualization
Contextualization defines an experience where user information is customized according to context, such as location, time, or direction. Privacy is set at the provider level and is limited to only the details necessary to determine the current context, offering a low threshold of risk. Many providers do this when a user opts to use a web page or app without logging in. Without a valid login, the page or app cannot build a user model that tracks personal information and historical behavior, partly because the service cannot confirm your identity (though it can try to infer it).
An example of contextualization would be a search for movies or restaurants “nearby” so you don’t end up with results of eateries or theaters way outside your reach. An alternative scenario might occur if you browse the app or website of your favorite retail store without actually logging in to an account. Instead of seeing results based on your previous purchases, style preferences or saved sizes, you might be limited to inventory or sales information specific to your local store.
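A contextual request can be thought of as a query that carries only the current situation, never an identity. The sketch below is a hypothetical illustration of that shape; the field names and the coarsening choices are assumptions, not any provider’s actual API.

```python
from datetime import datetime

def build_contextual_query(term, lat, lon, when=None):
    """Package a search with context (coarse location, time of day)
    but no identity: no user ID, cookie, or history fields."""
    return {
        "q": term,
        "lat": round(lat, 2),  # coarsened to ~1 km, limiting precision
        "lon": round(lon, 2),
        "hour": (when or datetime.now()).hour,  # time of day, not a full timestamp
    }

query = build_contextual_query("restaurants nearby", 41.3874, 2.1686)
# The provider can rank results by place and time, but has nothing
# with which to link this query to a personal profile.
```

The design choice is that everything in the payload describes the moment, not the person, which is what keeps this scenario low-risk.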
3- User personas
Rather than call out a specific user or group of users, companies often build user personas that categorize individuals into a classification system designed to serve a set purpose. These may consist of a range of demographics (age, income, gender, location, etc.) that enables them to address the needs of their target audience by focusing on common issues among members. This benefits users who seek out results or offers suited to their general needs, without having to give up too much direct individual data.
Since your interaction data is used to infer demographic information and assign you a persona, without you directly entering specific information, privacy here is moderately preserved at the provider level. Statistics pertinent to the company’s goal at hand may be collected but nothing distinguishes one person from the next in the same persona, similar to k-anonymity.
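The k-anonymity idea mentioned above can be made concrete: a set of user records is k-anonymous if every combination of demographic attributes is shared by at least k people, so no one is singled out. This is a minimal sketch of that check; the attribute names and sample data are invented for illustration.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of the given demographic attributes
    (quasi-identifiers) is shared by at least k records."""
    combos = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

users = [
    {"age_band": "25-34", "region": "West", "income": "mid"},
    {"age_band": "25-34", "region": "West", "income": "mid"},
    {"age_band": "25-34", "region": "West", "income": "mid"},
    {"age_band": "35-44", "region": "East", "income": "high"},
]

# False: the last user is the only one in their (age_band, region) group,
# so that combination identifies them.
print(is_k_anonymous(users, ["age_band", "region"], k=3))
```

In persona-based targeting, the provider would merge or generalize the small groups (e.g., widen the age band) until the check passes, which is what keeps individuals indistinguishable within their persona.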
Persona scenarios provide the minimum amount of information required for businesses to implement meaningful targeted advertising at the persona level.
4- User profiling
User profiling results in a highly personalized experience because access is granted to follow the user at the device level, allowing the collection of data via cookies, browser history, location, or anything else trackable. You’re exposed to results and offers pinpointed not only to meet your needs, but to anticipate them.
For example, if you regularly use your Starbucks app to order coffee, you may notice ads for coffee or Starbucks promotions that you never searched for appearing where and when you least expect them. Though this type of customized attention can be helpful, it leaves a digital footprint with the highest susceptibility to risk. To use your Starbucks app for all that it offers, you must consent to be logged in, with payment details, order history and favorites readily available for Starbucks to retrieve.
The balance between personalization and privacy
While various measures can be taken to limit exposure to private user information, the fact remains that each time someone logs on, downloads an app, pays for something online, uploads a picture, likes a friend’s post, shares a news story or performs any digital activity, a bit of traceable you is left behind. Perhaps in the future privacy settings will be automatically negotiated between the OS and the service, in such a way that privacy becomes a useful trade-off.
Before you hastily accept the next set of Terms & Conditions without reading the fine print, or agree to log on to a new app with your social media profile, practice your own due diligence. You may have to settle for websites cluttered by irrelevant ads or decline a tempting rewards program to keep your privacy safe, but you’ll be able to sleep better at night knowing your personal details remain somewhat personal.
Your Internet experience is your choice. You don’t have to completely sacrifice privacy for personalization, but the give-to-get factor is still highly significant.