6
Mobile Privacy
Growth in the use of “smart” mobile devices has had a dra-
matic effect on the way companies interact with their consum-
ers and employees. Through mobile devices such as smart-
phones, tablets, and smart watches, companies have access
to consumers and employees wherever they go and when-
ever they go there. The majority of that access occurs through
mobile applications (or “mobile apps”), programs developed
specifically for use on mobile devices that offer a service or
provide information to anyone who downloads them. With
mobile devices capable of storing a variety of individuals’
most personal information and of tracking individuals’ pre-
cise location, companies often come into close contact with
personally identifiable information.
Government authorities and privacy advocates have become
concerned about the privacy implications of such access to
personal information. Lacking a single, omnibus law on pri-
vacy in mobile apps, companies are left to contend with a
patchwork of state laws and regulatory best practice recom-
mendations. This chapter synthesizes the legal and regulatory
considerations facing a company either developing its first
mobile app or hoping to bring its existing mobile apps in line
with best practices.
As in many areas of privacy, notice and consent are the cor-
nerstones of compliant mobile app practices. In the context of
mobile apps, these cornerstones must be considered when:
(1) developing privacy policies specific to the mobile app that
describe the collection and use of personal information;
(2) implementing just-in-time disclosures that provide special
notice to consumers at the very point when the apps access
certain information; and
(3) using and sharing personal information with third parties.
The Basics
    Definitions
    Specific Privacy Concerns
    Personal Information and Other Data
Regulatory Framework
    Statutory Requirements and Best Practices
    Agency and Industry Guidance
    Compliance: Privacy by Design
Mobile App Privacy Policies
    Generally
    Policy Terms, Disclosures
    Posting Requirements
    Short Form Notices
Just-In-Time Disclosures and User Consent
    Requirements
    Implementation and Compliance
Sharing PII with Third Parties
    Generally
    Requirements
    “Frictionless” Sharing
    Retention of PII
The Basics
Definitions
Q 6.1 What is mobile privacy?
Mobile privacy refers to the privacy rights of users of mobile
devices, such as smartphones, tablets, and smart watches, as differ-
entiated from the privacy rights of users of websites on computers.
This chapter focuses primarily on the nature of the rights implicated
during the collection and tracking of personally identifiable informa-
tion (PII) through mobile applications, and the use and disclosure of
that information to third parties.
Q 6.2 What are mobile applications?
Mobile applications—commonly referred to as “mobile apps”—are
software programs for mobile device operating systems (such as iOS
or Android),1 which are distributed through mobile app stores (such
as Apple’s App Store or Android’s Google Play). Many companies
develop mobile apps to offer mobile services to their customers or to
promote their brand. Other companies interact with their customers
primarily through mobile apps (for example, Venmo’s mobile payment
service, Uber’s mobile car service, TikTok’s video-sharing social net-
work). Mobile apps can be available for free or for a fee, and they can
be the method through which users access other fee-based services.
PRACTICE TIP
Aside from mobile apps, many companies offer a mobile-specific
version of their websites. The mobile version of the website
should comply with the obligations related to the collection, use,
and sharing of personal information described in chapter 2.
Q 6.3 Who are the relevant players in the
mobile ecosystem?
There are several players in the mobile ecosystem, including:
• mobile app developers, which develop, market, and distrib-
ute the mobile apps—a company that puts out an app is con-
sidered the app developer regardless of whether it wrote the
app’s code itself or engaged outside software developers to
actually write the code for the app;
• app platforms, such as Apple’s iOS and Google’s Android,
which enable app developers to distribute their apps through
app stores;
• mobile carriers, which provide wireless communication
services;
• manufacturers of mobile devices, which design and market
the mobile devices onto which apps may be downloaded;
• mobile ad networks or analytics providers, which serve ads
within apps and may collect and analyze users’ PII across
multiple mobile apps;2 and
• other third parties, such as social media platforms that allow
users to log in to their social media accounts through their
mobile apps and that may share information between the
mobile app and the user’s social media accounts.3
All of these parties are capable of accessing PII provided or created
through a user’s interaction with a mobile device, including through
apps. Regulators have made it clear that the app developers—the
companies that oversee the design, development, marketing, and dis-
tribution of the mobile app—are responsible for providing user pri-
vacy protections such as notice and choice.
The focus of this chapter is on the app developers’ privacy obliga-
tions, including how to design privacy into an app and how to inform
and empower mobile app users to exercise control over the use and
disclosure of their PII.
Specific Privacy Concerns
Q 6.4 What privacy concerns does the use of
mobile apps raise?
Mobile apps raise unique privacy concerns:
• Unlike PCs and laptops, mobile devices are portable and
commonly travel with their owner outside the home, permit-
ting companies to track the owner’s location. Among other
things, location information and travel patterns can be used
to develop a profile of the user’s interests for targeted market-
ing purposes.
• Mobile devices typically have many different sensors, includ-
ing a microphone, camera, and keypad. The data from those
sensors, especially when combined with data stored on the
device such as SMS messages and contacts, can create an
extremely detailed portrait of the owner.
• Mobile devices have small screens, making it more challeng-
ing for mobile applications to effectively notify and inform
users about the PII that they will access and potentially share
with third parties.
Personal Information and Other Data
Q 6.5 Does any PII have a special definition in the
mobile ecosystem?
In the context of mobile apps, PII refers to “individually identifi-
able information about an individual consumer” that a developer col-
lects and stores in an accessible form.4 While the same types of PII
discussed in chapter 1 (see Q 1.2) are relevant to user privacy rights
in the mobile ecosystem, certain forms of PII deserve special note in
the context of mobile apps, namely:
• geolocation information: mobile devices, which are often
traveling with individuals, allow mobile apps to track precise
user movements and locations;
• friends and contacts: mobile apps may have access to users’
contact information, including email addresses and phone
numbers of the users’ families, friends, colleagues, and
acquaintances;
• photographs, video and audio recordings: mobile apps may
have access to the pictures and videos taken by the built-in
cameras or audio recordings captured by the microphone in
mobile devices.
Q 6.6 What types of persistent identifiers are
significant in the mobile ecosystem?
In addition to the more general persistent identifiers relevant
across privacy law (see Q 1.5), the mobile ecosystem includes per-
sistent identifiers deserving special attention, including:
• identifiers provided by app platforms, such as the Apple Iden-
tifier for Advertisers (IFA) and Apple Identifier for Vendors
(IFV). The IFA and IFV behave like online cookies and can be
reset by the user to avoid permanently linking any data with
a specific user. Further, the IFA provides an opt-out mecha-
nism to allow users to avoid behavioral targeting, and the
IFV is deleted when a user deletes all apps from a specific
developer.5 Google’s Android platform also employs a similar
advertising ID with the same functionality as the Apple Iden-
tifier for Advertisers;6
• a unique device identifier (UDID), which is a hardware ID that
is permanently associated with only Apple mobile devices,
such as the iPhone and iPad. Unlike cookies, which can be
cleared to erase a specific user’s actions, a UDID cannot be
deleted.7 In 2013, Apple began rejecting apps that referenced
mobile devices’ UDIDs, causing many app developers and
advertisers to instead rely on the IFA.8 Similarly, Android
employed a unique Android ID, which has more recently given
way to the advertiser ID;9
• a media access control (MAC) address, which is a unique
identifier for identifying any piece of hardware on a net-
work, whether mobile or non-mobile. On mobile devices, this
“hardware address” enables advertisers to track an individ-
ual phone as it moves across network connections. Apple
introduced a feature to randomize the MAC address in the
latest version of its mobile operating system, but this feature
currently functions only when Wi-Fi services are turned off,
which is impractical for most users;10 and
• international mobile equipment identity (IMEI), which is
essentially an electronic serial number. In some countries,
including the United States, IMEIs are used to blacklist devices
that have been identified as stolen.11 This identifier is unique
to mobile phones.
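To make these identifiers concrete, the short Swift sketch below shows how an iOS app reads the Identifier for Vendors and the Identifier for Advertisers. It is an illustrative sketch only, not part of the regulatory guidance, and it assumes a recent iOS version, on which the advertising identifier returns all zeros unless the user grants permission through the system tracking prompt.

    import UIKit
    import AdSupport
    import AppTrackingTransparency

    // Illustrative only: reads the two platform-provided identifiers discussed above.
    func logPlatformIdentifiers() {
        // Identifier for Vendors (IFV): shared across one developer's apps and
        // deleted when the user removes all of that developer's apps.
        let ifv = UIDevice.current.identifierForVendor?.uuidString ?? "unavailable"

        // Identifier for Advertisers (IFA/IDFA): user-resettable; on recent iOS
        // versions it is all zeros unless the user authorizes tracking.
        ATTrackingManager.requestTrackingAuthorization { status in
            let ifa = ASIdentifierManager.shared().advertisingIdentifier.uuidString
            print("IFV: \(ifv); IFA: \(ifa); tracking status: \(status.rawValue)")
        }
    }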
Regulatory Framework
Statutory Requirements and Best Practices
Q 6.7 What is the U.S. legal framework governing
mobile information privacy?
An app developer’s information practices, like those of a website
operator, are subject to various federal and state privacy laws depend-
ing on the type of institution that the app developer is and the catego-
ries of PII that it collects. For example, if the app developer is a bank
that collects financial information through its app, the privacy provi-
sions of GLBA would apply (see chapter 4); if the developer is a health-
care provider that collects individually identifiable health informa-
tion, the privacy provisions of HIPAA apply (see chapter 5). Similarly,
if the app collects information from children under age thirteen, the
app developer must comply with COPPA (see chapter 3). Additionally,
to the extent the contents of communications—as opposed to record
data such as a telephone call’s origination, length and time—are cap-
tured, this may implicate the Wiretap Act.12 Finally—while outside
the scope of this book—federal and state laws address cybersecurity
issues raised by the collection of PII through mobile apps. Most nota-
bly, the California Consumer Privacy Act (CCPA), which took effect
on January 1, 2020, has implications far beyond mobile information
privacy. The law provides California consumers with broad rights to
access, erase, and limit the sale of their personal information, broadly
defined. Covered businesses will have significant new obligations to
disclose their privacy practices, limit the use and sale of personal
data, and respond to consumers seeking to enforce their new rights.13
In addition to laws addressing privacy practices related to specific
types of data, app developers must understand general privacy prac-
tices required by federal and state authorities. Each state also has its
own statute effectively prohibiting “unfair” and “deceptive” practices
occurring in trade, commerce, or consumer transactions. In addition,
the California Online Privacy Protection Act (CalOPPA) applies to
mobile apps as well as websites and requires both to “conspicuously
post” privacy policies and to make specific disclosures.14 Other than
that, little federal or state law directly addresses the data collection
practices of online services such as mobile apps.
Importantly, the FTC has initiated enforcement actions against
app developers for privacy violations pursuant to its authority under
section 5 of the FTC Act, which prohibits companies from engaging in
“unfair or deceptive acts or practices.”15
CASE STUDY: Uber Technologies, Inc.
In November 2017, the Attorney General of Washington State
filed suit against the ride-sharing technology company Uber for
violating the state’s data breach notification law. Uber allegedly
failed to provide timely notice of a data breach that occurred in
November 2016. At the time, the company paid the hackers to
destroy the data and then failed to report the breach until a year
later. The Attorney General claimed that each day Uber failed to provide notice of the breach to each of the 10,888 Washington-based drivers and to the Attorney General’s office constituted a separate violation. Washington’s breach notification law provided penalties of up to $2,000 per violation.
The lawsuit did not cover Uber passengers because Washington’s
breach notification law only requires notification where an indi-
vidual’s name is disclosed in combination with other sensitive
data (e.g., financial account numbers, driver’s license number, or
Social Security number). There was no indication that such data
were exposed with respect to passengers.16
Relatedly, the FTC expanded a prior settlement with Uber based
on the company’s failure to disclose the November 2016 breach.
In August 2017, the FTC had reached a settlement with Uber over the company’s failure to deliver on its data security promises. The FTC alleged
that Uber violated section 5 of the FTC Act by claiming that it
was “closely” monitoring and auditing its employees’ access to
rider and driver data on an ongoing basis, when in reality the
ride-hailing service “rarely” monitored internal access to this
personal data. The August 2017 settlement subjected Uber to
third-party audits for twenty years and required the company to
adopt a rigorous privacy-protection program.17 The FTC revised
its complaint and proposed order in April 2018 after Uber dis-
closed the November 2016 breach.18 The revised order requires
the company to submit to the FTC all of the reports from the
required third-party audits as well as to disclose certain infor-
mation regarding rewards Uber offers to individuals who report
vulnerabilities in its software.19
In addition to the action in Washington and the FTC settlement,
Uber faced actions in several other states as well as federal class
actions arising out of the breach.20
Agency and Industry Guidance
Q 6.7.1 What formal guidance exists for app developers?
The FTC and the Office of the Attorney General of the State of
California (“California AG”) have each issued specific guidance on
mobile privacy, describing the best practices that they encourage and
expect app platform providers and app developers to follow. These
documents currently serve as the best available guidance for mobile
app privacy practices and provide a roadmap for regulators investi-
gating mobile app information practices. The guidance includes:
• FTC 2013 Report. This FTC staff report, entitled “Mobile Pri-
vacy Disclosures: Building Trust Through Transparency,” was
issued in February 2013 and provides recommendations for
best practices on mobile privacy disclosures;21 and
• Privacy on the Go: Recommendations for the Mobile Ecosystem.
This set of privacy practice recommendations (the “California
2013 Guidance”) was published in January 2013 by the
California Department of Justice to assist app developers
and others in considering privacy early in the development
process.22
These reports, discussed in detail throughout this chapter, encour-
age an app developer to:
• make clear disclosures on how the mobile app collects, uses,
and shares PII through a privacy policy specific to and readily
accessible from the mobile app;
• provide “just-in-time” disclosures and obtain users’ affirma-
tive express consent when collecting “sensitive information”
(see Q 1.3) or making “unexpected uses” of PII; and
• minimize PII collection and storage to only that information
necessary to support the mobile app’s basic functionality.
“Just-in-time” disclosures are discussed below at Q 6.15; “unex-
pected uses” of PII are discussed below at Q 6.15.1.
In addition, in February 2018, the FTC released additional guidance
on security updates to mobile devices and applications. While out-
side the scope of this book, companies offering mobile apps or making
mobile devices should review this guidance to ensure that they offer
regular, transparent, and effective security updates to consumers.23
Best practices for mobile privacy also have been issued by sev-
eral self-regulatory initiatives.24 While this guidance does not come
directly from regulators, the FTC 2013 Report approvingly cites the rec-
ommendations made through such initiatives.25 Thus, app developers
should review the recommendations made by these initiatives,
including:
• NTIA Short Form Notice Code of Conduct. This guidance from
the National Telecommunications and Information Admin-
istration (NTIA) describes best practices for providing
notice of data collection to mobile app users via concise,
readable disclosures known as “Short Form Notices”26 (see
QQ 6.14–6.14.2).
• FPF/CDT Best Practices. These app privacy guidelines issued
by the Future of Privacy Forum and the Center for Democ-
racy & Technology provide a set of recommendations for
mobile app developers on data collection, focusing on what
to collect, how to use it, and third parties with which it can be
shared.27
• Lookout Mobile App Advertising Guidelines. This guidance,
issued in 2012 by Lookout, a cybersecurity company, seeks to
establish a set of standards and guidelines for transparency
and clarity in data collection practices, individual control
over the personal data collected, and the secure transfer of
sensitive data.28
The FPF/CDT Best Practices and the Lookout Mobile App Advertising
Guidelines reinforce the guidance offered by the FTC on the collection
and storage of PII, as well as the disclosures that should be made to
consumers. The NTIA Code of Conduct, discussed more fully below,
provides specific guidance on Short Form Notices, which provide the
readable and understandable mobile disclosure encouraged by the
FTC.
Q 6.8 Are there additional privacy obligations on a
company if its mobile app collects payment
information?
App developers may offer a mobile app experience requiring no
exchange of money or may engage in a monetary transaction with the
app’s users. Such a transaction can take the form of a fee charged to
download the mobile app, or a purchase—made after the user has
downloaded the app—of additional app features or of a product or
service. These transactions and payments are typically processed
through the app store platforms themselves.
The FTC, after holding a workshop on mobile payment process-
ing,29 issued a mobile payment staff report in 2013 identifying three
key areas of concern in the use of mobile payments:
(1) dispute resolutions with users regarding possible fraudulent
or unauthorized payments;
(2) data security of payment card information; and
(3) privacy.30
The issues of dispute resolution and data security are outside the
scope of this book, but the same privacy considerations surrounding
mobile apps’ collection and use of PII generally—including transpar-
ency, privacy by design, and user consent—apply to mobile apps’ pro-
cessing of payments.
Q 6.9 What restrictions exist on the ability to
market mobile apps, or a company’s
products and services more generally, via
a mobile device?
As briefly mentioned in chapter 1 (see Q 1.22), the Telephone
Consumer Protection Act of 1991 (TCPA) restricts the ability of com-
panies to leverage automated telemarketing in the mobile ecosystem.31
In particular, the TCPA prohibits companies from using automatic tele-
phone dialing systems (ATDSs) and prerecorded voice technology to
market themselves via telephones absent the express written consent
of consumers. Companies cannot obtain valid express written consent to receive such messages by making that consent a condition of purchasing property, goods, or services, although at least one circuit has deemed consent valid where it was written into a leasing agreement.32 In addition, one court has held that the requirement for
prior express consent is met where the consumer makes an online
offer to sell, accompanied by a phone number and instructions for
interested buyers to contact the number for more information without
any specific call limitations.33 The restriction applies to calls to mobile
phone numbers as well as text messages.34
Notably, companies cannot make any telemarketing calls or send
telemarketing text messages to numbers on the national do-not-call
registry (NDNCR), even absent the use of an ATDS or prerecorded
voice technology.35 The FCC also mandates that companies maintain
an internal list of numbers whose owners have requested not to be
called, which would override any express written consent previously
provided by the consumer.36
*
CASE STUDY: Golan v. FreeEats.com, Inc.37
In July 2019, the U.S. Court of Appeals for the Eighth Circuit
affirmed a district court’s award of $32 million to a plaintiff class
seeking damages under the TCPA. The plaintiffs brought suit after
the defendants undertook a telephone marketing campaign, in
which they made about 3.2 million phone calls, to advertise a
new film. The plaintiffs sought the statutory maximum under the
TCPA of $500 per call ($1.6 billion total). After a jury verdict
in the plaintiffs’ favor, the district court found a statutory dam-
ages award of $1.6 billion violated the Due Process Clause and
reduced the award to $32 million ($10 per call).
Compliance: Privacy by Design
Q 6.10 What steps should a company take, as a
mobile app developer, to ensure that its
apps are compliant with privacy law and
best practices?
Both the FTC and the California AG encourage app developers to
adopt a “privacy-by-design” approach (see Q 1.13), factoring in pri-
vacy as a key consideration from the beginning of the development
process.38 Privacy by design should be addressed in a proactive man-
ner from the beginning of the app development process—not layered
on later.
The FTC 2013 Report and the California 2013 Guidance advise that app
developers also:
• have a privacy policy that is clear and conspicuously accessi-
ble to users; and
• provide “just-in-time” disclosures and obtain affirmative
express consent before collecting and sharing sensitive infor-
mation or using information in ways that may be unexpected
from the user’s perspective.
Q 6.10.1 How should a company implement the FTC’s
recommendation of privacy by design in mobile
app development?
In the specific context of mobile apps, “privacy by design” means
that companies must:
• understand exactly how the mobile app will function;
• identify what PII the mobile app really needs for its basic
functions;
• list any sensitive information that the mobile app may collect
or access;
• identify any third parties that will collect app data or run app
analytics, and understand what information the third party
may collect and how it will be used; and
• test the app prior to release so that its data collection and
usage functionality can be assessed in practice.
With this information, a company can make key decisions about
necessary disclosures and user consent.
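One way to operationalize this checklist is to keep a simple data inventory that records, for each category of information the app touches, whether it is needed for basic functionality, whether it is sensitive, and with whom it is shared. The Swift sketch below is a hypothetical illustration of that idea; the type names, categories, and fields are assumptions, not a taxonomy drawn from the FTC or California guidance.

    // Hypothetical data inventory maintained during development to drive
    // disclosure and consent decisions.
    enum DataCategory: String {
        case preciseLocation, contacts, photos, microphone, deviceIdentifier, usageAnalytics
    }

    struct CollectionPractice {
        let category: DataCategory
        let requiredForBasicFunction: Bool   // unexpected uses are those not required here
        let sensitive: Bool                  // sensitive data always warrants enhanced notice
        let sharedWith: [String]             // third parties, by name or category
    }

    let inventory: [CollectionPractice] = [
        CollectionPractice(category: .preciseLocation, requiredForBasicFunction: true,
                           sensitive: true, sharedWith: []),
        CollectionPractice(category: .usageAnalytics, requiredForBasicFunction: false,
                           sensitive: false, sharedWith: ["analytics provider"]),
    ]

    // Items that are sensitive, not required for basic functionality, or shared
    // with third parties are the ones that call for just-in-time disclosure
    // and affirmative express consent.
    let needsEnhancedNotice = inventory.filter {
        $0.sensitive || !$0.requiredForBasicFunction || !$0.sharedWith.isEmpty
    }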
CASE STUDY: Coronavirus Contact-Tracing Apps
In 2020, with the outbreak of the coronavirus, countries around
the world implemented lockdowns that had a major impact on
economic and social activity. Many countries and businesses
required nonessential workers to work from home. The mounting
economic and social costs of the health crisis have increased the
urgency to find ways to get more people back to work safely.
A significant part of that effort involves the development of con-
tact-tracing apps for mobile phones.
Countries such as China, Israel, Singapore, and South Korea have
all had varying degrees of success in using electronic contact
tracing (along with widespread virus testing and manual contact
tracing conducted by health professionals) to allow a significant
portion of their population to return to work, albeit with limita-
tions.39 At press time, Google and Apple are working on a joint
effort to create a contact-tracing tool.40
Coronavirus contact-tracing apps generally involve the collection
of health and/or location data as well as, in some cases, the shar-
ing or disclosure of COVID-19 (the disease caused by the corona-
virus) testing information to users, government agencies, and/or
healthcare professionals. The apps may also disclose to users that
they have been in contact with infected persons. As such, these
apps raise a host of privacy questions.
Businesses that develop or use contact-tracing apps should care-
fully consider existing legal requirements, such as those discussed
in this chapter as well as in chapter 5 regarding health informa-
tion, chapter 7 regarding workplace privacy, chapter 3 regarding
children’s privacy (if the app is intended to be used by minors),
and chapter 9 regarding the California Consumer Privacy Act (if
the app is used by California residents). In addition, businesses
should consider the following:
Consent: the business should obtain consent from users in connection with the collection, use, and disclosure of their information.
Privacy policies and disclosures: privacy policies and disclo-
sures should clearly describe what information is collected,
how it is used, how it is stored, and how and with whom it is
shared.
Data minimization measures: the business should consider
collecting, using, and storing data for only as long as necessary
for contact-tracing purposes.
Data security measures: the business should determine what security measures (e.g., encryption, centralized vs. decentralized data storage) will be in place to protect the information collected by the app.
Mobile App Privacy Policies
Generally
Q 6.11 If a company already has an online privacy
policy, is it necessary to have a separate
privacy policy for its mobile apps?
A company should have either a separate privacy policy within
the app addressing its mobile information practices, or mobile-
specific provisions in the company’s general online privacy policy with a
cross-reference from the app. Notably, the California AG has pursued
enforcement action where a mobile app lacked an app-specific privacy
policy and there was doubt as to whether the corresponding website’s
privacy policy captured the privacy practices of the app.41
In addition, in February 2012, the leading mobile app platforms and
the California AG issued a “Joint Statement of Principles” emphasiz-
ing the legal requirement under CalOPPA to “conspicuously post” a
privacy policy, and setting forth practices that mobile app platforms
should employ to ensure apps comply with their stated practices.42
This Joint Statement of Principles was cited with approval in the FTC
2013 Report.43 The principles include that, as part of the approval pro-
cess for apps, the mobile app platforms will include an optional field
through which app developers can submit either (1) a hyperlink to
the mobile app’s privacy policy or (2) the actual text of the privacy
policy.44 Many of the leading platform providers have gone further
than simply adding an optional field for privacy policy submission,
instead including in their app distribution agreements a requirement
that developers provide legally adequate privacy disclosures.45
Policy Terms, Disclosures
Q 6.12 What terms should a company include in its
mobile app privacy policy?
An app developer’s mobile app privacy policy should describe
its information practices, just as its online privacy policy does (see
QQ 2.9–2.11). The California 2013 Guidance sets forth specific recommenda-
tions on the types of information that app developers should include
in mobile app privacy policies.46 That guidance, which reflects prevail-
ing best practices, recommends disclosure of at least the following
information in a mobile app privacy policy:
• the types (for example, a username) or categories (for exam-
ple, any unique device identifier) of PII collected by the app;
• the uses and retention period for each type or category of PII;
• whether payment information for in-app purchases is col-
lected by the mobile app or a third party;
• the types of third parties with which the app may share PII;
• the choices that a user has regarding the collection, use,
and sharing of PII, and instructions on how to exercise those
choices;
• the process for a user to review and request corrections to PII
maintained by the app;
• a way for users to contact the app developer; and
• the effective date of the privacy policy and the process for
notifying users of material changes to it.47
In addition, the privacy policy should include any disclosures
required by other applicable privacy laws (for example, COPPA, if the
app is directed to children under age thirteen).
Apps can make a surprising amount of data publicly available, as
highlighted by a high-profile incident involving a fitness app. In the
summer of 2020, as viral videos of various kinds helped fuel cries
for racial justice, one video surfaced that showed a cyclist attacking
Black Lives Matter sympathizers on a bicycle trail. Social media users
quickly started trying to match user profile pictures from a certain
fitness app with a video of the attack released by law enforcement.
Amateur detectives thought, incorrectly, that they had pinpointed
two specific cyclists as the likely attacker. False accusations directed
at the two men by name rapidly spread across Twitter. The two falsely
accused individuals eventually cleared their names, in one case by
persuading the authorities to issue a rare statement that he was not
a suspect. But the exonerations came only after a torrent of harsh
online attacks and threats. Social media metrics showed that the accu-
sations had spread much farther and faster than the exonerations.
The misidentification could be said to have stemmed indirectly
from the privacy settings on a fitness app used by riders of the bicycle
trail. The app allowed users to create profiles with their photographs,
ride histories, and other personal details. The app’s privacy controls
defaulted to “Everyone”—meaning that unless a user affirmatively
chose a more restrictive setting, any member of the public could view
a user’s profile from within the app or as a search engine result.
Based on their public profiles in the app, the Twitter “detectives” accused the two men because they appeared to have ridden the same trail in the same general time period and bore a general resemblance to the attacker in the video circulated by law enforcement.
The app in question (called Strava) was quite forthright about the
fact that a user’s settings defaulted to the least restrictive, most pub-
lic option. It shared this information on its website, in its privacy
policy, and in its FAQs. Users could change the privacy settings from
within the app or when logged in on a web browser. It is a cautionary
tale about how a well-intentioned, well-disclosed effort to create an
online community (in this case, of fitness enthusiasts) can have unin-
tended consequences.48
Posting Requirements
Q 6.13 Where should a company post its mobile
app privacy policy?
CalOPPA requires app developers to “conspicuously post” their
privacy policies, which for the purposes of a mobile app means the
policy must be “reasonably accessible” for users.49 Similarly, the FTC
2013 Report states that app developers should make their privacy
policies “easily accessible” through the app stores’ download page.50
The FTC in particular emphasizes the importance of having the pol-
icy available both before users download and install the mobile app
and in a permanent location to which current users can refer back.51
Additionally, as discussed more fully in chapter 3, COPPA requires
online services directed to children under age thirteen to link to the
privacy policy on the home page of the mobile app itself (see Q 3.25).
In order to comply with these FTC recommendations, app develop-
ers should ensure the privacy policy is accessible in two places. First,
the policy should be posted or linked to on the app stores’ download
pages, to make it available to potential users before they download
the app. Second, the policy should be posted or linked to within the
mobile app itself, for example, on the “home” page of the app or on the
controls/settings page.
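As a minimal illustration of the second point, the Swift sketch below opens a privacy policy from a settings screen using an in-app browser. The URL is a placeholder, and the surrounding settings screen is assumed; this is one possible implementation, not a required approach.

    import UIKit
    import SafariServices

    final class SettingsViewController: UITableViewController {
        // Placeholder URL; a real app would point to its own hosted policy.
        private let privacyPolicyURL = URL(string: "https://example.com/privacy")!

        // Called when the user taps a "Privacy Policy" row on the settings page.
        func showPrivacyPolicy() {
            let viewer = SFSafariViewController(url: privacyPolicyURL)
            present(viewer, animated: true)
        }
    }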
Short Form Notices
Q 6.14 Is a company also required to provide a
“Short Form Notice” of its information
practices?
Short Form Notices are voluntary. However, it is a best practice to
include a Short Form Notice highlighting the key information practices
disclosed in the company’s full policy. In 2012, the U.S. Department of
Commerce, through its National Telecommunications and Information
Administration, convened a multi-stakeholder process on applica-
tion transparency in order to develop some guidance on Short Form
Notices, with the goal of increasing transparency of data collection
and sharing on mobile devices.52 The resulting “NTIA Short Form
Notice Code of Conduct”53 describes best practices for Short Form
Notices.
Q 6.14.1 What format should a Short Form Notice take?
Given the small screen size of mobile devices, a Short Form Notice
should be included either on the home page of the mobile app or on the page where users’ information will be collected—though this is
not a requirement (see Q 2.5). The Short Form Notice should be writ-
ten in plain and concise language.54 The notice may be supplemented
with icons that users may click on for more information—for exam-
ple, a “location pin” or globe icon that provides a short description
of what location data the mobile app collects and how it uses that
data, or a trash can icon that explains how collected information is
deleted.55 The NTIA Short Form Notice Code of Conduct also contains
specific guidance regarding design elements used in the presentation
of short form notices (use of icons, font sizes, delivery methods, etc.).56
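A Short Form Notice along these lines might be rendered as a simple screen combining plain-language statements, supporting icons, and a link to the long-form policy. The SwiftUI sketch below is only a hypothetical layout assuming a recent iOS version; the wording, icons, and URL are illustrative assumptions rather than elements mandated by the NTIA code.

    import SwiftUI

    // Hypothetical Short Form Notice screen: concise statements, supporting
    // icons, and a link to the full ("long-form") privacy policy.
    struct ShortFormNoticeView: View {
        var body: some View {
            VStack(alignment: .leading, spacing: 12) {
                Label("We collect your precise location to find restaurants near you.",
                      systemImage: "mappin.and.ellipse")
                Label("Usage data is shared with an analytics provider.",
                      systemImage: "chart.bar")
                Label("Provided by Example Apps, Inc.",
                      systemImage: "building.2")
                Link("Read the full privacy policy",
                     destination: URL(string: "https://example.com/privacy")!)
            }
            .font(.footnote)   // concise, plain language sized for a small screen
            .padding()
        }
    }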
Q 6.14.2 What terms should be included in a
Short Form Notice?
There is no standard set of terms for Short Form Notices. However,
the NTIA Short Form Notice Code of Conduct today remains the best
source of guidance on these types of disclosures.57 The FTC 2013
Report indicated that the FTC will look favorably upon adherence to
such codes of conduct during enforcement actions and specifically
encouraged app developer trade associations and privacy research-
ers to develop standard short form disclosures.58 The NTIA Short
Form Notice Code of Conduct recommends including in a Short Form
Notice the following disclosures:
• the type of data collected by the app;
• how users can access the app’s “long-form” privacy policy;
• whether any PII is shared with third parties; and
• the identity of the app developer.59
Just-In-Time Disclosures and User Consent
Requirements
Q 6.15 When should a mobile application use
just-in-time disclosures?
A “just-in-time” disclosure, also referred to as an “enhanced” or
“special” notice, is a notice displayed to a user just before that user
takes an action that will cause information about him or her to be
collected.60
In their 2013 guidance, both the FTC and the California AG called for the use of just-in-time disclosures when sensitive information is collected from mobile app users, or when PII
is collected for unexpected uses (uses not related to the app’s func-
tionality).61 Both regulators also expect that companies will obtain
affirmative express consent from a user at least the first time that they
collect information in those contexts.62 To the extent that a mobile
app must comply with the “direct notice” requirements of COPPA (see
QQ 3.24–3.26), these just-in-time disclosures provide a means through
which app developers can satisfy their COPPA obligations.63
Notably, the regulations the California AG published pursuant to
the CCPA reiterate the obligation for mobile app developers to provide
just-in-time disclosures when a business collects personal information
for a purpose that the consumer would not reasonably expect.64 The
just-in-time disclosure must provide a summary of the categories of
personal information the app is collecting and provide a link to the
mobile app’s full privacy notice.
Q 6.15.1 What is an “unexpected use” of PII?
The collection, use, or disclosure of PII that is not required for an
app’s basic functionality is generally considered an “unexpected use.”
Common examples of unexpected uses of information include:
• accessing contacts if not required for the app’s basic
functionality;
• sharing data with ad networks for behavioral advertising
purposes;
• sharing data with third parties for their own purposes, or
for purposes unrelated to the app’s basic functionality (for
example, to combine data with data from other sources to use
across the app developer’s different websites or apps);
• accessing privacy-sensitive device features, such as a camera,
dialer, microphone, or geolocation information;
• accessing photos and videos if not required for the app’s
basic functionality; and
• accessing text messages or call logs.65
The FTC has specifically warned app developers to disclose the
use of certain code that allows smartphones to detect audio signals
from a TV program, so that advertising on the phone can be custom-
ized based on the phone owner’s viewing habits.66
PRACTICE TIP
It is best practice for developers to use a just-in-time disclosure
and obtain the user’s affirmative consent when accessing sensi-
tive information, even where this access is expected based on
the app’s function. For instance, a mobile app designed to locate
restaurants in a user’s vicinity might use a just-in-time disclosure
and obtain consent to access a user’s geolocation. Even though
the use of precise geographic location is expected in this case, it
is considered sensitive information, and therefore a just-in-time
disclosure should be provided and affirmative express consent
obtained from the user. The just-in-time disclosure should also
make clear if the app will access geolocation information con-
stantly, and not only when the user uses the app (such as to locate
a restaurant in the vicinity).
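On iOS, the system location permission prompt can serve as the just-in-time disclosure for the restaurant-finder example above, with the purpose string supplied by the app’s Info.plist (the NSLocationWhenInUseUsageDescription key). The Swift sketch below is a minimal illustration, assuming a recent iOS version, of requesting “when in use” access and proceeding only after the user affirmatively consents; it is a sketch of one possible implementation, not a compliance template.

    import CoreLocation

    final class RestaurantLocationConsent: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        // Trigger the just-in-time prompt only when the user asks to find restaurants.
        func requestLocationAccess() {
            manager.delegate = self
            // "When in use" rather than "always": the app does not need constant
            // access to geolocation, only while the user is searching.
            manager.requestWhenInUseAuthorization()
        }

        func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
            switch manager.authorizationStatus {
            case .authorizedWhenInUse, .authorizedAlways:
                manager.requestLocation()   // proceed only after affirmative consent
            default:
                break                       // degrade gracefully; collect nothing
            }
        }

        func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
            // use the location to query nearby restaurants
        }

        func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
            // handle the failure without falling back to other data collection
        }
    }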
Q 6.16 What terms should be included in a mobile
application’s just-in-time disclosure?
The just-in-time disclosure should explain the intended uses and
any third parties to which PII would be disclosed. The disclosure
should also provide an easy way for users to choose whether or not
to allow the collection or use of the information. If use of the app is
contingent on collection of the information, that fact should be made
clear when the user must make the choice. Also, whenever the mobile app’s privacy policy changes, a just-in-time disclosure should inform users of the change.
Finally, the California AG recommends including in the just-in-time dis-
closure a link to the full privacy policy “if feasible.” Thus, while includ-
ing such a link is encouraged, the California AG recognizes that space
limitations may prevent the developer’s ability to do so.67
Implementation and Compliance
Q 6.17 What are the consequences of failing to
provide users with adequate notice before
collecting sensitive information or PII for
unexpected purposes?
An app developer that fails to provide users with adequate notice
before collecting sensitive information, or PII that will be used or
shared for unexpected purposes, may be subject to an enforcement
action. The FTC has taken action against developers by requiring,
among other things, the development of just-in-time disclosures sepa-
rate from the mobile app’s full privacy policy.
CASE STUDY: FTC v. Goldenshores Techs., LLC
In December 2013, the FTC pursued an enforcement action
against Goldenshores Technologies, LLC, alleging that its “Bright-
est Flashlight Free” app shared users’ PII with third parties without
their consent.68 After installing the app, users were informed that they had the option of rejecting the company’s end user license agreement, which included a provision stating that certain “information about your computer, system and application software” would be shared with third parties. The app, however, began collecting users’ geolocation information and persistent identifiers before users could accept or reject the agreement.
The FTC approved a settlement with Goldenshores on March 31,
2014, under which the company was required to implement just-
in-time disclosures, to obtain users’ affirmative express consent
before transmitting any PII to third parties, and to delete all PII
collected from users who had downloaded the app before the
date of the consent order. Goldenshores was not required to pay
any monetary penalty.69
Q 6.18 How can an app developer make adequate
disclosures about the collection of sensitive
information such as a user’s geolocation?
As the Goldenshores enforcement action shows, the FTC gives par-
ticular scrutiny to the disclosures that app developers make in connec-
tion with the collection of geolocation information. To assist develop-
ers in properly navigating the requirements surrounding the collection
and use of geolocation information, the Cellular Telecommunications
Industry Association issued its own “Best Practices and Guidelines
for Location-Based Services,” modeled after the FTC’s information pri-
vacy best practices.70 In addition to the original notice provided when
first accessing geolocation information, these guidelines recommend
location-based disclosures in the following contexts:
• Developers should disclose how long location information will be retained, or whether the retention period may vary depending on the circumstances;
• Developers should disclose what location information will be
shared with third parties and the types of third parties with
which this information will be shared;
• Even after obtaining consent initially, developers should dis-
close when location information will be used for a new or
material purpose different from that identified in the original
disclosure; and
• Developers should periodically remind users when location
information will be shared with others and of the privacy
options available to the user. The frequency of any reminders
will depend upon the nature of the use of location informa-
tion. For example, more reminders should be provided when
the app will frequently share location information with third
parties, whereas fewer reminders are necessary for one-time,
user-initiated instances of sharing.
The use of a special icon when location information is being
shared may be more effective than a written notice; for example,
Apple’s iOS platform uses a triangular icon next to the time on the
top of the screen to inform users when geolocation information is
being accessed.
Q 6.19 Can an app developer rely on just-in-time
disclosures provided by the app platform?
To an extent, yes. To promote clarity for users, the FTC has advised against repeating disclosures that the platform already provides.71 A developer may rely on a platform disclosure, however, only to the extent it precisely corresponds to the action taken by the mobile app. For example, if the platform disclosure indicates that the mobile app will collect information on a user’s religious affiliation, but the developer also wishes to share that information with third parties, an additional just-in-time disclosure is needed before sharing that information.72
Sharing PII with Third Parties
Generally
Q 6.20 May app developers share with third
parties the PII that they collect via their
mobile apps?
To an extent, yes. The FTC has recognized that PII collected via
mobile apps may be shared with third parties, including through code
developed by third parties and inserted directly into the mobile app’s
software. The FTC recommends that app developers carefully review
any third-party code inserted into their mobile apps in order to fully
understand the PII being collected and shared by third parties such
as advertisers and data analytics companies. Developers need this
understanding of third-party code in their apps in order to accurately
disclose to app users how their PII is being used.73
Requirements
Q 6.21 If an app developer shares with third
parties PII acquired through its mobile app,
what are its obligations to the app users?
First, the mobile app’s privacy policy must inform users that PII
will be shared with or accessed by third parties. The policy should
either name the third parties or identify them by category (for exam-
ple, analytics provider) and explain what information the third par-
ties will access and how such information will be used. If feasible, the
policy should include a link to the third party’s privacy policy, and if
the third party provides users with the option to opt out of the collec-
tion of their information, the policy should provide instructions for
opting out.74
Second, if the sharing of information with third parties would not
be expected by a user, or if sensitive information will be shared with
third parties, the mobile app should use a just-in-time disclosure and
obtain the user’s affirmative express consent before sharing the infor-
mation or allowing the third party to directly access the information
through the app.75
An app developer should ensure that the third parties’ practices
do not contradict its own privacy policies, and should obtain appro-
priate privacy and data security representations from such third par-
ties. This is particularly true with respect to children’s PII, as devel-
opers may be held strictly and vicariously liable under COPPA for
the practices of third parties collecting such information from the
developers’ mobile apps (see QQ 3.14–3.14.1).
The New York Attorney General has taken the position that even if
PII is provided to third parties in an aggregated, non-identifiable for-
mat, a disclosure must be made in the mobile app’s privacy policy
stating that PII is being transmitted to third parties if it is possible for
the third party to de-aggregate and re-identify individuals based on
the data tendered.76 The New York Attorney General has also taken the
position that when sharing its customers’ anonymized PII with coun-
terparties, a company should contractually forbid the counterparties
to de-anonymize the data—that is, to re-identify the individuals.
California’s Consumer Privacy Act requires businesses to disclose
or limit third-party access to consumer information in certain cases.77
CASE STUDY: In re BLU Products, Inc.
In April 2018, the FTC pursued an enforcement action against
BLU Products Inc., alleging that the mobile phone manufacturer
misled consumers by claiming that it limited third parties’ access
to user information to only information needed to perform the
requested services. A third party that provided security and oper-
ating system updates to BLU’s devices had access to far more
data than it needed to do its job. The third party could access
users’ text messages, real-time location data, call logs, contact
lists, and lists of applications installed on each device.78
The FTC reached a settlement with BLU on April 30, 2018, under
which the company was required to implement an information
security program and engage a third party to conduct data secu-
rity assessments for twenty years.79
PRACTICE TIP
The level of disclosure required when a company like an ana-
lytics provider is collecting information depends on the precise
nature of the information collected and the use to which it is
being put. If the analytics provider is merely collecting PII to pro-
vide the app developer with usage analytics for the specific app,
that is an expected use of PII that does not require users’ affirma-
tive express consent. If the analytics provider is collecting such
information for other purposes (for example, to create a profile
of a user for behavioral advertising purposes), developers should
explain this practice in a just-in-time disclosure and obtain users’
affirmative express consent.
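The distinction drawn in the practice tip can be reflected in code by gating any behavioral-advertising use behind the user’s recorded choice. The Swift sketch below assumes a hypothetical AdAnalyticsSDK as a stand-in for whatever third-party library the app embeds, along with an illustrative UserDefaults key; it is not the API of any real provider.

    import Foundation

    // Hypothetical stand-in for a third-party ad/analytics SDK; not a real library.
    enum AdAnalyticsSDK {
        static func start(behavioralAdvertising: Bool) { /* hypothetical initializer */ }
    }

    struct ConsentGate {
        private static let key = "behavioralAdsConsent"

        // Record the user's choice from the just-in-time disclosure screen.
        static func record(consentGranted: Bool) {
            UserDefaults.standard.set(consentGranted, forKey: key)
        }

        // Enable behavioral advertising only if consent was given; plain,
        // app-specific usage analytics (an expected use) can run either way.
        static func startThirdPartySDKs() {
            let consented = UserDefaults.standard.bool(forKey: key)
            AdAnalyticsSDK.start(behavioralAdvertising: consented)
        }
    }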
“Frictionless” Sharing
Q 6.22 What kind of disclosure must be made
to users if a developer’s mobile app is
integrated with social media platforms
to automatically share information on
users’ actions?
If a mobile app engages in the automatic (or “frictionless”) shar-
ing of information about users, the app should clearly inform users
through a just-in-time disclosure when that feature is enabled.
Frictionless sharing occurs when apps share information about users
or user actions in one app through a different mobile app or a social
network. For example, a user of the music streaming app Spotify might
have the specific song he or she is listening to automatically displayed
in his or her social network activity feed.
Additionally, while neither the FTC nor any state authority has
directly addressed frictionless sharing, app developers offering the
ability for users to engage in frictionless sharing should consider
making this mechanism an opt-in function, which Facebook itself now
encourages.80 Further, app developers must ensure that they comply
with automatic sharing delays imposed by any social media platform
through which the information will be shared. For example, in 2012,
Facebook required that a user interact with content for at least ten
seconds before that fact is disclosed to others.81
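Treating frictionless sharing as opt-in can be as simple as a setting that defaults to off, as in the hypothetical SwiftUI sketch below; the storage key and label are assumptions, and a recent iOS version is assumed.

    import SwiftUI

    struct SharingSettingsView: View {
        // Defaults to false, so automatic ("frictionless") sharing is opt-in.
        @AppStorage("frictionlessSharingEnabled") private var sharingEnabled = false

        var body: some View {
            Toggle("Automatically share my listening activity", isOn: $sharingEnabled)
                .padding()
        }
    }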
Retention of PII
Q 6.23 Are there limitations on an app developer’s
right to store the sensitive information
it collects?
Yes. Although retention of sensitive information may be neces-
sary for an app developer to tailor certain features for a better user
experience or to share information with third parties, retention also
increases the risks that the information will be leaked or misused.
Therefore, app developers should store sensitive data that specifically identifies a user only for as long as necessary to operate the mobile app.82
To best protect both users and the developers themselves, a data
retention policy should be implemented to ensure that sensitive infor-
mation is not stored indefinitely. In developing this policy, developers
should be mindful of any relevant terms in the mobile app platforms’
terms of service that directly apply to data retention.83 When delet-
ing data, developers should also clear associated metadata and ref-
erences to the deleted data. Absent this deletion, developers should
take steps to de-identify data so it cannot be linked back to a spe-
cific individual or user device.84 Law enforcement action in 2016 also
suggests legal authorities may require mobile app developers to limit
access to stored sensitive information to employees with a legitimate
business purpose for having such access.
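A retention policy of this kind can be enforced with a periodic sweep that deletes sensitive records older than the documented window. The Swift sketch below is a minimal illustration; it assumes the records are stored as files in a dedicated directory and that a 30-day window matches the app’s stated policy, both of which are assumptions for this example.

    import Foundation

    // Deletes files in the given directory that are older than the retention window.
    func purgeExpiredSensitiveData(in directory: URL, retentionDays: Int = 30) throws {
        let cutoff = Calendar.current.date(byAdding: .day, value: -retentionDays, to: Date())!
        let files = try FileManager.default.contentsOfDirectory(
            at: directory,
            includingPropertiesForKeys: [.contentModificationDateKey]
        )
        for file in files {
            let values = try file.resourceValues(forKeys: [.contentModificationDateKey])
            if let modified = values.contentModificationDate, modified < cutoff {
                // Remove the record itself; any associated metadata or indexes
                // that reference it should be cleared in the same pass.
                try FileManager.default.removeItem(at: file)
            }
        }
    }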
CASE STUDY: Uber
In January 2016, Uber reached a settlement with the Attorney
General of New York State over Uber’s privacy and cybersecurity
practices.
Uber provides a car service app that allows a user to hail drivers
under contract with Uber, who use their own car to pick up the
user at his or her exact location based on the geolocation infor-
mation transmitted by the user’s mobile device. While the mobile
app’s collection and limited use of geolocation information was
expected, the mobile app also leveraged that information to cre-
ate an aerial view of riders’ locations in Uber vehicles, along
with riders’ PII. This aerial view, known within the company as
the “God View,” was accessible to Uber employees to perform
real-time tasks such as assessing driver supply and demand in
certain areas.
During the course of the attorney general’s investigation, Uber
eliminated all PII from its aerial view. As part of the settlement
agreement, Uber agreed to restrict access to geolocation data to
only those employees with a legitimate business purpose who
receive authorized access through a formal process. Uber also
agreed to maintain a separate section in its privacy policy specif-
ically meant to describe Uber’s geolocation collection and use
practices.
Notes to Chapter 6
1. Future of Privacy Forum & Ctr. for Democracy & Tech., Best Practices
for Mobile Application Developers (July 12, 2012) [hereinafter FPF/CDT Best
Practices], www.cdt.org/files/pdfs/Best-Practices-Mobile-App-Developers.pdf.
2. Fed. Trade Comm’n, FTC Staff Report: Mobile Privacy Disclosures:
Building Trust Through Transparency 20 (Feb. 2013) [hereinafter FTC 2013 Report],
www.ftc.gov/sites/default/files/documents/reports/mobile-privacy-disclosures-
building-trust-through-transparency-federal-trade-commission-staff-report/
130201mobileprivacyreport.pdf.
3. FPF/CDT Best Practices, supra note 1, at 8.
4. California Online Privacy Protection Act (CalOPPA), Cal. Bus. & Prof.
Code § 22577.
5. See Apple’s IFA vs. IFV: When to Use Which and Why, Tune Help (Aug. 10,
2015), http://support.mobileapptracking.com/entries/22207575-Apple-s-IFA-vs-IFV-
When-To-Use-Which-and-Why.
6. See Advertising ID, Google Developer Console Help, https://support.
google.com/googleplay/android-developer/answer/6048248?hl=en (last visited
June 6, 2016).
7. Christopher G. Cwalina, Richard Raysman & Steven B. Roosa, Mobile App
Privacy: The Hidden Risks, Practice Note (Practical L. Co.), at 5.
8. Craig Palli, Why Mobile Advertising Is a Lot Better Off Without UDID,
VentureBeat (June 11, 2013), http://venturebeat.com/2013/06/11/why-mobile-
advertising-is-a-lot-better-off-without-udid/.
9. What Are Mobile Device Identifiers?, Aerserv Blog (Dec. 13, 2014), www.
aerserv.com/mobile-device-identifiers/.
10. Chris Burns, iOS 8 MAC “Randomgate”: Yes, You’re Still Trackable,
SlashGear (Sept. 26, 2014), www.slashgear.com/ios-8-mac-randomgate-yes-youre-
still-trackable-26348157/; Apple’s New Anonymity Feature for iPhones Is Flawed,
Researchers Say, RT (Oct. 2, 2014), http://rt.com/usa/192680-apple-mac-
random-flaw/.
11. Jennifer Valentino-Devries, Unique Phone ID Numbers Explained, WSJ
Blogs (Dec. 19, 2010), http://blogs.wsj.com/digits/2010/12/19/unique-phone-id-
numbers-explained/.
12. Compare In re Google Inc. Cookie Placement Consumer Privacy Litig., 806
F.3d 125 (3d Cir. 2015) with Vasil v. Kiip Inc., 15-cv-09937, 2018 U.S. Dist. LEXIS 35573
(N.D. Ill. Mar. 3, 2018).
13. California Consumer Privacy Act of 2018, A.B. 375, 2017–2018 Assemb.,
Reg. Sess. (Cal. 2018) (enacted) (to be codified at Cal. Civ. Code §§ 1798.100
et seq.).
14. See CalOPPA, Cal. Bus. & Prof. Code § 22575.
15. Section 5 of the FTC Act, 15 U.S.C. § 45.
16. Complaint, Washington v. Uber Techs., Inc., No. 17-2-30506-5 (Wash. King
Cty. Sup. Ct. Nov. 28, 2017).
17. Press Release, Fed. Trade Comm’n, Uber Settles FTC Allegations That It
Made Deceptive Privacy and Data Security Claims (Aug. 15, 2017).
18. Press Release, Fed. Trade Comm’n, Uber Agrees to Expanded Settlement
with FTC Related to Privacy, Security Claims (Apr. 12, 2018).
19. Revised Decision and Order, In re Uber Techs., Inc., File No. 152 3054
(F.T.C. proposed Apr. 11, 2018).
20. See, e.g., Commonwealth v. Uber Techs., Inc., Case ID 180300004 (Pa. Ct.
C.P. Mar. 5, 2018); In re Uber Techs., Inc. Data Sec. Breach Litig., MDL No. 2826, 2018
U.S. Dist. LEXIS 57817 (J.P.M.L. Apr. 4, 2018).
21. FTC 2013 Report, supra note 2.
22. Cal. Dep’t of Justice, Privacy on the Go: Recommendations for the
Mobile Ecosystem (Jan. 2013) [hereinafter California 2013 Guidance], http://oag.
ca.gov/sites/all/files/agweb/pdfs/privacy/privacy_on_the_go.pdf.
23. Fed. Trade Comm’n, Mobile Security Updates: Understanding the Issues
(Feb. 2018), https://www.ftc.gov/system/files/documents/reports/mobile-security-
updates-understanding-issues/mobile_security_updates_understanding_the_
issues_publication_final.pdf. The FTC and California AG remain focused on this area, as shown by their enforcement actions and by privacy reports directed at parties in the mobile ecosystem other than developers, including consumers. Attorneys should therefore watch for new guidance to mobile app developers on mobile privacy.
24. FTC 2013 Report, supra note 2, at 12, n.58 (citing FPF/CDT Best Practices,
supra note 1); id. at 12 (citing the NTIA multi-stakeholder process, described infra
at Q 6.14).
25. Unlike the COPPA process, in which self-regulatory organizations must submit their guidelines to the FTC to be considered for safe harbor status, the mobile guidance described here receives no formal status. Rather, the FTC has merely noted in its reports that it “applauds” these efforts and has cited specific ones with approval.
26. Nat’l Telecomms. & Info. Admin., Short Form Notice Code of Conduct to
Promote Transparency in Mobile App Practices (July 25, 2013) (redline draft)
[hereinafter NTIA Short Form Notice Code of Conduct], www.ntia.doc.gov/files/
ntia/publications/july_25_code_draft.pdf.
27. FPF/CDT Best Practices, supra note 1.
28. Lookout, Mobile App Advertising Guidelines: A Framework for
Encouraging Innovation While Protecting User Privacy (June 2012), www.mylookout.
com/img/images/lookout-mobile-app-advertising-guidelines.pdf.
29. Fed. Trade Comm’n, FTC Staff Report: Paper, Plastic. . . or Mobile? An
FTC Workshop on Mobile Payments (Mar. 2013).
30. Id. at 4.
31. 47 U.S.C. § 227.
32. 47 C.F.R. § 64.1200(f)(8)(i)(B); Reyes v. Lincoln Auto. Fin. Servs., 2017 WL
2675363 (2d Cir. June 22, 2017).
33. Edelsberg v. Vroom, Inc., 16-cv-62734-GAYLES, 2018 U.S. Dist. LEXIS
500420 (S.D. Fla. Mar. 27, 2018).
34. Satterfield v. Simon & Schuster, Inc., 569 F.3d 946, 948 (9th Cir. 2009).
35. 47 C.F.R. § 64.1200(c)(2).
36. Id. § 64.1200(d).
37. Golan v. FreeEats.com, Inc., 930 F.3d 950 (8th Cir. 2019).
38. FTC 2013 Report, supra note 2, at 6.
39. See, e.g., Naomi Xu Elegan & Clay Chandler, When Red Is Unlucky: What
We Can Learn from China’s Color-Coded Apps for Tracking the Coronavirus Outbreak,
Fortune (Apr. 20, 2020), https://fortune.com/2020/04/20/china-coronavirus-
tracking-apps-color-codes-covid-19-alibaba-tencent-baidu/; Shlomi Eldar, Coronavirus
Crisis Exposes Shin Bet’s Secret Database, Al-Monitor (Apr. 1, 2020), www.al-monitor.
com/pulse/originals/2020/03/israel-palestinians-shin-bet-coronavirus-surveillence.
html; TraceTogether, www.tracetogether.gov.sg/ (last visited May 27, 2020); Eun
A. Jo, South Korea’s Experiment in Pandemic Surveillance, The Diplomat (Apr. 13,
2020), https://thediplomat.com/2020/04/south-koreas-experiment-in-pandemic-
surveillance/.
40. Apple and Google Partner on COVID-19 Contact Tracing Technology, Apple
(Apr. 10, 2020), www.apple.com/newsroom/2020/04/apple-and-google-partner-on-
covid-19-contact-tracing-technology/.
41. See Complaint, California v. Delta Air Lines, Inc., No. CGC-12-526741 (Cal.
Super. Ct. S.F. Cty. Dec. 6, 2012); see also supra chapter 2, Case Study: California v.
Delta Air Lines, Inc.
42. Att’y Gen. State of Cal., Joint Statement of Principles (Feb. 22, 2012) [here-
inafter Joint Statement of Principles], http://ag.ca.gov/cms_attachments/press/
pdfs/n2630_signed_agreement.pdf.
43. FTC 2013 Report, supra note 2, at 19.
44. Joint Statement of Principles, supra note 42.
45. See Google Play Developer Distribution Agreement § 4.3 (May 18, 2016),
https://play.google.com/about/developer-distribution-agreement.html; Microsoft
App Developer Agreement, Version 7.6, § 4(f) (Mar. 30, 2016), https://msdn.
microsoft.com/en-us/library/windows/apps/hh694058.aspx; Amazon App Distribution
and Services Agreement § 4 (Feb. 10, 2016), https://developer.amazon.com/
appsandservices/support/legal/da.
46. California 2013 Guidance, supra note 22, at 11 (“Describe Your Practices”).
47. Id.
48. Olivia Nuzzi, What It’s Like to Get Doxed for Taking a Bike Ride, N.Y. Mag.,
June 8, 2020, https://nymag.com/intelligencer/2020/06/what-its-like-to-get-doxed-
for-taking-a-bike-ride.html.
49. CalOPPA, Cal. Bus. & Prof. Code § 22577(b)(5).
50. FTC 2013 Report, supra note 2, at ii.
51. Id. at 22.
52. See Privacy Multistakeholder Process: Previous Meeting Information, NTIA
(Nov. 12, 2013), www.ntia.doc.gov/other-publication/2013/privacy-multistakeholder-
process-previous-meeting-information.
53. NTIA Short Form Notice Code of Conduct, supra note 26.
54. Id. at 2.
55. Id. at 4–5; see Awesome App Privacy Dashboard, http://privacydashboard.s3.amazonaws.com/index.html (last visited June 7, 2016); Joseph Jerome, NTIA User Interface Mockups, Future of Privacy Forum (July 25, 2013), www.futureofprivacy.org/2013/07/25/ntia-user-interface-mockups/.
56. See NTIA Short Form Notice Code of Conduct, supra note 26, at 4–6.
57. Id.
58. FTC 2013 Report, supra note 2, at iii.
59. NTIA Short Form Notice Code of Conduct, supra note 26, at 2.
60. FTC 2013 Report, supra note 2, at 15–16.
61. California 2013 Guidance, supra note 22, at 6, 12; FTC 2013 Report, supra
note 2, at 23.
62. FTC 2013 Report, supra note 2, at 23; California 2013 Guidance, supra
note 22, at 1.
63. See Complying with COPPA: Frequently Asked Questions, Fed. Trade
Comm’n (Mar. 20, 2015), www.ftc.gov/tips-advice/business-center/guidance/
complying-coppa-frequently-asked-questions, FAQ A(5).
64. California Consumer Privacy Act Regulations (proposed Feb. 10, 2020),
https://oag.ca.gov/sites/all/files/agweb/pdfs/privacy/ccpa-text-of-mod-
clean-020720.pdf?.
65. FPF/CDT Best Practices, supra note 1, at 6.
66. Press Release, Fed. Trade Comm’n, FTC Issues Warning Letters to App
Developers Using “Silverpush” Code (Mar. 17, 2016).
67. California 2013 Guidance, supra note 22, at 12 (“Enhanced Measures”).
68. Complaint, In re Goldenshores Techs., LLC, 2013 WL 6512819 (F.T.C.
Dec. 5, 2013).
69. In re Goldenshores Techs., LLC, 2014 WL 1493611 (F.T.C. Mar. 31, 2014);
see also Press Release, Fed. Trade Comm’n, FTC Approves Final Order Settling
Charges Against Flashlight App Creator (Apr. 9, 2014).
70. Cellular Telecomms. Indus. Ass’n [CTIA], Best Practices and Guidelines
for Location-Based Services (Version 2.0 Mar. 23, 2010), http://files.ctia.org/pdf/
CTIA_LBS_Best_Practices_Adopted_03_10.pdf.
71. FTC 2013 Report, supra note 2, at 23.
72. Id. at 23–24.
73. Id. at ii.
74. California 2013 Guidance, supra note 22, at 11 (“Describe Your Practices”).
75. Id. at 12; FTC 2013 Report, supra note 2, at 23.
76. Assurance of Discontinuance Under Executive Law Section 63,
Subdivision 15, In re Matis Ltd., No. 16-101 (N.Y. Att’y Gen. Feb. 13, 2017), https://
ag.ny.gov/sites/default/files/matis_aod_executed.pdf.
77. Cal. Consumer Privacy Act of 2018, A.B. 375, 2017–2018 Assemb., Reg.
Sess. (Cal. 2018) (enacted) (to be codified at Cal. Civ. Code §§ 1798.100 et seq.).
78. Press Release, Fed. Trade Comm’n, Mobile Phone Maker BLU Reaches
Settlement with FTC over Deceptive Privacy and Data Security Claims (Apr. 30,
2018).
79. Decision and Order, In re BLU Prods., Inc., File No. 172 30025 (F.T.C. pro-
posed Apr. 30, 2018).
80. Chloe Albanesius, Facebook Moves to Limit Auto-Sharing Apps, PCMag
(May 28, 2014), www.pcmag.com/article2/0,2817,2458633,00.asp.
81. FPF/CDT Best Practices, supra note 1, at 8 (“Enhanced Notice”); see also
David Murphy, Facebook Adds 10-Second Delay to “Frictionless” Insta-Post Apps,
PCMag (June 2, 2012), www.pcmag.com/article2/0,2817,2405222,00.asp.
82. California 2013 Guidance, supra note 22, at 9.
83. See, e.g., Google Play Developer Distribution Agreement § 4.3 (May 18,
2016) (“If your Product stores personal or sensitive information provided by users,
it must do so securely and only for as long as it is needed.”), https://play.google.
com/about/developer-distribution-agreement.html.
84. FPF/CDT Best Practices, supra note 1, at 12.