
Adtech Privacy Update Vol. 4: DSA, ATT, Amazon’s Alexa, and Google ‘Data Safety’


May 2022

By McKenzie Thomsen

Lately, it seems with every passing month, the adtech world is more and more shaken up by privacy regulation. When I sit down to write these updates, I make a list of what has happened in adtech privacy since my last article, and every time I end up with a ridiculously long list. It reminds me of the song “We Didn’t Start the Fire” by Billy Joel. It’s an ever-growing list. There’s so much on the list that there’s no time to dig into each topic. (I guess that could be said about privacy in general, too.) Even so, I’m going to try and elaborate on four items from my list:

(1) the Digital Services Act’s impact on digital advertising;

(2) Amazon is selling transcripts of your Alexa interactions to serve you ads; and, on the mobile side of things,

(3) researchers found Apple’s ATT… is ineffective; and

(4) Google has released the ‘Data Safety’ section (its version of Apple’s Privacy Nutrition Labels).

The DSA is about to change digital advertising in the EU

You’ve likely already heard of the EU’s Digital Services Act (DSA). On Saturday, April 30, the final terms were agreed upon. We won’t know the final text for a little while, but here’s what we know so far. The DSA is designed to impose legally binding content controls on digital platforms, specifically related to illegal content, transparent advertising, and disinformation. ‘Very Large Online Platforms’ (VLOPs) and ‘Very Large Online Search Engines’ (VLOSEs) (think Google, Meta, Amazon) have heightened obligations.

Let’s get into the effects the DSA will have on adtech (as originally summarized by Eric Seufert). The DSA will:

  • Ban targeted advertising to minors. The specifics around ‘knowledge’ of a user’s age are still unknown.
  • Ban the use of sensitive data for targeted advertising. Digital platforms “shall not present advertising to recipients of the service based on profiling… using special categories of personal data” as defined by the GDPR.
  • Require digital platforms to provide users with meaningful information about how their data will be monetized, along with an opt-out mechanism. And unsurprisingly, there’s a “dark patterns” aspect: “Refusing consent shall be no more difficult or time-consuming to the recipient than giving consent.”
  • Require digital platforms to disclose the sponsor of an ad and the targeting parameters used in serving the ad.
  • Require VLOPs to maintain records on targeted advertising. Specifically, VLOPs must maintain, (a) the content of the ad, (b) the sponsor of the ad, (c) the period during which the ad was exposed, (d) the targeting parameters used in serving the ad, and (e) the total number of people to whom the ad was exposed, broken out by targeting group. All this data must be made available via API.

Big changes are coming, and we’ll see how digital platforms adapt (or exit the EU). The DSA’s restrictions will go into force 15 months after being voted into law or on January 1st, 2024, whichever is later.

Amazon is selling your Alexa interactions to serve you targeted ads

This may not be surprising, but researchers conducted a study and found that Amazon and third parties (including advertising and tracking services) collect data from your interactions with Alexa and share it with as many as 41 advertising partners. The shared data is then used to infer user interests and serve targeted ads on Amazon platforms as well as non-Amazon platforms (think, the web generally). This type of data is in demand. Researchers found it tends to get “30X higher ad bids from advertisers.”

Amazon confirmed that it does use voice data from Alexa interactions for targeted advertising but says it shares transcripts of those interactions rather than the recordings themselves. Admittedly, that’s better, but it would be best if Amazon actually disclosed its data practices. And of course, Amazon didn’t just admit to these practices outright; a spokesperson disputed the study altogether, stating, “many of the conclusions in this research are based on inaccurate inferences or speculation by the authors, and do not accurately reflect how Alexa works.”

Apple’s ATT is ineffective (we figured, but now it’s confirmed)

In a study called “Goodbye Tracking? Impact of iOS App Tracking Transparency and Privacy Labels,” researchers documented the shortcomings of Apple’s App Tracking Transparency (ATT). Below are the (crazy) highlights.

  • Apps are still tracking. Apparently, ATT had very little impact on apps tracking users. Oof. In some cases, tracking libraries are contacted ‘at the first app start,’ an indicator that apps are ‘tracking’ prior to the ATT consent request. (Tracking libraries record events, e.g., when a user clicks on a link or moves to another page, and send that information to a third party via an API.)
  • Some apps use a ‘User ID’ and collect location. This information can be combined with other information to build a device-specific profile on a user. This circumvents ATT and contradicts its purpose entirely.
  • Apps are blatantly fingerprinting. Apps are creating their own User IDs and sharing them with third parties (who are receiving User IDs from other apps) and identifying the user with other data points. Apple has been informed about this and has done nothing to stop it. 
  • Many apps’ ‘Privacy Nutrition Labels’ are inaccurate and contradict their posted privacy notice. 
  • Apple is tracking you for profit. Apple has admitted it is collecting significant device-specific information (information that, through ATT, it won’t allow apps to collect) and combining it with other information to build advertising cohorts on Apple’s SKAdNetwork.

Google’s ‘Data Safety’ section: better or worse than Apple’s ‘Privacy Nutrition Labels’?

Originally announced in May 2021, Google’s ‘Data Safety’ section (its version of Apple’s Privacy Nutrition Labels) has begun rolling out.

Google’s Data Safety section requires app developers to disclose what types of data are collected; the purposes for collection, use, and sharing; whether that data is shared; the security measures taken to protect the data; and whether the app has committed to following Google Play’s Families Policy. Developers can also choose to disclose whether the security practices they’ve implemented have been validated by a global security standard (such as the Mobile Application Security Assessment (MASA)).

In contrast, Apple’s Privacy Nutrition Labels are more formulaic. The Privacy Nutrition Label is divided into 3 sections: (1) “Data Used to Track You”; (2) “Data Linked to You”; and (3) “Data Not Linked to You.” For each section, the developer must state the types of data collected/used by the developer and/or any third-party partners as well as for what purpose. The main difference is that Apple has predetermined exactly what ‘types of data’ and ‘purposes’ are available for app developers to enter so a developer must attempt to match their practices with one or more of Apple’s predefined types of data and purposes, regardless of whether there are discrepancies with the term being applied or ‘gray’ areas in how they’re applied (e.g., this app may ‘track’ as defined by the GDPR, but not as defined by the CCPA).


Google’s ‘Data Safety’ section vs. Apple’s Privacy Nutrition Labels

Required Disclosures

Google:
  • Data collected
  • Purpose for data collection
  • Whether data is shared
  • Security practices
  • Whether a user can request data deletion
  • Whether the app has committed to following Google Play’s Families Policy
  • [optional] Whether security practices have been validated by a global security standard

Apple:
  • “Data Used to Track You”
  • “Data Linked to You”
  • “Data Not Linked to You”

For each section, the developer must state the types of data collected/used by the developer and/or any third-party partners, as well as for what purpose. Both the types of data and the purposes for collection/use are predefined terms that app developers select from.

Enforcement Measures
  • Google: self-attestation; however, Google says it will verify disclosures.
  • Apple: self-attestation, but after bad press, Apple has stated it will routinely audit labels.

Effective Date
  • Google: July 20, 2022, but the section is already rolling out (any app that has not completed the section is listed as having “No info available”).
  • Apple: December 8, 2020.

We’ll see how trustworthy these are with time (not to mention how useful they are to customers).

Conclusion

As you can see, there were lots of Billy Joel ‘fires’ this month. And this is just the highlight reel.

