September 15, 2022

Issue 7: Newsom hopes the kids'll be alright, signs law

Oh hey! Welcome to The Privacy Beat Newsletter!

Here’s the gist: Come here for insights on the hottest topics in privacy according to our peers’ tweets so you can walk into any happy hour or team meeting and sound like the absolute baller you are. No current topic gets by you. Did you post a hot take you want included? Tag it #PrivacyBeatNews and see if it makes it into the next edition!

Hello! Since we last spoke, PrivacyTwitter lit up after Instagram got slapped with a record-breaking fine, the FTC asked the public what it wants its future privacy rule to look like, and this Age-Appropriate Design Code caught folks on their heels.

Instagram got whacked with a record fine

First, the Instagram news, because everyone likes a good “gotcha” story – unless it’s your company, of course. It took years, but the Irish Data Protection Commission hit Instagram with a €405 million fine for mishandling children’s data, as Politico reported.

For years, pundits have criticized Irish DPC Helen Dixon for perceived failures in enforcing the GDPR. The people wanted fines, and they wanted them May 26, 2018, it seems. But Dixon’s mandate is vast; she currently has live investigations open against Meta, Google, TikTok, Apple, Yahoo and Tinder. Still, critics have grumbled for years that the GDPR isn’t effective because no one’s punishing bad actors. So the Instagram news broke like a glass vase and scattered far and wide. “Dixon did it!” was kind of the vibe.

Gavin Newsom as tempestuous cat?

As I mentioned last time, California doesn’t want privacy pros to sleep anymore, so it passed a children’s privacy law. Just as I’m writing this to you, the news broke that Gov. Gavin Newsom signed the Age-Appropriate Design Code Act, so now it’s official, and you’re gonna have to comply with it. If you’ve already been complying with the U.K.’s children’s code, your life won’t change drastically. But if you aren’t, and if anyone under the age of 18 regularly visits your site, you’re going to have some work to do.

Reactions have been, um, mixed.

For more on the California code – including what it means and why industry is outraged, check out the blog post I wrote on this, and this podcast I recorded with Eric Goldman at Santa Clara University School of Law.

‘I HATE DATA BROKERS’: Public weighs in on FTC rulemaking

Following its Aug. 11 announcement that it aims to rulemake on “commercial surveillance,” the FTC held its first public meeting. While many say the choice of words is some virtue signaling, the FTC defines commercial surveillance as the business of “collecting, analyzing, and profiting from information about people,” which it says heightens the “risks and stakes of data breaches, deception, manipulation and other abuses.”

The Sept. 8 meeting, which Meta declined to participate in, sought details on what areas the FTC should regulate and what terms to include. I won’t take you through the five hours of panels and public comment, but here are some of the highlights:

Algorithms are used to hurt people

Spencer Overton of the Joint Center for Political and Economic Studies said any FTC rule should include algorithmic transparency provisions. He’s concerned about automated ad delivery, which often uses bias-based algorithms to determine which users are most likely to engage with particular ads.

Upturn’s Harlan Yu agreed, adding that the harms of data use fall differently on historically disadvantaged groups, including Black and brown people, LGBTQ people and women. He suggested the FTC consider rules that would seal eviction filings before they can get into data brokers’ hands, as well as sealing non-conviction arrest records and limiting background checks writ large.

Secondary uses of data are bad

EPIC’s Caitriona Fitzgerald said the FTC should use its unfairness mandate to cut down on “secondary uses” of data such as third-party disclosures.

The Future of Privacy Forum’s Stacey Gray noted that pervasive practices – like scraping data from real-time bidding auctions to create sometimes highly sensitive user profiles – can be harmful to consumers.

“There’s lots of consensus on the need to reform advertising but little consensus on where to draw the lines in that very competitive landscape,” Gray said.

Notice and choice ain't working for us

The panelists agreed that notice and choice isn’t working. Gray said that, despite the ubiquity of privacy policies, there’s no federal standard on what they have to contain. And even though there’s plenty of evidence that consumers don’t actually read privacy policies, other folks – researchers, journalists, competing businesses, and regulators – do. And that matters for accountability purposes.

She said the FTC should consider codifying that the absence of adequate disclosures is an unfair practice, and one consumers cannot reasonably avoid.

The best part of the event, in my humble opinion, was during the public comments portion, when privacy attorney Heidi Salow Zoomed into the meeting and, as the FTC moderator played the proverbial Oscars “wrap it up” music, she yelled into the void, “I HATE DATA BROKERS,” just before her screen went black.

Stay tuned for more privacy fun next week!