October 31, 2022

Issue 10: I want you to know these things

Oh hey! Welcome to The Privacy Beat Newsletter!

Here’s the gist: Come here for insights on the hottest topics in privacy according to our peers’ tweets so you can walk into any happy hour or team meeting and sound like the absolute baller you are. No current topic gets by you. Did you post a hot take you want included? Tag it #PrivacyBeatNews and see if it makes it into the next edition!

Personal liability: Always kinda sucks

For many years, while living in Portsmouth, NH, I taught spin classes. Anyone within a 15-minute drive of IAPP HQ regularly heard a sad pitch to try my class. Unfortunately, I'm the opposite of a salesperson, and I have zero game, so the pitch sounded something like, "You should try it sometime. You know, if you're bored, or whatever." What the pitch didn't reflect was my passion for moving with a tribe of people to a heavy beat in a dark room. The endorphins electrified me. But there were times, especially when the sweet woman who'd survived a heart attack came to class, that I'd worry I was working them too hard. If I saw someone grinding it out to Jay-Z's final eight beats, looking like they might fall off the stationary bike, the thought would sometimes cross my mind: "What if they get hurt? Does my contract say I could be held responsible?"

That sounds crass, and of course, I'd be more concerned with their wellbeing than any liability. But when you're offering a service to the general public, these things cross your mind.

It's as close as I can get to how Drizly's CEO might be feeling. The FTC has hit Boston-based Drizly, an alcohol-delivery service that Uber owns, with an enforcement action. The agency says Drizly knew about its security problems two years ago, and a couple of breaches exposed 2.5 million customers' account information.

The FTC's fix imposes significant requirements on Drizly, and we're used to that. But the zinger is that it also changes things for CEO James Rellas. The enforcement action, released Oct. 24, requires Drizly to:

  • Destroy any data non-essential to the service it provides.
  • Implement an information-security program.
  • In the future, only collect and store data necessary for its services.

As for Rellas, he may feel like he's perpetually living through winter in Seattle; you know, a light-but-persistent drizzle overhead. (I had to, I'm sorry.) The FTC's order "follows Rellas even if he leaves Drizly." Wherever he's next CEO of a company collecting more than 25,000 individuals' personal data, he's on the hook for the same implementation requirements. FTC Chair Lina Khan and Commissioner Alvaro Bedoya wrote in a memo that the future-proof decision reflects that "corporate executives sometimes bounce from company to company, notwithstanding blemishes on their track record."

Commissioner Christine Wilson voted against holding Rellas personally liable. She said CEOs "must be allowed to decide for themselves whether or not to pay attention to data security." But the majority said, "Nah." More eloquently put, the majority memo states, "Overseeing a big company is not an excuse to subordinate legal duties in favor of other priorities," and that the FTC "has a role to play" in making sure info-security obligations reach the boardroom.

That seems to be where we're heading. Remember when the U.S. Senate put Equifax's CEO on display for a public lashing? I was there, it was awkward. Senators repeatedly spit out the word "failure" like sunflower seeds in a dugout. Equifax CEO Richard Smith had "retired" by the time he faced lawmakers. But this imposition on Rellas, coupled with the American Data Privacy and Protection Act provision that calls for C-suite liability, indicates an upward trend. Perhaps one way we nudge companies to prioritize data privacy and security is to point a dart at top decision makers. Make it personal. Should we have to do it? No. But to date, the lure of data-driven profits seems to have led some executives astray from protecting people. So here we are.

We took your face, here's some cash!

Speaking of trends, some U.S. TikTok users woke up to the clanging of quarters hitting their proverbial piggy bank bellies last week. That's because TikTok settled 21 class-action lawsuits, filed on behalf of minors, alleging the company violated Illinois' Biometric Information Privacy Act by taking facial scans to use for targeted marketing later. As Innovation & Tech Today reports, TikTok's terms of service technically allow it to collect the biometric data, but Illinois was like, "I don't care. Those terms violate the state's law. Delete it. Pay up. Bye."

Up to 89 million users qualified to submit a claim and received between $27.84 and $167.04 as a result. In DC, that's like, dinner for two. And it's not a gift, you technically paid for it with your most precious data point of all, the one you can't ever change. But hey, surprise deposits are fun.

It's widely reported we should expect more of these settlements in the near term. Facebook paid Illinois residents $650 million in total after a similar lawsuit in May, and last month, a judge approved a $100 million settlement against Google for BIPA violations. Snapchat's facing a $35 million BIPA lawsuit in November, and Clearview AI settled its case in May.

California's privacy agency still on that grind

California's Privacy Protection Agency is a beast. It makes more news than the FTC these days, if we're measuring by tweets. Its latest headline: The Global Privacy Assembly voted on Oct. 27 to admit it as a full voting member. The group, established in 1979, held its 44th annual meeting in Turkey last week. CPPA's Executive Director, Ashkan Soltani, delivered the keynote.

The CPPA joins just one other U.S. voting member in the group: the FTC. While CPPA's admission to the club doesn't directly change the enforcement landscape, it gets the agency in the room with key regulators, the ones who'll determine the future of privacy enforcement.

Some have criticized the agency for campaigning against the American Data Privacy and Protection Act, noting Californians shouldn't be the only U.S. citizens with privacy rights.

Did you miss the latest Privacy Beat Podcast drops?

Your PSR recap, for those with FOMO

In this episode, host Angelique Carson brings you live interviews from the show floor at the IAPP’s P.S.R. 22, including what everyone was buzzing about, what happened at the big Rodeo party, and how to avoid serial killers.

Listen here

He basically took down Cambridge Analytica

Prof. David Carroll, featured in the Netflix documentary "The Great Hack," swung by The Privacy Beat Podcast to discuss his crusade for data protection rights in the U.S. and suing Cambridge Analytica in the wake of the 2016 presidential election.

Listen here

California’s new children’s law could dismantle the status quo: Is that bad?

Last week, Eric Goldman visited the podcast to rip California’s Age-Appropriate Design Code to shreds. Some of you did not like that. On this episode, Stanford University’s Dr. Jen King has a different take: “We’ve had nearly thirty years of design masquerading as being values-agnostic driving the development of the internet. Do we really want to defend this status quo?”

Listen here

Twitter hot take of the week goes to (it's a tie):

It hurts because it's true, you know?

"So many branded socks." Guilty.

HEY! Will you be at the Solove/Schwartz event in DC next week? I'll be there. I'm speaking on Thursday, Nov. 3, at 10:10 a.m. (super specific, I know). We're talking about "How to tackle privacy earlier in product development." Come say hi if you're around?

See you soon! xo