
December 9, 2022

Issue 12: Big tech's big trouble

Oh hey! Welcome to The Privacy Beat Newsletter!

Here’s the gist: Come here for insights on the hottest topics in privacy according to our peers’ tweets so you can walk into any happy hour or team meeting and sound like the absolute baller you are. No current topic gets by you. Did you post a hot take you want included? Tag it #PrivacyBeatNews and see if it makes it into the next edition!

Big tech’s big troubles persist

Welp. The deluge of news on big tech’s regulatory reckoning continues. This week, the European Data Protection Board determined that Meta shouldn’t be able to require users to accept personalized ads. To date, Meta has assumed user consent for the practice via a provision in its privacy policy rather than asking for it explicitly. Though we’re now waiting on the Irish Data Protection Commissioner’s final decision, The Wall Street Journal reports the ruling could put the squeeze on the data Meta can use to sell ads.

Here’s a hot take: Collect more data

Article from The Hill on why your privacy should not be so private.

“It has not been a good few months for tech companies. Meta shares have plummeted, Twitter is struggling under Elon Musk’s ownership, layoffs have abounded and the cryptocurrency exchange FTX has imploded. The mood towards tech has soured, and it is affecting U.S. policy in outsized ways.”

So goes an opinion piece for The Hill in its opening. Its author, Orly Lobel, a law professor, argued against the Biden administration’s Blueprint for an AI Bill of Rights. And while her opening isn’t untrue, her thumbs-down on the bill’s provisions on strengthening privacy and preventing algorithmic bias/discrimination felt ... ahistorical.

“But the federal framing of artificial intelligence largely overlooks the advantages of machine decision-making and fails to recognize the tension between privacy and progress through collecting richer, fuller, more diverse data sets.” She continues, “data collection and monitoring can make the root causes and patterns of inequality more visible and help ensure that those most needing resources, protection and support receive it.”

If you’ve read the research on how Black women are canvassed for data that could incriminate them before they can access Medicare, or the way landlords use screening services to disproportionately deny Black and Latinx tenants housing, this is a tough take. Am I missing something? Earnest question.

I’m all for inclusion, but allowing companies or governments to collect whatever data they can get their paws on — whether in the name of progress or less earnest endeavors — is how we got into this mess to begin with, no?

Count it: Apple ditches photo scanning for CSAM

Last year, a loud faction of our privacy community grabbed their proverbial pitchforks and raised some hell when Apple said it would use cloud-scanning technology to detect child sexual abuse material (CSAM). The argument being, of course, that yes, we want to detect the monsters storing that kind of imagery, but scanning every user’s photos on iCloud wasn’t the way to do it.

After initially pausing the plan, Apple announced this week that it has killed the CSAM plan altogether.

The states want the kids to be alright

If you listen to The Privacy Beat Podcast (and if you don’t, you should check it out here!), you know that I took a ‘lil heat after an interview with Santa Clara University Law School’s Eric Goldman. He had nothing nice to say about California’s Age-Appropriate Design Code, and some of you were offended. Goldman’s feelings aside, the code passed California’s legislature unanimously. While we love to talk about the states being “laboratories of democracy,” let’s be real: They’re also total copycats. Wildly uninventive. After California passed the CCPA, we’ve seen a domino effect: Connecticut, Utah, Virginia, and Colorado have all passed similar legislation. And hey, if you’re a privacy advocate, that’s great news! If you’re a CPO, times are hard.

In the not-too-distant future, we’re likely to see several state bills on children’s privacy. New York introduced a bill in September, and this week, New Jersey introduced its own version. So, maybe we won’t get federal privacy signed into law anytime soon, but it’s clear privacy has earned a spot on legislative floors for the foreseeable future.

Anyway: How’s your week? Slow?

See you soon! xo, Angelique

Did you miss the latest Privacy Beat Podcast drops?

CA Deputy AG Stacey Schesser on enforcing America’s flagship privacy law (Part 1)
In this interview (part 1 of 2), host Angelique Carson chats with California Deputy Attorney General Stacey Schesser on how everything changed with the CCPA. Schesser talks about the agency’s recent Sephora enforcement action, Global Privacy Controls, and how she’ll work with the newly-established CPPA. It’s a Privacy Geek’s buffet, if you will.

CA Deputy AG Stacey Schesser on enforcing America’s flagship privacy law (Part 2)
In this episode, part 2 of 2, California Deputy Attorney General Stacey Schesser talks about what she thinks the attorney general could have done differently, the Sephora case, and what’s going on with operationalizing Global Privacy Controls.

How to do TIAs so they're not a P in your A

In this episode, Julian Flamant, an attorney at Hogan Lovells and longtime pal of Angelique’s, talks about Chicago-based mobsters, that looming CPRA deadline, and how to keep transfer impact assessments (TIAs) from becoming a P in your A.

Listen here