
June 21, 2023

Issue 21: Texas makes 10

Oh Hey! Welcome to The Privacy Beat Newsletter!

Here’s the gist: Come here for insights on the hottest topics in privacy according to our peers’ tweets so you can walk into any happy hour or team meeting and sound like the absolute baller you are. No current topic gets by you!

Texas makes 10, and we've got less than a year on this one!

Texas Governor Greg Abbott signed the Texas Data Privacy and Security Act this week after the state’s legislature passed the bill over Memorial Day Weekend. Texas is the 10th U.S. state to pass a comprehensive privacy law, and it’s coming in hot: It becomes effective March 1, 2024.

As Keir Lamont reports on his blog, some are calling it the strongest privacy law in the land. It requires small businesses to get consent before selling sensitive personal data, it mandates standalone disclosures for certain data sales, and it counts pseudonymous data as personal data.

The crazy part of the bill, in my mind, is that while it allows users to opt out of some data uses, businesses are off the hook if they don’t “possess the ability to process the request” or don’t already “process similar or identical requests” to comply with other state laws.

As a consumer, this seems weird and terrible. But it got me thinking: What a great excuse! I thought I’d try it out. I told my boss that based on this logic, I can’t do the task he’s asking me to do because it’s just not something I do on the regular. He told me, flatly: “Angelique, we’re not in Texas.”

I tried.

But hey, while we’re at it, because everything is crazy and who can keep track, here’s a straight-up list of state laws’ effective dates for you, in chronological order. Doesn’t it help to see them grouped? Or maybe it induces panic. I dunno.

California: In effect, obvs.
Virginia: In effect (Jan. 1, 2023)
Colorado: Effective July 1, 2023
Connecticut: Effective July 1, 2023
Utah: Effective Dec. 31, 2023
Texas: Effective March 1, 2024
Florida: Effective July 1, 2024
Tennessee: Effective July 1, 2024
Montana: Effective Oct. 1, 2024
Iowa: Effective Jan. 1, 2025
Indiana: Effective Jan. 1, 2026

Ruh roh: After 3 warnings, FTC nails DNA testing site for privacy and security failures

Here’s the gist of the case: The FTC has proposed a settlement with 1Health.io, formerly known as Vitagene, alleging it mishandled sensitive genetic data and made retroactive changes to its privacy policy. While the DNA testing company claimed it didn’t store DNA results with identifying information, it was storing that data in publicly accessible buckets on AWS cloud servers. It didn’t encrypt the data, restrict access to it, or log or monitor access to it.

It also modified its privacy policy to “significantly” expand the list of companies with which it would share health and genetic data, and it did so without alerting users.

Finally, the FTC alleges that while 1Health.io promised that consumers could delete their personal information at any time and that it would destroy saliva samples shortly after analyzing them, the company did not implement policies to make either of those promises operational.

To be fair, the FTC warned the company three times since 2019 that it had problems. To make matters slightly worse, 1Health.io’s public-facing policy says, “Your DNA and health details are personal and private. We make your privacy our priority. We will not sell or publish your information with any third-party firm or partner. That is our commitment to you.”

As a result, 1Health.io will pay $75,000, is prohibited from sharing health data with third parties without express consent, and must implement a comprehensive information security program. In addition, it must require partner labs to destroy DNA samples.

FTC Chair Lina Khan said the case illustrates the FTC’s recent warning that it would come after companies employing unfair or deceptive practices in the collection or use of consumers’ biometric information.

But the settlement also maps to the FTC’s recent enforcement trend of going after companies mishandling sensitive data. You’ll recall the recent case against fertility app Premom, which followed the GoodRx and BetterHelp settlements. All those companies got nailed for, among other missteps, sharing sensitive information with third parties like Facebook for advertising. And while the FTC’s punishment included fines, the more damaging and longer-lasting settlement provisions forbid those companies from sharing data with third parties for advertising purposes in the future.

Point being: If you’re working with sensitive data, now’s the time to ensure you’ve got your proverbial house in order within your privacy policy and your vendor agreements.

As U.S. ponders, EU acts on AI Act

While a group of 23 U.S. attorneys general submitted a “comment letter” to the NTIA calling for transparency and accountability in AI policies here in the states, the EU continues to be like 10 steps ahead of us and has advanced its AI legislation.

Last week, European Parliament voted in favor of “the world’s first set of comprehensive rules for artificial intelligence,” as TIME reports. The EU started considering the AI Act in 2021, but when everything AI exploded over the recent(ish) ChatGPT hysteria, talks accelerated. The rules would classify systems according to risk, and riskier systems would be required to meet more stringent requirements.

It would forbid certain applications, including those that exploit vulnerable people or operationalize predictive policing tools.

The potential fines are higher than under the GDPR, topping out at 30 million euros or 6% of a company’s annual revenue.

The EU is aiming to lead the world on AI, despite the fact that it’s behind on AI development compared with China and the U.S., the TIME report adds.

“The fact this is regulation that can be enforced and companies will be held liable is significant” because other places like the United States, Singapore and Britain have merely offered “guidance and recommendations,” said Kris Shrishak, a technologist and senior fellow at the Irish Council for Civil Liberties.

For the U.S.’s part, the attorneys general told the NTIA – which asked for public comment on AI regulation in April – that we need policies that “put consumer protection front and center” and prioritize things like transparency, audits, and accountability. The NTIA advises the president on telecom and information policy matters, and hey, he’s paying attention.

Check out President Biden’s tweet! Always exciting to see the leader of the free world speaking our language.


Latest poddy

So we just, like, can’t transfer data?
Welp, the Irish DPC fined Meta $1.3 billion, the highest GDPR fine to date, and it ordered Meta to stop transferring data from the EU to the U.S. The implication, obviously, is that every other company using standard contractual clauses (SCCs) to transfer data is also in breach of the GDPR. But the problem is at the political level! We can’t solve this organizationally. So, what’s a company to even do? In this episode of The Privacy Beat Podcast, Eduardo Ustaran talks you off the ledge.

Check it out!

Just a chat you might enjoy watching

I recently had a nice chat with Brittany Rhyne, privacy analyst at Lyft, about how she's scaling her small team to keep pace with the business, a problem many of us face in privacy. She's leveraged tooling to integrate privacy with product & engineering at the design phase, and she's been tracking her team's progress on the ways privacy is actually enabling the business, rather than being a blocker. Check it out for some insights on how to unlock your own potential in similar ways.
Watch this on-demand fireside chat!


Resources for you:

We know the privacy function isn’t a bottleneck or a “house of no.” In fact, strategic privacy programs have a positive impact on their organizations. But that requires embedding privacy into product and engineering teams to get the visibility you need, having a single source of truth on data flows, and finding ways to gather metrics to prove your value. Here’s a practical guide on how to do that.

Whitepaper on scaling your privacy team to keep pace with the business

There’s so much work to be done in privacy, it’s impossible to tackle on your own. Especially if you’re a small team, it’s essential to leverage the tools and tech available to help you scale. In this fireside chat, OfferUp’s Shannon Doniere talks about how she’s building her privacy program, what she automated in order to work more efficiently, and why she decided privacy by design was the way to get it all done.

Thanks for reading, loves! Please share it if you liked it! ♥️ And subscribe!
