AHLA's Speaking of Health Law

Evolving State Privacy Laws and the Impact on Digital Health Innovation

American Health Law Association

Omenka Nwachukwu, Principal Consultant, Privacy and Compliance, Clearwater, speaks with Kaitlyn O'Connor, Co-Founder and Partner, Elevare Law, about the growth in state privacy laws covering wide ranges of health data and how digital health companies are being impacted. They discuss the role state legislation is playing in addressing gaps left by federal health care privacy laws, how state privacy laws are going beyond the Health Insurance Portability and Accountability Act (HIPAA) in certain areas, and trends in state regulatory activity. They also discuss how digital health companies can ensure compliance across multiple jurisdictions while maintaining innovation, adapt to a broader definition of health data under state laws, and navigate operational and technical challenges in implementing state-specific privacy requirements. Sponsored by Clearwater

AHLA's Health Law Daily Podcast Is Here!

AHLA's popular Health Law Daily email newsletter is now a daily podcast, exclusively for AHLA Premium members. Get all your health law news from the major media outlets on this new podcast! To subscribe and add this private podcast feed to your podcast app, go to americanhealthlaw.org/dailypodcast.

Speaker 2:

Support for AHLA comes from Clearwater. As the healthcare industry's largest pure-play provider of cybersecurity and compliance solutions, Clearwater helps organizations across the healthcare ecosystem move to a more secure, compliant, and resilient state so they can achieve their mission. The company provides a deep pool of experts across a broad range of cybersecurity, privacy, and compliance domains; purpose-built software that enables efficient identification and management of cybersecurity and compliance risks; and a tech-enabled 24/7/365 security operations center with managed threat detection and response capabilities. For more information, visit clearwatersecurity.com.

Speaker 3:

Hello everyone. This is Omenka Nwachukwu, principal consultant with Clearwater's Privacy and Compliance Consulting team. Over the past few years, we have seen enormous growth in state privacy laws, which cover wide ranges of health data, mostly outside HIPAA. Laws like the Maryland Online Data Privacy Act create substantial challenges before health information can be used for virtually any purpose, developing an approach to consumer consent that some have commented is the opposite of what HIPAA requires. Others have stated that the New York Health Information Privacy Act, if enacted, could create a chilling effect on patient access to and engagement with the digital health services relied upon by New Yorkers, as digital health companies face hurdles in improving their products and services due to the financial and operational burdens created by the proposed act. It's a challenging time for healthcare legal and compliance professionals, as we are seeing multiple laws impacting the same data, depending on the specific role the data plays in a particular context. Joining me to help decipher the current state of healthcare data privacy regulations is Kaitlyn O'Connor, co-founder and partner with the firm Elevare Law, where she guides virtual-first, data analytics, medical device, and other emerging healthcare models through complex legal landscapes. It is so amazing to be able to speak with you, Kaitlyn. How are you doing?

Speaker 4:

Thank you, I'm doing well. It's great to be here. I'm excited to dig in.

Speaker 3:

Me too. So, let's dive in. First, let's talk about state-level legislation. What role should state-level legislation play in addressing any gaps left by federal healthcare privacy laws?

Speaker 4:

Sure. As I was preparing for this discussion, I was thinking about this a lot, and I think the biggest gap that state privacy laws can fill is the gap between consumer expectations and the actual legal obligations of the companies that are collecting and processing sensitive health information. A lot of the lawyers listening to this probably know that HIPAA really only applies to covered entities and their business associates, but consumer wellness apps, like the Apple Watch and Apple Health, the Oura Ring, and fertility tracker apps, likely are not covered entities or business associates. So they're actually not subject to HIPAA, and consumers may not recognize that distinction. One of the gaps that state privacy legislation can fill, and is actively filling, is being more transparent and clear with consumers about how their data is being used, what their rights are in that data, who is collecting it, and who is accessing it. I think that's probably the biggest one. A secondary gap that state privacy legislation is also filling is the gap between provider or covered entity ownership of a patient's or consumer's health data and the individual's ownership of their own health data. The other area where I see my friends and family not fully understanding the distinction between HIPAA and state privacy laws is that a lot of people don't realize your doctor owns your medical record data. If your doctor is creating a medical record, collecting blood pressure data about you, or collecting other types of health data, your provider owns that. And while you have a right to access it and to get a copy of your medical records, you actually don't own it. Whereas under state privacy law, most of the laws we're seeing today explicitly state that consumers or individuals own their health information and any other sensitive or identifiable information that consumer apps might be collecting about them. So, just to round that out, I think the two major gaps are, first, the gap between consumer expectations and the legal obligations that companies collecting and managing data actually have, and second, who actually owns that data. That one is also a consumer expectation gap: who actually owns the data, and what do users know about who owns it?

Speaker 3:

Amazing, thank you so much. And when we're talking about how state privacy laws are addressing those gaps, do you feel there are any instances of state privacy laws going beyond HIPAA in certain areas? And if so, what does that mean for healthcare data governance and patient rights?

Speaker 4:

Yeah, absolutely. I think there are a bunch of ways in which state privacy laws are going beyond HIPAA in order to fill those gaps we just talked about. The first is broader applicability to companies that are not necessarily considered covered entities or business associates under HIPAA. So again, companies selling fertility tracker mobile apps, or the Apple Watch and Apple Health data: those companies that aren't subject to HIPAA are now subject to these state privacy laws. We're seeing broader applicability to the individuals or companies that are collecting and processing consumers' data. I think we're also seeing a broader definition of the data that is protected. If we think about protected health information, or PHI, under HIPAA, a lot of times that is thought of as medical record data or conversations about healthcare between a provider and a patient that are collected during a visit. We don't necessarily think about inferences drawn from that data. We don't necessarily think about browsing behavior on health-related websites, which is something explicitly mentioned in some state privacy laws, or location data near a clinic. Now, location data is an identifier under HIPAA, so technically location data is potentially PHI. But when you're tracking that data through tracking technologies that follow a user's browsing behavior and where they are when they're looking at a certain website, that's not necessarily something we think about under HIPAA. That is something state privacy laws are explicitly mentioning, which is kind of interesting. They're getting a lot more specific than HIPAA is, and frankly, HIPAA is intentionally broad, and we can talk about that in a minute. But they're getting a lot more specific about the types of data that are subject to these state privacy rules. We did see, a year or two ago, guidance from OCR about tracking technologies, but that's not explicitly mentioned in the law. That guidance has not made its way into the actual statutory or regulatory language that covered entities and business associates are governed by. So I think HIPAA is moving in that direction, but state privacy laws are already there. A couple of other ways: more rights for consumers. Again, like we said, under HIPAA providers own the data that they collect and manage about patients, while under state privacy laws users or consumers own their own data. And as part of that ownership, they often get more rights with respect to that data. For example, lots of state privacy laws, maybe all of them, although don't quote me on that because I did not confirm it before this, are giving a private right of action to consumers. So if you as an individual are sharing your data with a fertility tracker app and that app has a breach, you may be able to actually sue that fertility tracking app. Or if they aren't clear with you about how they're using your data, and they use it in a way they didn't tell you about, you might be able to take action and recover from them. Whereas under HIPAA, if a provider has a breach, you can't necessarily sue your doctor for that breach.
That's in the government's hands to address. So that's a really big one, that private right of action. And I think that goes back to what you were mentioning earlier, Omenka, about some of the challenges and some of the fear around how broad these state privacy laws are: there's a lot more risk for companies that are managing that data. And I think that's intentional. I think that's what the state governments want, right? They want mobile apps to be a little bit scared about the penalties they might be subject to if they have a breach or don't comply with the law. But for consumers, that also gives us a lot more confidence in the apps we're using, where we're putting our data, and what's being done with it, because we know that we will potentially, hopefully, be able to recover if someone we are trusting does something wrong with that data. So that's a big one. Then there are rights to delete data. You can reach out to a mobile app and say, delete all of my records, and they have to do that. There are also stricter consent and use limitations. For example, the Maryland Online Data Privacy Act prohibits geofencing within 1,750 feet of any mental health facility or reproductive or sexual health facility to identify, track, or collect data from, or send notifications to, consumers regarding their health data. You can probably tell I read that from notes I have in front of me, because it's really, really specific. And those strict, really specific rules are a major trend in these state privacy laws. Like we said, HIPAA is intentionally broad, to avoid being overly prescriptive and to avoid having to change HIPAA all of the time. State privacy laws are not doing that. They are very strict, and they are very specific about the types of behaviors they prohibit. So that's an interesting one, right? That's a gap HIPAA doesn't fill that state privacy laws are coming in and filling. And then I'll just mention two more. I know this is a lot; we could probably talk about this one question for a long time. Two other ways that state privacy laws are filling gaps HIPAA doesn't fill are, first, a focus on transparency. Again, you can request a copy of your medical records from your doctor, but what HIPAA doesn't say is what has to be in your privacy policy. There's a notice of privacy practices, but that's not always applicable to business associates. So if your doctor has a business associate that is doing something with data, the notice of privacy practices might say, hey, we can share your data with third parties for operational purposes or treatment purposes. But under state privacy laws, the rules say that controllers have to provide clear privacy notices detailing the categories of personal data they're processing and the purposes for processing. So not just, this is an operational thing, but actually, hey, we're sharing your data with this third party, or we're processing your data to analyze it and provide you recommendations for supplements you should take, things like that. Being more specific about why you're processing that data, and being explicit with consumers about how they can exercise their rights.
So again, a lot of that ends up in a notice of privacy practices, which is required under HIPAA, but it's not as specific, it's not as explicit, and it's not required of the business associates who are using and processing the data they get from healthcare providers. And then lastly, specific nods to new technology like AI. For example, that Maryland law we mentioned before, and that you mentioned in the intro, specifically calls out mental health chatbots. It says, if you have a mental health chatbot, we're looking at you, we're paying attention to you, all of the rules in this law specifically apply to you, and you therefore have to comply with them. That's an interesting one too. If you listeners out there are familiar with the proposed revisions to the HIPAA Security Rule, you might know there are explicit mentions of new technology in that proposal. There's some discussion about AI, and there's some discussion about extended reality like VR and AR, but none of that has made it into the actual text of the rule, into the statute or the regulations, and there's not really a defined timeline for how soon it will, if at all. So while HIPAA is slowly making its way in that direction, state privacy laws are already there. They're already saying, we know that you AI companies exist, we know that the fertility trackers exist and have this data, we're paying attention, and you are the ones we are talking to in the rules we're drafting. So that was a lot, and I'm sure there are more, but suffice it to say there are a lot of gaps in HIPAA that state privacy laws are filling. And again, that goes back to the operational difficulty, where digital health companies are seeing this level of specificity and saying, we've never had to deal with this before; how do we manage it?
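To make the geofencing restriction concrete, here is a minimal sketch, in Python, of the kind of pre-launch check a digital health company's compliance team might run against a proposed geofence. It is an illustration only, not the statute's test: the facility names, coordinates, and function names are hypothetical, and the 1,750-foot threshold is the Maryland figure discussed above.

```python
# Hypothetical sketch: flag proposed geofences that come within 1,750 feet of a
# sensitive health facility, echoing the Maryland geofencing prohibition above.
# Facility names and coordinates are illustrative, not real data.
from math import asin, cos, radians, sin, sqrt

FEET_PER_METER = 3.28084
PROHIBITED_RADIUS_FEET = 1_750

# Illustrative list of sensitive facilities (mental health, reproductive/sexual health).
SENSITIVE_FACILITIES = [
    {"name": "Example Mental Health Clinic", "lat": 39.2904, "lon": -76.6122},
    {"name": "Example Reproductive Health Center", "lat": 39.0458, "lon": -76.6413},
]


def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in feet (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    meters = 2 * 6_371_000 * asin(sqrt(a))
    return meters * FEET_PER_METER


def geofence_conflicts(center_lat, center_lon, radius_feet):
    """Return facilities whose location lies within the prohibited distance of the
    edge of the proposed geofence (center distance minus the geofence radius)."""
    conflicts = []
    for facility in SENSITIVE_FACILITIES:
        d = distance_feet(center_lat, center_lon, facility["lat"], facility["lon"])
        if d - radius_feet <= PROHIBITED_RADIUS_FEET:
            conflicts.append((facility["name"], round(d)))
    return conflicts


if __name__ == "__main__":
    # A marketing team proposes a geofence; compliance checks it before launch.
    print(geofence_conflicts(39.2910, -76.6130, radius_feet=500))
```

In practice the facility list would come from a maintained dataset and the legal test from counsel; the point of the sketch is that a check like this can be automated and run before any geofenced campaign goes live.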

Speaker 3:

Exactly. Wow, that was amazing, Kaitlyn, thank you so much. It really sounds like state privacy laws are going beyond HIPAA in a more prescriptive way and in a faster, potentially more efficient way, but we'll see how that pans out as time goes on. So as we're comparing state privacy laws with HIPAA, I want to ask you one question. How are state regulators enforcing healthcare data privacy laws differently, and what trends are we seeing in penalties or audits? You kind of started down this path when you talked about individuals' private right of action, which they don't have under HIPAA.

Speaker 4:

Yeah, I think the primary way states are approaching enforcement is that they're being far more proactive about auditing and enforcement. Under HIPAA, most of the time an audit or investigation by OCR, which enforces HIPAA, is going to be a response to a breach that a provider or a business associate proactively reported to OCR. They're getting reports all the time, and they're saying, this is a big breach, or this is a pattern, let's go look at it more closely. State governments are not waiting for those reports. They're actively going out to the companies they know are subject to the rules they've set forth, and they're saying, let me see your documentation, let me see how you're using this data, and let me make sure that your privacy policy, for example, is explicit and transparent enough about that. Just to give you an example: the CCPA kind of set the stage for state privacy laws, California was the first state to do it and was really specific about it, and a bunch of other states have followed suit. In 2023, the California AG targeted specific mobile apps that were collecting reproductive health data, and after the investigation they issued fines for failure to honor opt-outs, improper disclosures, and misleading privacy notices. So they went out and said, your privacy notice is not as explicit as it needs to be, it's not as transparent as it needs to be, it looks like you're doing something else with this data that you haven't told your users about, and we're going to fine you for that. Not to say that doesn't happen under HIPAA, right? But again, I think the state AGs are just more proactive about looking around at companies' websites, or downloading mobile apps and reading the privacy notice, and saying, hey, this is something we should look at more closely. So I think that's the biggest way they're approaching enforcement a little differently than OCR or other federal agencies that might be looking at this.

Speaker 3:

That definitely makes sense. It's like they're saying, we're gonna come find you instead of waiting for you to find us <laugh>.

Speaker 4:

Yeah. It's more similar to how the FTC operates, right? The FTC does this; OCR doesn't necessarily. So I think the state AGs are sort of blending the OCR role and the FTC role to say, these are the things you have to do, and also this is how we're going to enforce it and what you have to say about it.

Speaker 3:

Nice, thank you so much. Now I'm really excited to go into this next part of our conversation. Let's see how digital health companies, which I know are your area of expertise, fit into this equation. So our first question in this section: with a growing patchwork of state privacy laws, how can digital health companies ensure that they are complying across multiple jurisdictions while maintaining innovation? How is that going to work?

Speaker 4:

Yeah, so this is a big one. This is something I talk to my clients about all the time, and I will say it tends to be an ongoing conversation. I'll respond as concisely as I can here, but I'm getting questions about this constantly; we're getting on calls sometimes weekly as these new laws come out. One of the key themes in how I advise my clients is, first and foremost, keep your terms of use and privacy policy as broad as possible. Now, we just talked a lot about specificity and transparency and specific things that have to be disclosed in your privacy policy, and I'm not saying don't do that. What I am saying is get creative and be as broad as possible so that as your technology changes, as the things you're doing change, you don't have to go in and revise your documentation every single time. A lot of times in your terms of use or privacy policy you're also committing to notifying your users anytime you change those documents, and in fact you're sometimes legally required to notify users anytime you make a change. So you want to avoid making changes too frequently, because that can confuse your users, that can lead to missed details, and that can put you in a box where you're spending far more time updating those policies than you want to or need to. Sometimes that can put you at more risk, and sometimes it gives your competitors more insight into how you're managing compliance with all of these very specific laws. If you're too specific, you might have a competitor say, oh, we didn't know you could do it that way, but it looks like this company does it, so let's do that. You don't always have to be that specific. So I would say keep your terms of use, privacy policy, and notice of privacy practices, whatever it may be, as broad as possible while still maintaining compliance. Then I would also say, build efficient workflows. This is broad and seems kind of straightforward, but the biggest challenge I see with my digital health companies is complying with consumer requests in an efficient way, because under HIPAA, individual patients don't have as many individual rights as consumers do under state privacy laws. Digital health companies that have been focused on HIPAA for a long time aren't used to having to delete records in response to individual requests, and I should say consumer rather than patient, since we're not talking about HIPAA here. They're not used to managing individual requests for data deletion and having to comply with them. If you're a business associate, which many digital health companies are, you're often agreeing in the BAA to notify the covered entity if you get that kind of request, and then the covered entity directs you on how to manage it or responds to it themselves. Whereas here, under state privacy laws, digital health companies might be getting those requests directly from individuals and having to respond to the individual directly. So that's one area where you really want to think about what your workflow looks like. What happens when a user makes this request? Who receives it? What do they do? How quickly do they do it?
Is your data segmented and organized well enough that you can quickly find that individual's information and delete it? Or are you going to have to search through all of your different databases and figure out what data belongs to this person? Things like tagging the data appropriately can be helpful for that. So building those efficient workflows is going to be really helpful for maintaining compliance. It can be a new way to approach this, but I think it's helpful and important, and in some cases you can use AI, you can use the technology you've built, to make it easy; build it into your model and make that process easier for you on the back end. And then lastly, even though we're talking a lot about the gaps in HIPAA that state privacy rules are filling, it's important to reiterate that business associates and covered entities are not always exempted from compliance with state law. State laws might say, if you're subject to HIPAA, comply with HIPAA, and then here are a couple of other things you have to do; we acknowledge you're already subject to HIPAA, and so we'll defer to you on that. But that doesn't mean you can ignore the state privacy rules, because again, these state privacy rules are so much more specific and strict that there are likely to be additional things you have to do, and what you're doing under HIPAA may not be sufficient. So the key there is to understand where the overlap is, understand where the differences are and what applies to you, and then implement a comprehensive strategy that incorporates both HIPAA and applicable state law. You might also want to think about the FTC's rules on privacy policies. Take the federal landscape and the state landscape and build a comprehensive strategy for maintaining compliance on an ongoing basis, working with a lawyer who is tracking updates and can keep you informed, or even subscribing to some of the news outlets that track these things. So again: keep your documents as broad as possible, build efficient workflows, and build a comprehensive strategy, remembering that there may be overlaps, but there may also be distinctions and additional things you have to do even if you are subject to HIPAA.
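For the tagging and segmentation point above, here is a minimal sketch, assuming a simple in-memory store, of what user- and purpose-tagged records might look like so that an access or deletion request becomes a single lookup rather than a search across databases. The class and field names are hypothetical, not any particular vendor's schema.

```python
# Minimal sketch: tag every record with the user it belongs to and the purposes
# it was collected for, so state-law access and deletion requests are fast to service.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Record:
    user_id: str
    category: str          # e.g. "fertility_log", "browsing_event"
    purposes: set[str]     # e.g. {"service_delivery", "analytics"}
    payload: dict
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class TaggedStore:
    def __init__(self):
        self._by_user: dict[str, list[Record]] = {}

    def add(self, record: Record) -> None:
        self._by_user.setdefault(record.user_id, []).append(record)

    def export_for_user(self, user_id: str) -> list[Record]:
        """Access request: everything held about one user, found in one lookup."""
        return list(self._by_user.get(user_id, []))

    def delete_for_user(self, user_id: str) -> int:
        """Deletion request: remove the user's records and return how many were
        deleted, so the response to the consumer can be documented."""
        removed = self._by_user.pop(user_id, [])
        return len(removed)
```

A production system would sit on real databases rather than a dictionary, but the design choice is the same one described above: index by user and purpose up front so the request-handling workflow stays efficient.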

Speaker 3:

Wow, that's excellent advice, thank you so much. So Kaitlyn, what would happen if a digital health company was a business associate, and thus subject to HIPAA with duties to a covered entity, and was also subject to a state privacy law with data deletion rights? How would they respond if a user came to them and said, hey, I want you to delete my data, when that data might be covered by a business associate agreement and the covered entity may need to hold onto it?

Speaker 4:

Yeah, that's a really good question, and my clients ask me it all the time. What usually ends up happening is the digital health company will notify the covered entity that they got this request, and they will send a copy of the data to the covered entity, because the covered entity under state law is going to be required to maintain records for anywhere from three to seven years. That is actually an important rub, and to be clear, it's between state consumer privacy law and state medical record law; it's not a HIPAA requirement. The state medical record law is likely going to say the provider has to maintain a copy of a patient's medical records for an extended period of time, whereas a different state law might say, digital health company, if an individual requests that you delete their records, you have to do that. So what the digital health company can do, assuming they're a business associate, is take the request, notify the covered entity that they got it, and say, hey, under state law we have to do this, but we want to make sure you have a copy so you can maintain it within your statutory requirements. Now, in some cases, the digital health company could say, we actually can't delete your data, because we have a relationship with your healthcare provider, who is legally obligated to maintain this record, and our job under our contract with the provider is to maintain these records so they can comply with their state obligation. And usually the state privacy law will allow for that: it will say, if this is actually medical record data under HIPAA or under the state medical record law, then you can maintain it for that purpose. Basically, an individual can request that you delete their data unless a legal obligation applies, and in that case there would be a legal obligation for the provider to maintain that copy. So it ends up being worked out between the business associate and the covered entity who's actually responsible for complying with that law: either the business associate has agreed to maintain the records for the provider, or the business associate has agreed to return the data to the covered entity so they have it, but delete the data from their own servers so they don't keep a copy themselves. Does that make sense?
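The back-and-forth described above can be expressed as a small decision routine. This is a hedged sketch only, building on the hypothetical TaggedStore from the earlier example; the covered-entity client calls and the retention check are stand-ins for whatever the BAA and the applicable state medical record law actually require, not a statement of what any law mandates.

```python
# Minimal sketch of a business associate's handling of a consumer deletion request,
# with a legal-obligation exception for records the covered entity must retain.
# `covered_entity_client` and `retention_required` are hypothetical interfaces.

def handle_deletion_request(user_id, store, covered_entity_client, retention_required):
    """Respond to a consumer deletion request while respecting the covered
    entity's record-retention obligations.

    retention_required: callable(user_id) -> bool, answering "is this medical
    record data the provider must keep under state medical record law or the BAA?"
    """
    records = store.export_for_user(user_id)
    if not records:
        return "no_data_held"

    # Per the BAA, tell the covered entity a deletion request was received.
    covered_entity_client.notify_deletion_request(user_id)

    if retention_required(user_id):
        # Legal-obligation exception: hand a copy back to the covered entity so the
        # provider can meet its retention duty, then delete the local copy
        # (or retain on the provider's behalf, depending on what the BAA says).
        covered_entity_client.transfer_copy(user_id, records)
        store.delete_for_user(user_id)
        return "deleted_locally_copy_transferred"

    store.delete_for_user(user_id)
    return "deleted"
```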

Speaker 3:

Yes, that's perfect, thank you so much for answering that question. Now, thinking about the definition of health data: as we talked about earlier, so many state laws are going beyond HIPAA. So how are digital health companies adapting to that broader definition of health data under these state laws, which you so excellently explained earlier on?

Speaker 4:

Yeah, I think it goes back to two things. One, what I advise my clients is, let's look at the strictest rules that are going to apply to you and build your practices around those. As we've said, state privacy laws are in a lot of ways stricter than HIPAA. That might mean, if you know you are subject to the CCPA and HIPAA, we look at the CCPA, build your operations and practices around the CCPA, and make sure they're also compliant with HIPAA. But you can usually rest assured that if you're complying with the CCPA, you're complying with HIPAA, because it's already stricter. So I usually say, number one, figure out the strictest rules that apply to you and build around those, because then you're not building different practices in every state, or different practices for patients who might be subject to HIPAA and for those who might not be. You've got a comprehensive strategy that already takes the strictest rules into account, and you're operating around that. That does two things. It makes compliance more efficient, because you don't have to maintain different practices, and it mitigates your risk in the places where the rules aren't as strict. If you're complying with the CCPA, which is stricter than HIPAA, like I said, it's unlikely you're going to have a HIPAA violation, because you're already complying with the stricter rules. So it mitigates your risk while making your processes and compliance practices a little more efficient. The second one is maybe a subset of that, which is data segmentation and organization. Keep your data organized and segmented so that when someone asks you to delete their data, or asks you to tell them who you shared their data with, you can do that quickly. You're not confused and you don't miss things, because what you don't want is to respond to an individual request, have that individual go, whoa, this isn't what I expected, have them report you to the AG's office, have the AG open an investigation, and then have the state AG's office find things you missed, whether you didn't give the individual all the information you were supposed to, or your privacy policy doesn't disclose something you were actually doing with the user's data. So build compliance into your data practices: keep things organized, segment appropriately, make sure you know where everything is, and back up your data so that if you have a breach, you still have a copy, or maybe a couple of copies. Stay organized around what you're doing with data and how you're using it, and make sure, again, that you're building those efficient workflows around it.

Speaker 3:

Thank you, that's really helpful. So this is a lot to take into consideration, and it's definitely a great idea to find someone, if possible, who can walk you through it. With all that we've discussed, what are some operational or technical challenges that digital health companies might face when they're trying to implement these state-specific privacy requirements? We've talked about the data deletion rights that state privacy laws create. Another challenge could be opt-out mechanisms. What have you seen? What are some challenges, and how can we deal with them?

Speaker 4:

Yeah, I hate to just repeat what I just said, but I think the biggest challenge I see is building those workflows. Opt-out is a great one that we haven't really talked about yet. A lot of state laws require much more proactive opt-out mechanisms that allow consumers to opt out of specific uses of their data. Under HIPAA, covered entities can get away with a business associate agreement and don't have to get specific authorization from patients to share data with a business associate for treatment or operational purposes; the same is not true under state law. Under state law, a lot of times you have to give specific opt-out rights to consumers, to give them the ability to say, no, you can't do any of these things with my data. The only thing you can do with my data is analyze it so the mobile app gives me my output; the only thing you can do with my fertility data is let me see my status in my tracker. You can't de-identify it for your own purposes, you can't take it and train an AI algorithm with it, and you can't use it to track my browsing behavior and serve me targeted ads. Now, HIPAA does say patients have to provide authorization for their data to be used for marketing purposes, but it doesn't go as far as all of those other things. The OCR guidance on tracking technologies does require authorization to use tracking technologies for marketing purposes, but the state laws are just broader. So the biggest challenge is, first, actually understanding what those differences are and how the state laws are broader, and then turning that into workflows that are still efficient. What does our opt-out language have to say? Can it just be a checkbox? Does it have to be separate from, or independent of, the terms of use we give them? In a lot of cases it does, more similar to TCPA consent, for those listening who are familiar with the TCPA. State privacy laws are more similar to that, where you have to present an independent checkbox or an independent consent that is not muddied by other terms of use or other things relevant to how the user is using the app. It has to be independent, it has to specifically mention everything you're doing, and you have to give the user the opportunity to opt out of specific individual uses. And you have to be able to honor that on the back end, which can be operationally challenging, because again, if your data isn't segmented and organized, it's going to be hard to say, okay, this particular user says we can't do this with their data; how do we make sure their data doesn't get into this AI model we're training? Or how do we make sure these tracking technologies are actually turned off for this user? So those are the challenges, and again, it goes back to addressing them by developing a comprehensive strategy, being broad but still complying with the applicable rules, and staying on top of it, making sure you know what the changes are and how you're going to address them when they come up.
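One way to make per-use opt-outs enforceable on the back end is to record each choice as an independent flag and gate every secondary pipeline on that flag. The sketch below is a minimal illustration under stated assumptions: the purpose names are invented for the example, and it presumes Record-like objects with a user_id attribute (as in the earlier sketch); nothing here is drawn from any particular statute's text.

```python
# Minimal sketch: independent, per-purpose consent flags, defaulting to "no",
# checked before data is used for any secondary purpose.

SECONDARY_PURPOSES = ("deidentified_research", "ai_training", "targeted_ads")


class ConsentRegistry:
    def __init__(self):
        # user_id -> {purpose: True/False}; primary service delivery is not gated here.
        self._choices: dict[str, dict[str, bool]] = {}

    def record_choice(self, user_id: str, purpose: str, opted_in: bool) -> None:
        if purpose not in SECONDARY_PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self._choices.setdefault(user_id, {})[purpose] = opted_in

    def allowed(self, user_id: str, purpose: str) -> bool:
        # Default to False: no secondary use without an explicit opt-in on record.
        return self._choices.get(user_id, {}).get(purpose, False)


def records_for_ai_training(records, registry):
    """Keep only records whose owners have opted in to AI training."""
    return [r for r in records if registry.allowed(r.user_id, "ai_training")]


# Usage: capture the independent checkbox choice, then gate each pipeline on it.
registry = ConsentRegistry()
registry.record_choice("user-123", "ai_training", opted_in=False)
```

The design choice mirrors the point above: because each purpose is a separate flag rather than one blanket consent, turning off AI training or targeted ads for a single user is a data lookup, not a manual scramble across systems.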

Speaker 3:

Wow, those are great considerations, thank you so much. All right, I know we've been able to glean so much knowledge from you, and I just have one more question before we go. Looking ahead, what types of trends or potential legislation should all of us healthcare professionals and tech companies be watching out for?

Speaker 4:

Yeah, I love this question. This is what I spend most of my days doing: what's coming next, and what do my clients need to know to anticipate what's going to be happening next month or in six months? The biggest one, and we've talked about this a little already, is AI-specific rules. Almost every state privacy law coming out now explicitly mentions AI in some way, and almost every digital health company I talk to is trying to figure out how to leverage AI in some way. So pay attention to those specific rules, like the Maryland statute that talks about mental health chatbots. Pay attention to what you have to say in your privacy policy about AI, the type of AI you're using, what it does, and how it works, and what you have to do to the model you're building to make sure it is ethical, accurate, and not discriminating against certain users. All of these things are being explicitly addressed in state law in a way they're not in federal law. Some of that isn't explicitly related to privacy, but I think it's still relevant in the sense that there's just a lot more for you to keep track of and comply with if you're in the AI space, which, again, most digital health companies are in some way, whether you're leveraging an existing ChatGPT model, building your own generative model, or just building an algorithm-based platform or function. There is likely going to be state law that applies specifically to you. So keep an eye out for those AI-specific rules and track them as they change. There are some policy working groups working on federal policy around this, but it hasn't really made it there yet, so the state laws are really where we're seeing more specific mention of AI. And then the second one, which we've already mentioned and I'll just briefly mention again, is the proposed changes to the HIPAA Security Rule. A lot of what we talked about are gaps between existing HIPAA rules and state privacy laws, and there is a proposed rule right now to add a bunch of that to HIPAA, to make the Security Rule more specific and to explicitly address evolving technology. Paying attention to what those changes may look like is going to be important for the companies that we as lawyers are advising, and if you're a digital health company, it's going to be important to understand what those changes may be. The last thing I'll say is that we don't know the timeline on that. It could be six months, it could be tomorrow, it could be next year; we don't know. So there's no guarantee you'll have to make a change tomorrow, but it's important to know it's out there and that, if it does get finalized, you need to know what you have to do to comply with it.

Speaker 3:

That's right. Well, thank you so much. We'll definitely be looking ahead to those changes in the future. And that wraps up our conversation today about state privacy laws and their impact on digital health innovation. Kaitlyn, thank you so much for the excellent insights you shared; I have really enjoyed this conversation. And thanks to our audience as well for listening. We hope you found this episode helpful in advancing your thinking about how to respond to the evolving regulatory landscape. Have a great rest of your day.

Speaker 2:

Thank you for listening. If you enjoyed this episode, be sure to subscribe to AHLA's Speaking of Health Law wherever you get your podcasts. To learn more about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org.