AHLA's Speaking of Health Law
The American Health Law Association (AHLA) is the largest nonprofit, nonpartisan educational organization devoted to legal issues in the health care field, with nearly 14,000 members. As part of its educational mission, AHLA's Speaking of Health Law podcasts offer thoughtful analysis and insightful commentary on the legal and policy issues affecting the health care system. AHLA is committed to ensuring equitable access to our educational content. We are continually improving the user experience for everyone and applying the relevant accessibility standards. If you experience accessibility issues, please contact accessibility@americanhealthlaw.org.
Understanding Privacy and Security Regulations in the Exploding Wellness Apps Market
Andrew Mahler, Vice President, Consulting Services, Privacy & Compliance, Clearwater, speaks with Sara Shanti, Partner, Sheppard Mullin, about the legal framework surrounding mental health apps and what is currently happening in the industry. They discuss issues related to data protection, transparency, and sharing, along with enforcement and litigation trends. Sponsored by Clearwater.
Support for AHLA comes from Clearwater. As the healthcare industry's largest pure-play provider of cybersecurity and compliance solutions, Clearwater helps organizations across the healthcare ecosystem move to a more secure, compliant, and resilient state so they can achieve their mission. The company provides a deep pool of experts across a broad range of cybersecurity, privacy, and compliance domains; purpose-built software that enables efficient identification and management of cybersecurity and compliance risks; and a tech-enabled, 24/7/365 security operations center with managed threat detection and response capabilities. For more information, visit clearwatersecurity.com.
Speaker 2: Hi everybody, thanks so much for joining us today. My name is Andrew Mahler. I'm the Vice President of Privacy and Compliance at Clearwater Security, and I have with me today Sara Shanti. Sara, welcome.
Speaker 3: Thank you, Andrew. Pleasure to be here.
Speaker 2: Great, thanks so much for being here. Why don't you give us a quick sense of your background and where you're calling from today?
Speaker 3: I'd be happy to. I'm calling in today from Chicago, where I'm a partner in the healthcare group at Sheppard Mullin, although we have a global practice. I focus on data privacy, the intersection of healthcare and technology, AI, and all the things that play into valuable data assets, especially in the healthcare and wellness industries.
Speaker 2: That's great, basically all of the interesting and hot topics. Well, we're here today to talk about the topic "Understanding Privacy and Security Regulations in the Exploding Wellness Apps Market," but really you and I are here to talk through and share some information, experiences, and best practices related to wellness applications. As you and I have discussed before, and in fact presented on at the HIPAA Summit in January of this year, so much has changed even since January. We have some new enforcement actions, a lot more scrutiny, and a lot more movement within regulatory bodies, both here in the U.S. and abroad. I'm excited to hear from you. Can you share a bit about the background of wellness applications and some of the issues we're seeing percolate up, just at a general level?
Speaker 3: Absolutely. We're at a really interesting moment in healthcare and wellness. We've been seeing that since the COVID public health emergency, when there was an immediate need for digital healthcare to become operational, and since then it has moved at warp speed, which is really exciting for our clients and for the industry, although my partners and I probably wake up every morning thinking, where are we today? Things are moving so quickly. While there's a lot of information and education out there about software, applications, and remote patient monitoring operated by or under the control of your hospital, your clinic, your providers, or your health insurance plan, there's less information about wellness apps run from a commercial, direct-to-consumer perspective, not necessarily through your provider or your health plan: how those apps use your data and how they're regulated. There's a lot of value add there, but there are also a lot of growing questions, and a lot of extra scrutiny from both regulators and consumers on how this valuable, and in many cases really sensitive, data is being used and protected. There's interesting research showing that despite the growing regulatory framework, which I know we're going to talk about, there's a real concern that data privacy is getting worse for consumers and patients, and that the proliferation of these apps and the ability to share and use data, especially digitally, has really impacted the privacy of consumers and patients at the end of the day.
Speaker 2: Yeah, and there have been a lot of podcasts and thought pieces about tracking technologies, AI, and things in that spectrum. As you mentioned, coming out of COVID there has been such an interest in health and wellbeing, and then of course the Dobbs decision created a lot of anxiety around how applications may be tracking information about reproductive care. More generally, with the devices we have with us and on us at all times, there are a lot of opportunities to collect interesting data from people, and a lot of opportunities for consumers to get interesting information about themselves. At least for me, I think very few people are going in and reading all of the privacy notices, the posted privacy practices, and the statements about how these applications or organizations comply with privacy protections. There's just a lot of risk out there. So going back to your point about frameworks and regulatory bodies, I'm curious what you're seeing from a governance perspective, from a regulator's eye, in terms of trying to manage all of this data and information and protect the privacy of our communities and of consumers more broadly.
Speaker 3: Right, absolutely. I think we first want to level set. Some of these apps come across as a medical device or something really rooted in the healthcare industry, so people assume the data must be protected the way their medical records are. We want to draw the distinction between an application or software that is governed by HIPAA, the federal privacy law that protects health information, versus information held by a non-HIPAA entity, which might not be protected because it doesn't rise to the level of being a provider service or a medical device under the FDA. First, apps that work directly with your provider or are offered by your health plan are generally protected under HIPAA, because HIPAA governs providers like hospitals and clinics, health plans like your health insurer, and the vendors of those entities. When you have data flowing because of your relationship with your plan or your provider, HIPAA is going to protect that data and ensure it is only used for your treatment, coordinating your care, payment for your care, or certain legal purposes. On the other side are the applications, the really cool meditation app, fertility app, mental health app, or PT app, all of which can be extraordinarily helpful and can increase access to care and improve overall outcomes, but which might be offered direct to consumer, not through a HIPAA entity, and therefore wouldn't be governed by HIPAA. Then we get into the massive patchwork of privacy laws in the U.S. We'll talk about just the U.S. for now, although we know a lot of these apps are global; we live in a global world and economy, so this potentially impacts other countries as well. In the U.S. we look to state law, and a lot of states have taken a really aggressive step toward protecting their residents' data, especially health information and more sensitive data like mental health data, reproductive health data, substance abuse data, and genetics. So we look at which state law applies to the information those apps are collecting. The issue, of course, is that not every data set is governed by the same laws: if you're a resident of California there might be more restriction, or if your data is held in Washington State there might be more restriction, versus here I am in Illinois with Illinois law applying to some of the data collected here. These state laws are getting really interesting. For example, Washington State just rolled out what is almost a mirror image of HIPAA, but for non-HIPAA data: any health information, under a really broad definition of health information, that's collected or stored in Washington State can only be used for the purpose for which it was collected. So if you are giving your app information to help you meditate, manage your anxiety, or watch your diabetes, and it's not governed by HIPAA,
then the My Health My Data Act, where it applies, will ensure that data can only be used for that purpose and not shared further, used to develop future products, or otherwise commercialized. I'll also note another really interesting law, the Colorado Privacy Act, which just passed earlier this quarter. It's exciting because it addresses a lot of biometric and biological data as well as neural data, which is essentially your brainwaves and the data that can be collected from your brain, your central nervous system, or its periphery. So even information that tracks your sleep cycles, or the brain-to-screen technology where you can write just by thinking, will also be protected. We're seeing the states get really aggressive, and then the federal government comes back in with things like Section 5 of the FTC Act, which prohibits unfair and deceptive trade practices: you can't tell somebody on your app, "we're going to protect your data," and then not do it; that would violate the FTC Act. Then we have the FDA again: even if HIPAA doesn't apply, if the app is a medical device, the FDA is going to ensure it's secure and protected as well. I'll stop there, Andrew, because I know that's a lot to consider, and all of those layers are what's keeping us up at night.
Speaker 2: Yeah, and you mentioned things like meditation apps. What's also worth putting an extra fine point on is the sheer volume of applications out there. You and I were sharing information about the Mozilla study from a couple of years ago, and one thing that was interesting to me: I understand that something like a fitness application, or something tracking my heart rate or my diet, clearly involves personal data that I'm essentially volunteering into the application. But I was interested to see that a lot of these meditation and mindfulness apps had some pretty bad scores from Mozilla in terms of privacy practices. And as you're pointing out, there's a variety of regulators working through how to protect this data at the same time as this proliferation of applications, as people think about new and innovative ways to connect with our bodies and our minds. So it's very complex, and I'm guessing you have a lot of confused clients. I'm interested to hear a bit about how you help them, after setting out all of those different data protection rules and laws that are popping up. How are you helping your clients navigate these complicated issues around applications, enforcement, and even litigation risk?
Speaker 3: Absolutely. I'm fortunate because I get to support everything from the brand-new startup that incorporated yesterday with a great novel idea and wants to get it to market, all the way through some of the large tech companies, the established academic medical centers and healthcare providers, and then the investors; we know there's huge interest in investing in this exciting technology. What we first try to walk through is: what does the law say, and which laws apply, so you can be compliant? Not only is that important for legality, it's also good for your PR, and it's good when you go out and try to attract new patients, consumers, or investors; you want to show that you're sophisticated, you understand the legal framework, and you're abiding by it. That's not always an easy question or an easy answer, because a lot of times there's more than one stream of data, and HIPAA recognizes that your whole entity might not be governed by HIPAA; you might be doing lots of different things, not all of it in healthcare and not all of it protected under the HIPAA guidelines. So we first do data mapping: where are you getting data, what do you want to do with it, what laws apply, and how can we either silo or aggregate it to serve the business well while doing it legally and ethically? Then, once we know what the law is, we navigate the point that your data is an asset. It likely has a lot of value, not only to you but potentially to third parties. So how do you want to use and leverage that, again, legally and ethically? And Andrew, I know we've talked about this before: attrition rates for new apps fall off really fast; I think we've seen that something like 80 to 90 percent of new users are no longer users after 30 days. But there are some great studies showing that where consumers and patients trust the app and really understand, without having to read pages and pages of tiny text, how their data is going to be used and feel assured that they're in control, usership is much higher and lasts much longer. That doesn't mean they can't consent to greater use, but when they really have trust in the application, they stay. So there's a business case here as well: not only to show your investors and your ultimate consumers that you're compliant, but also to keep those investors, consumers, and patients protected and part of your business long term.
Speaker 2: Yeah, that's a really good point about transparency and how organizations and applications communicate back to consumers what they're doing, and the fact that they take not just rules and laws seriously, but also the risks around our data. Another thing I've been thinking about as we've been talking: I don't know that everybody reads the privacy notices, or even the terms and conditions or the user access agreements; I think we've all gotten pretty numb to that. I'm interested in how you work with clients who say, "we've got this really great application or concept, we do want to be transparent, we do want to provide information about our ethical practices, and at the same time we don't want to overstate what we're doing or not doing, because frankly we might want to partner with third parties down the road and we might want to build a business model around the data." How are you helping them navigate some of these questions around transparency and information?
Speaker 3: It's an excellent question, and of course what's keeping us busy these days is that there isn't just one answer. We approach it in different ways. For clients that are sophisticated or really independent, we give them a questionnaire: tell us where your data is, tell us what you want to do with it, tell us your long-term and short-term goals, so we can build policy around the business model rather than making the business model conform to policy. We try to set some high-level guidelines; a lot of times we walk them back to the most restrictive law that will likely apply and build off of that, and we read out the risks to the business and what it could mean if they have a breach or misuse data. We've seen massive settlements on the class action side where an application, website, or other software has been tracked by third parties, whether an advertiser or a social media company, and health information was being tracked by a third party without the user knowing. Those entities are getting sued at the class action level, they're having federal regulators come in, they're having state attorneys general come in, and it's really disrupting their business, with a huge reputational piece on top of that. I imagine everyone listening is very familiar with ransomware attacks, which are so prevalent; they're not new, but they're becoming more public more often, and we see how data is being weaponized against a lot of tech companies and healthcare entities. But there's more to this weaponization than just ransomware, the "give me your data or else" or "I have your data, pay me or else." We won't go too deep into this, but there's CFIUS regulation and other activity at the State Department and national security level aimed at making sure this data, especially sensitive data, is really protected. There's been a recent executive order on sending sensitive data overseas, because we know it can be used and built into something weaponized against a certain population, whether that's identifying a genetic vulnerability, or identifying a personal vulnerability that can be used to blackmail someone or just create havoc in people's personal lives. We hear a lot of "I have nothing to hide," but that might not be the issue: you may have nothing to hide, but with enough data being pulled and tracked and shared, permissibly or not, that can really undermine a community or a nation. We're also seeing that this is a generational issue.
I think there's a lot of really interesting research about how Gen Alpha, a lot of whom are teenagers right now, are so savvy that they've taken a real view of "we're okay being out there, we're okay sharing, but on our terms." Children now are a lot more likely to ask their parents to get their consent before they're posted online; they're saying, "Mom, don't post me unless I've consented." I had a toddler in my presence the other day say "no photos" unless they consented. So they understand that they want to make the decisions, including on these apps. That just builds into the business case: you're not going to get those new users if you can't show that you understand the law and that you're willing to respect the data you're aggregating and pulling.
Speaker 2: Yeah, that's a really interesting point. I think we're feeling that generational shift pretty heavily these days, particularly around AI, with conversations about whether data is being used to train AI models and how it's being used to train them. There has been a bit more discussion about what we're actually providing to these organizations and companies, whether it's AI, an application, or a combination of the two, and there seems to be a much more robust discussion, even at a general public level, specific to AI but also more broadly. I think a mutual friend of ours posted something on LinkedIn the other day about just how challenging all of this is: organizations spending thoughtful time, energy, and frankly money to try to address these issues. It's hard. It's not easy; it can be stressful and aggravating, particularly for businesses going through a due diligence process. It can be a lot of work. So I really like your comment, and maybe this is very basic, but just reflecting back: sitting down with your client and asking them, what are your goals and objectives, what are your fears? Then, as you were outlining, it's almost like using a checklist to go through and help them understand: there's data coming from this source, which could implicate these risks; you're standing up servers in this particular country, here are those risks; you want to partner with these third parties, here are the risks associated with that; and helping them think that through. Part of that, of course, is the transparency and clarity piece: how do we put this information out in a way that's clear and helpful to Gen Alpha, or really anybody who's nervous or curious about how their data is being used? And at the same time, how do we avoid getting caught in some of these FTC enforcement actions because our statement was overly broad, or because we claimed we have the best security practices of all time, we don't do anything with your data, we're an ethical company with nothing to hide and the highest level of controls, only to find out there was some exaggeration there. I know you're supporting clients on the front end and the back end of these issues, and we as consultants sort of find ourselves in the middle helping them
think this through and really come up with what they feel is the right approach in terms of sharing information. So I don't know that there's a question in there, Sara, but I'm curious whether anything there raises thoughts or questions from you as we're talking this through.
Speaker 3: Yeah, those are excellent points. I always get the knock on the door because people think it's a HIPAA question, and I came from HHS years ago, so HIPAA is kind of my bread and butter. But more and more I find myself saying HIPAA's not the problem, or HIPAA is the easy part here; we need to navigate the state laws and the FTCs and FDAs of the world, and make sure we're really thinking about your business. Sometimes that sounds like an impossible task, and it's really not. We do this all the time: tell us what you really need data for and what you want to do with it. Sometimes we hear, "we need to sell it"; this is a huge asset with a lot of value, and we need a way to share or sell it. There are ways to do that ethically too, whether it's de-identifying or anonymizing the data, or in some cases getting consent or opt-ins. It's really amazing, especially with patients: when you ask them to share their data for things like research, analysis, or improvement, it's amazing how many patients want to be part of a solution or to improve healthcare. So sometimes the answer isn't no, it's that we're just going to ask first and then use the data; or it's not a no, we just need to strip certain identifiers out; or it's not a no, but we need to use it for this one purpose rather than 15 purposes, 14 of which aren't really relevant to you anyway. There are lots of ways to build in policies so your workforce knows what it is required to do or not do, and so your consumers and the public know what you can and can't do and how they can opt in and out. Consumers are becoming more savvy, and I think that's important for the industry to know. We're also seeing the industry get more savvy, with things like ethical data mining: "our terms of use said we couldn't do that, so now we're going to go back to our consumers and ask, are you interested in donating your data, or we'll pay you a fee or give you something in return to use your data." Of course, other healthcare regulations might apply to that, but in the consumer space we're seeing companies get more sophisticated about buying the data from the consumer rather than using it without permission and potentially getting in big trouble. And when I say big trouble, this can be a business-ending issue, not to be an alarmist. We have seen courts say: you created a remarkable algorithm, or you built an AI product, and it's probably very valuable, except you never had the rights to the data you built it from; and therefore it's not just a slap on the wrist or a fine, it's that you need to destroy that entire product and never use it again.
Um, so it, it's really not a , we'll pay the fine, you know, we'll ask for forgiveness later. We're seeing where courts are getting more aggressive saying, you have no rights to that. You know, you ultimately went into someone's house and stole their property and tried to resell it and, you know, kind of from, you know, like a digital perspective , um, analysis and , um, you know, you, you just can't use that at all. So there can be some real stakes here. Um, but it's also something that, you know, folks shouldn't be afraid of. We see a lot of clients say , tell us what we need to do and we'll do it. Um, and, you know, that's actually more efficient and cost saving than for them to kind of noodle about it for years or months , um, to be proactive to, you know, understand that their assets, their data is an asset. And to not just give that away either. Andrew , as you know, and I know you do a lot of work in web tracking in some of the, the cookies and some of the other ad tracking that we've seen make huge headlines in the health healthcare sphere and where, you know, there's a , there's huge pushback on that, not only litigation , um, but, you know, having that data proliferated out there, you know, can really impact not only an entity's reputation, but really tragically, you know, hurt their patients because their patients might now have lost trust and not come back with full information. We see this a lot where providers are saying, there's so much that we're missing from a patient. If we knew the whole story, if we knew they were being really honest and trusted us, their outcome might be better.
Speaker 2: Yeah, that's a really interesting perspective, and it reminds me of more traditional research studies and clinical trials, where you have similar cases going back before wellness applications: an IRB or another group comes in and says, all of this research you've done is great research, but you've got to get rid of it because you didn't get the proper consent, or there was a step you missed along the way, and you can't use it. I think there are probably some similar lessons in there somewhere, if I were smart or creative enough to find them. Another thing I'll throw out, and I don't think you and I have had a chance to talk about this, goes to the transparency component as we wind things down a bit. I was doing some work around tracking and cookies, as you mentioned, for one of our clients, and they were asking whether they really need the more advanced or complex opt-in and opt-out popups on their website, because there are so many differences across websites and applications: are you opting in, are you opting out, and some sites give you choices you can pick from. In doing some research, I stumbled on an interesting note from a recent California Privacy Protection Agency meeting where they were discussing these more advanced cookie notices. What stuck out to me was a discussion that maybe some of the more advanced cookie notices aren't very helpful, because you just want to visit a website and you end up spending five minutes clicking boxes; you're getting too involved in the data collection process as just a general browser of a particular page or application. That stuck out to me because I thought, in these conversations we're having around greater transparency, greater information, and greater clarity, there are examples where the information we're providing may be too much and may not be helpful because of the way it's presented. I don't know if you've seen similar thoughts or conversations, but that certainly struck me as a different sort of thought or approach. And then, in closing, I welcome your thoughts on that, but also any general parting thoughts for people: attorneys listening, or folks in the compliance profession trying to help calm the anxiety of their clients and help them think through some of these challenging issues?
Speaker 3: Yeah, absolutely. Maybe I'll end on some of the numbers I find fascinating, to drive home the point. There's a statistic that one out of every two babies born in the U.S. is on Medicaid, and another that over 80 percent, maybe even over 85 percent, of Medicaid users use a mobile device. And I think 50 percent of those users, or 50 percent of all healthcare users, and I'll have to double-check myself on that stat, but a huge number of users with a mobile device use some sort of wellness app. Yet, and Andrew, I think we did talk about this, only something like 14 or 15 percent of those polled trust their app or trust the tech site they're on. Looking at those numbers, there's a massive number of individuals who want access and want that use, but don't trust the offering. At the end of the day, I think that's what the federal government, the state governments, and all this litigation are really meant to achieve: how can we make sure these offerings are protected so individuals can trust them? Then we have a real healthcare solution, because you're going to get good data, you're going to have people engaged, and hopefully we're managing some of these massive chronic diseases and keeping people well longer. Sometimes we get caught up in the law and all the regulation and miss the point of why we're doing all this, but when you take that step back, if we can really protect data and build that trust, this healthcare solution is going to be even more valuable.
Speaker 2: Gosh, yeah, I don't think there's a better way to end the conversation. That's really important to emphasize as all of us, listeners and others, talk with our clients and help them understand the value of the thoughtfulness that goes into managing and protecting data, so that we're getting information that's valuable and ultimately protecting our users and consumers too. Well, thank you so much for joining, Sara; it's always a pleasure talking with you and working together, and I hope to be back with AHLA sometime in the future to talk about other interesting things. Thanks so much, and thanks everybody for listening.
Speaker 1: Thanks, Andrew. Thank you for listening. If you enjoyed this episode, be sure to subscribe to AHLA's Speaking of Health Law wherever you get your podcasts. To learn more about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org.