Spring Parker Accelerating Health Care Performance

Esmond Kane, Chief Information Security Officer, Steward Health Care

Featured Guest: Esmond Kane

What he does:  Esmond is the Chief Information Security Officer of Steward Health Care and has over two decades of experience leading IT and security programs and safeguarding vital sectors in multiple industries.  At Steward, Esmond's focus has been on transforming Steward's approach to information security practices and threat and risk management to comply with industry frameworks, regulations, and best practices. Prior to Steward, Esmond was Deputy Chief Information Security Officer at Partners HealthCare, working with executives and advisors on cybersecurity and business practice. Esmond has held multiple IT and security roles, including at institutions like Harvard University and Mass General Brigham, and he also serves on the advisory boards of multiple companies, providing valuable insights on cyber matters and ensuring secure IT operations, regulatory compliance, and resilient design.

On risk:  "Risk is pervasive and it's across the continuum of the delivery of health care … Health care is all about securing the patient. It's all about making sure that we lower any risk of the patient's privacy being impacted, and about the safety of the systems that we're putting in front of those patients. But also in this modern era we have to be very aware of the potential attacks on corporations, on health care … Something relatively benign, like a medical device that you're plugging into a patient, you can magnify the risk associated with it, because some of those medical devices require updating, they require securing, they're also storing sensitive information … We need to know where our data is, we need to know how our assets are managed … Ultimately, what you're trying to do is to measure that risk and make sure that it maps to your organization's risk appetite. And you're trying to mediate, you're trying to take what could be a high risk, and what you're left with from a residual perspective, after you put some effective controls in play, becomes much less of a risk."
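The risk calculus in that quote, taking an inherent risk, applying controls, and comparing the residual against a risk appetite, can be sketched in a few lines. This is purely an illustration: the scoring scale, the example device, and every threshold below are invented for the sketch, not taken from Steward or any particular framework.

```python
# Illustrative sketch only: a toy residual-risk score for a hypothetical
# asset (e.g. a connected medical device). Inherent risk is reduced by
# control effectiveness and compared to a stated risk appetite.
# All names, scores, and thresholds here are invented for illustration.

def residual_risk(likelihood: float, impact: float, control_effectiveness: float) -> float:
    """Inherent risk = likelihood * impact (each scored 0-1);
    residual risk = inherent risk scaled down by control effectiveness."""
    inherent = likelihood * impact
    return inherent * (1.0 - control_effectiveness)

def within_appetite(risk: float, appetite: float) -> bool:
    """Does the residual risk fall at or below the organization's risk appetite?"""
    return risk <= appetite

# A hypothetical unpatched infusion pump: likely to be probed, high patient impact.
before = residual_risk(likelihood=0.8, impact=0.9, control_effectiveness=0.0)
# After patching, network segmentation, and monitoring are put in play:
after = residual_risk(likelihood=0.8, impact=0.9, control_effectiveness=0.85)

print(f"inherent/untreated: {before:.3f}")    # 0.720
print(f"residual:           {after:.3f}")     # 0.108
print(within_appetite(after, appetite=0.15))  # True
```

The point of the sketch is the shape of the exercise, not the numbers: the same high-likelihood, high-impact device becomes acceptable only after effective controls pull the residual score under the appetite line.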


Scott Nelson  0:01  
Welcome to The Risky Health Care Business Podcast, where we help you prepare for the future by sharing stories, insights, and skills from expert voices in and around the United States health care world. Our mission is to inform, educate, and help health care organizations and individuals with risk, from one-doctor practices to large integrated systems and organizations throughout the dental, medical, and veterinary health care industries, while hopefully having some fun along the way. I'm your host, Scott Nelson, a guy who grew up in Ohio and has been working all over the United States during my 20-plus-year and counting career in the health care industry, with a commitment to accelerating health care performance through creativity, not just productivity. Let's dive in.

Imagine this: you're in a rainstorm, holding an umbrella, trying to stay dry in constant, nonstop rain. The wind shifts and rain comes from different directions. The umbrella does a decent job as a defense against the rain, but it doesn't stop every raindrop, so you get wet. You were prepared because you watched the weather report and had the umbrella, but it wasn't 100% protection. Maybe you go buy a raincoat for next time, and rain boots, and a rain hat, so you have better armor against the rain, because this time the umbrella didn't keep you totally dry.

Take that situation and change it. Instead of you in the rain holding an umbrella, think about a health care organization, whether a large, integrated health system or a one-doctor practice, dealing with nonstop rain, all day, every day, and trying to stay dry. But instead of rain, it's cyberattacks. Instead of shifting winds, it's connected IT and computer networks, staff opening attachments, and trusted, known third-party vendors. Instead of an umbrella, it's platform technology solutions, education, and training on things like phishing scams, strong passwords, and multi-factor authentication. And the attacks are nonstop, all day, every day.

There's no break from the rain these days for any health care organization, no matter its size.
This is no drill; it's the current environment of health care cybersecurity. The HHS Office for Civil Rights has sounded the cybersecurity alarm, revealing a chink in our digital armor with more than 540 data breach notices in 2023, affecting potentially tens of millions of people. As Fierce Healthcare noted in December 2023, with cybercriminals casting a wider net, no health care entity, big or small, is untouchable. Large health care systems may have fortified their defenses, but they still encounter daily attacks. Smaller providers are the newest prey on the block, vulnerable and exposed. And the danger doesn't end there; the tentacles of information security threats have entangled third-party associates and supply chains alike, amplifying the risk.

Amid the financial crunch and the unsettling possibility of state-backed cyber adversaries, health care systems are in an arms race against time and pressure, in an age where it's not about "if" but "when" a cyberattack happens. Forbes noted in December 2023 that with a staggering average cost of $1.3 million per cyberattack, it's high time for the health care sector to elevate cybersecurity from the sideline to a cornerstone operational strategy.

In this episode I'm speaking with Esmond Kane about information security risk in health care. Esmond is the Chief Information Security Officer of Steward Health Care and has over two decades of experience leading IT and security programs and safeguarding vital sectors in multiple industries.  At Steward, Esmond's focus has been on transforming Steward's approach to information security practices and threat and risk management to comply with industry frameworks, regulations, and best practices. Prior to Steward, Esmond was Deputy Chief Information Security Officer at Partners HealthCare, working with executives and advisors on cybersecurity and business practice. Esmond has held multiple IT and security roles, including at institutions like Harvard University and Mass General Brigham, and he also serves on the advisory boards of multiple companies, providing valuable insights on cyber matters and ensuring secure IT operations, regulatory compliance, and resilient design. Let's talk with Esmond about information security risk in health care.

Esmond, welcome to the show. 

Esmond Kane  4:02  
Hello, Scott. How are you? 

Scott Nelson  4:04  
Doing well, thanks. I appreciate you being here. 

Esmond Kane  4:07  
Not at all. Thank you. 

Scott Nelson  4:09  
Before we begin our conversation about information security and risk in health care, I'd like to take a moment to talk about your background and work. What is your work, and how did you get to where you are today as a chief information security officer?

Esmond Kane  4:20  
Thank you. I've taken a circuitous route towards becoming a CISO. I worked my way up in IT, all the way from help desk and desktop support, and then about 15 years ago took a pivot into security when it was starting to become a more formal discipline. I've been very lucky to work in a lot of the premier corporations here in Massachusetts. I've worked for Harvard, I've worked for Partners HealthCare, now Mass General Brigham, and currently I'm the CISO for Steward Health Care. But I will just say that today any opinions I express are my own and don't necessarily reflect on my employer.

Scott Nelson  4:56  
A good place to start would be what is information security? What does that mean and include?

Esmond Kane  5:04  
It's a good question. I often start conversations with my peer leaders or the board or other executives, and I ask them how they think about security, how they think about risk, especially in health care. Health care is all about securing the patient. It's all about making sure that we lower any risk of the patient's privacy being impacted, and about the safety of the systems that we're putting in front of those patients. But also, in this modern era, we have to be very aware of the potential attacks on corporations, on health care. Ransomware is a pandemic that is absolutely wreaking havoc across the health care sphere. But those kinds of information security issues can be sourced from an error, it could be an accident, somebody misaddresses an email, and it definitely could be a cyberattack. But we also deal with continuity issues associated with disasters, electrical grid issues, a lot of climate issues. Nobody wants a hurricane to impact patient safety, but it's definitely a risk that we consider. But I also need to be aware, Scott, when I talk to my colleagues in the clinical space, they're very comfortable talking about risk, because patient care involves risk. How you address a patient, or surgery, or a particular regimen of pharmaceuticals, that can carry some risk of harm. But when we think of health care, we often think of that truism that the goal is to first do no harm, and you need to keep in mind the potential for technology to be a double-edged sword.

Scott Nelson  6:45  
Along those lines with that double-edged sword in the clinical care and the technical piece, why is it a risk then or what makes it a risk? There's always been this comment or this thinking that technology is supposed to make everything easier, but sometimes it seems like it just makes things more complicated and dangerous. Is this just that there are bad actors in the world trying to do bad things to vulnerable people? Or is it something else?

Esmond Kane  7:09  
I think the bad actors certainly like their headlines, and they're certainly able to monetize some of the attacks. But no, to answer your question, there's also just inherent risk in the delivery of patient care. The systems that we work with are tailored to health care, and they perhaps require a lot of care and feeding. I often think of the truism that when I approach a clinical colleague and tell them that I want to patch their MRI on a monthly basis, sometimes the vendor might withdraw that patch. And, you know, they're very much interested in taking care of patients, in maximizing the number of patients that will go through those systems. And if I turn up and say, well, I've got a patch, I've got to make sure that it's safe from cybersecurity threats, again, this is part of me introducing potential [inaudible] because the device may not be available. But also, when we think about this smooth functioning, this issue with delivering care, based upon this slow digitalization of health care, we have to be aware that the bad guys are sometimes at the forefront of adopting some of these technologies, for instance, AI right now. So as we're trying to secure how our clinicians, our physicians, are using these IT devices, we also need to be aware that the bad guys can turn that around and use it to impact patient care; they can use it to shut down your operations. They can steal data, they can disrupt the systems that we're using to deliver that care, or they may just demand and extort funds. And the worst-case scenario is, you know, in health care, you end up having to divert patients, you end up having to make those hard calls. It's really part of this dual pandemic: when we needed health care to be at its best for COVID, and after all of that, it's never been under such dire attack by these malicious individuals.
But to answer your question directly, there's also just a lot of care and feeding associated with the operational maintenance, the billing, the financial aspect of delivering health care.

Scott Nelson  9:31  
Let's talk about why it's important, you get into the why there. Why is it important to plan and prepare for risk in information security? Earlier this year, USA Today had some cybersecurity statistics and cybercrime findings showing that in 2023, the average cost of a health care data breach was $10.93 million, higher than any other industry and over double the average across all industries, which was around $4.4 to $4.5 million. The health care industry has also seen a 239% increase in large data breaches reported to the HHS Office for Civil Rights over the last four years. And according to the US Department of Health and Human Services, there has been a 60% rise in the number of people affected by these data breaches in 2023; almost four in five were caused by cyberattacks, and the most commonly reported cause was malicious attacks. So, to you, why is it important to plan and prepare for risk?

Esmond Kane  10:24  
Well, I think good patient safety is also good cybersecurity. I think we're all trying to avoid those headline-grabbing incidents where a health care organization gets attacked by ransomware. But we need to understand that ultimately we're here to deliver the best patient care that we can, and that can have some adverse outcomes. There are a lot of comorbidities associated with the delivery of health care, because the human body is also very complicated. But then you add in technology, you add in all the administrative overhead, and then you add in the fact that bad guys are also attacking it. Something relatively benign, like a medical device that you're plugging into a patient, you can magnify the risk associated with it, because some of those medical devices require updating, they require securing, they're also storing sensitive information. So we've seen this uptick in malware, we've seen this identification of health care as a profit center for criminals, because they can, and because there are also a lot of nation-states involved in these attacks that we're seeing. The reason that HHS and CISA (the Cybersecurity & Infrastructure Security Agency) have identified this need for redress in health care is because a lot of health care needs to step up here. There are common performance goals that perhaps aren't where they need to be to combat a nation-state. It's almost absurd to think that a hospital system, which is already a very complicated environment to work within, now has to deal with nation-state-level sophistication, or a semi-state, nation-state-sponsored attack. And ultimately, here, we're trying to make sure that the patient is getting the best care, their implantable, their device, whether they're a resident or they're going through some element of a surgical procedure. We want to make sure that we minimize that window when a patient can be harmed, and when these bad guys are attacking us that window can be magnified.
But also, some of those attacks you described earlier, they're not directly against hospital systems. Now the bad guys are also going against our supply chain. They're going against our insurers, so that you can't file a claim, you can't process a payment for treating a patient. It's this kind of chaos and disruption that's benefiting certain nation-states, and it's also causing immense harm to the American health care sector. It's almost tantamount, Scott, to a perfect storm.

Scott Nelson  13:07  
And that segues perfectly into my next question. I was thinking in terms of where the risk and vulnerabilities are. There are many areas where businesses can be vulnerable to risk: third-party risk management, you had mentioned patching an MRI system and vendors, cybersecurity education, a responsive cyber threat landscape, identity and asset management. Where is information security risk? What are the areas that are, or should be, of concern?

Esmond Kane  13:34  
Certainly, obviously there's a technical aspect to this, but there's also the patient harm aspect, right; there's the operations, the potential to impact the finances, and there's also the reputational one. The entire spectrum of the delivery of health care is under siege by these bad guys. To your point, they're also going after the people aspect, they're going after the process aspect, they're going after the technology. Anywhere the bad guys can identify weakness, they're targeting. So a lot of what I do is certainly educate, to your point, but that education isn't just for the people involved in the delivery of care. It's also the executives. It's also the third parties. It's also the vendor relationships that we've established. I'm a big fan of establishing a common ground, a risk management framework, which [inaudible] some kind of industry standard. I'm a big fan of using that standard to educate and to target relatively simple things like vulnerability management, being proactive, educating the consumers, and also just educating the practitioners, the people who are involved in maintaining these systems, on, you know, what does it mean to fend off a business email compromise? What does it mean to have a system that is resilient? And how do you continue to provide care when the risk of a business impact from a cyberattack is increasing year over year? A lot of it is just the same controls we've been hearing from our colleagues at CISA for decades. I'm certainly willing to go into those in depth, Scott, but to answer your question directly, the risk is pervasive, and it's across the continuum of the delivery of health care.

Scott Nelson  15:40  
The news alerts us to big organizations that have experienced a cyberattack or a cybersecurity incident, but I can also pull a data breach report from the HHS Office for Civil Rights and see all types of organizations that have experienced an event. I would imagine small organizations are more vulnerable to cyberattacks and to other types of technology issues. Is one entity type more susceptible than others, if I'm thinking about a one-to-two-doctor practice versus a large, integrated health system somewhere in the US?

Esmond Kane  16:10  
That's an interesting question. Realistically speaking, if you're connected to the internet, you're going to be under attack. The bad guys don't care. I don't believe any one entity is any more susceptible than any other. Obviously, there are targets that are better monetized, perhaps more susceptible to what they call big game hunting, because the bad guys know they can get tens of millions, if not more, out of a cyber insurance claim if they are successful in their ransomware, in their extortion, but they're also creating opportunities in the smaller players. When you look at some of the results of reviews by HIMSS and HHS and others, there are smaller health care entities, small physician practices, that may not even have a dedicated cybersecurity practitioner. So it can certainly be harder in those environments to protect those. But the bad guys are literally roaming across the entire health care sphere. From insurers and practitioners down to psychiatric practices and therapeutic practices, they're hitting gyms, they're hitting everywhere where you're involved in looking after the human that's involved in these issues. So it kind of behooves HHS to define these cyber performance goals, to talk about the simple steps that everyone can take, you know. Everyone can pick a good password and not reuse that password. Everyone can enable step-up authentication. Most people know how to maintain a system, and some people certainly should be looking to secure and harden those systems. Some of this you can rent from MSPs and various other practitioners; some of it, maybe you have dedicated in-house staff to get it done. But a lot of what these guys are targeting is common weaknesses that are enumerated across all the parties involved, the entire health care vertical. They're hitting everybody, and obviously they're making more money from the bigger guys, but they're still making money out of the small guys.

Scott Nelson  18:31  
I'd like to go back and revisit your approach, when you talk about education and a risk management framework and vulnerability management, and thinking about your approach in a reactive versus proactive thought process, and what to do and how to plan and prepare whether you are that one-to-two-doctor practice or a large integrated system or organization, the size and type that you work for and have worked for. I have a personal experience I'll share as an example that highlights that this is not just tech solutions installed on a business's platforms and systems. Back when I was just out of college, about a year or two out, there was a nationally known virus called the "I Love You" virus. I had an office job at the time, actually more of a cube job. Email was probably only a few years old then; it came out when I was in college. I walked in that day and got an email from a friend of mine that I had grown up with since I was a kid. The subject of that email said "I Love You" and it had a Word document attached. This was well before anybody knew about phishing scams and those kinds of things. Well, I clicked on it and opened this thing up, and then all of a sudden across the floor you started hearing people say their computers were having issues. Probably about an hour or so later somebody came up to my desk and asked me what had happened, because the entire building, and this was a sizeable multi-story building in Columbus, Ohio, shut down. The system wasn't working anymore. So thinking about that, from what I've read, phishing scams aren't as prevalent today as they were back then and there's more education, but how do you approach risk from the education side and that risk management framework?
And what can individuals and groups do to simplify information security and make it manageable?

Esmond Kane  20:26  
It's a great question, Scott. Because I think one of the things the bad guys are trying to do is sow misinformation and disrupt and almost instill hopelessness. And the thing is, there are absolute things that we can do as individuals, and things that our organizations are expected to do by HIPAA and, increasingly, by things like CIRCIA and various other regulations coming out of Washington. So as an individual, both as a patient but also as a practitioner, right, I touched upon picking a good password. Make the bad guys' lives harder by having a long and complex password; if you can, adopt something beyond the password, like a passphrase, or look at things like passwordless authentication. If you haven't already, I highly encourage you to enable multifactor everywhere. Make sure it's on your banking accounts, make sure it's on your email. MFA, step-up authentication, whatever you want to call it, absolutely makes the bad guys' lives much more difficult. It increases friction, and the economics of attacking you become much more expensive for the bad guy. So the hope would be they would move on to an easier target. But by that same measure, some of them are going after the very human weaknesses, the empathy that's involved in health care. I often have this discussion with my colleagues around zero trust; I think it's a buzzword. But the thing we have to understand is patients trust doctors. They've spent decades learning about instilling that trust. And there's so much around that reputation and that quality of care which the bad guys are trying to subvert. But at a bare minimum, there are certain behaviors that we do socially in a face-to-face capacity that perhaps are harder to do in an online world. And phishing is one of those things that is far too easy.
And that's one where you just need the right amount of what I call applied distrust. You shouldn't be too trusting, whether it's coming in over an email or a text or a phone call, and soon it'll be deepfakes. With these attachments, just be wary; we can all do better there. And the bad guys are certainly using technology like AI to get rid of typos or grammatical issues or things of that nature. But just distrust: pick up the phone and call someone using information you find on the internet and find out, hey, did you really send this email? Is this really Scott's good friend? Let me just check. Social media is riddled with these kinds of things, where entire social media accounts are being hijacked. And then the other thing that I think is very important that a person can do is, if they see something, say something. Report it out to your colleagues in information security; if you have a phishing reporting button, use it. Call the help desk, call your bank if you're seeing something. Report it into these various social media platforms. And there are other things that as an individual you can do: update your systems, and enable what's called encryption, scrambling the data on your devices. It's becoming easier; a lot of the manufacturers are having secure by design baked in. So, you know, choosing a good password, minimizing the kind of exposure a bad guy can use against you, that's something we can all do.
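The password and passphrase guidance Esmond gives here can be expressed as a toy check. This is only a sketch: the length and entropy thresholds below are invented for illustration, and real enforcement belongs in an identity provider or password manager, not a script like this.

```python
# Illustrative sketch only: a toy check of "long, complex, never reused"
# password guidance. Thresholds are invented; real policy belongs in an
# identity provider, not application code.
import math
import string

def estimated_entropy_bits(password: str) -> float:
    """Rough upper-bound entropy: length * log2(size of character pool used)."""
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation or c == " " for c in password):
        pool += 33
    return len(password) * math.log2(pool) if pool else 0.0

def meets_guidance(password: str, known_breached: set[str]) -> bool:
    """Long passphrase, decent entropy estimate, and not a known-breached value."""
    return (
        len(password) >= 14
        and estimated_entropy_bits(password) >= 60
        and password.lower() not in known_breached
    )

breached = {"password123", "iloveyou"}
print(meets_guidance("iloveyou", breached))                      # False
print(meets_guidance("correct horse battery staple", breached))  # True
```

A long all-lowercase passphrase passes here on length alone, which matches the "passphrase over password" point: length does more work than sprinkling in symbols.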

Scott Nelson  20:38  
And that sounds like something very simple that an individual can do, whether in a solo practice, a smaller group, or even a large organization. It's a very simple thing that an individual or team can do to help the overall, you know, enterprise risk management situation.

Esmond Kane  24:12  
100% agree. You know, this is something that National Cybersecurity Awareness Month has been reinforcing for decades, right; these are things that we all can do. It doesn't just protect us at work, it also protects us at home. So whether we're dealing with an elder care issue or things of that nature, the bad guys are going to pretend to be you. They're going to pretend to be your kids. It's something that we can all be better at. But it also maps directly to some of the things that your organization, if you're a practice, if you have colleagues in IT that you work with, can do for you, and they can make your education a lot easier. They can do phishing simulations, and they can make sure that MFA is enabled on these secure portals, these repositories that contain sensitive information. They can also adopt a framework, to use your point from earlier; they can measure themselves against industry standards and create these policies. And hopefully those policies are human-readable. They create this kind of culture of security, this culture of collective defense. But also, I'll be honest with you, one of the most important things that we can do, Scott, is just share and collaborate and talk. Things land better when they're presented in a more friendly manner, when we're not just measuring when people fail, but measuring people's successes. It's so important as a corporation to create and champion those opportunities, not just focus on the negatives.

Scott Nelson  25:45  
When you talk about sharing information and measurements, what are the biggest challenges or obstacles we're facing in health care today related to information security and risk? A lot of times I hear that it can be cost. When you look at platforms, and equipment, and a lot of these software upgrades and installations put on those platforms, it's always a cost issue. A phrase that I use is "casino culture," where people are betting that nothing will happen to them, and so they ultimately don't do anything, hoping that something doesn't happen to them. But what could or should be done to address and overcome those challenges and obstacles? You mentioned sharing. You also mentioned measurements. If a group is not really familiar with this, and they don't have a person specifically in charge of it, what are some measurements they should be looking at? Or what are, you know, maybe one or two or three things they should be building around a framework?

Esmond Kane  26:40  
Yeah, it's interesting. The first thing I'd almost say is, I touched upon it earlier, which is culture. Measure the susceptibility of your workforce to these threats. One of the things that we're required to do under HIPAA is to assess risk, and that includes all dimensions of the delivery of patient care. We need to know where our data is, we need to know how our assets are managed. But to your point, it can create this spiraling complexity, this spiraling expense. Firewalls aren't cheap, right; the staff to maintain them, the know-how, it's all potentially expensive. But these are table stakes; unfortunately, we have to do these things when you're combating well-monetized, well-tooled, sophisticated threat actors. There are things that we can do. Access control, minimizing the data, minimizing the exposure of that data to only those with a need to know, those are things that we can all do. There are solutions we can put in place to monitor, so that when we are deploying these solutions they're cost-effective but also minimizing the impact, the inconvenience, not just on the bad guys, where you want to see the impact, but on the good guys. I often think of the situation where, let's say, a physician is trying to triage a patient on a gurney. If I'm making their lives harder, if they have to go through a step-up or something of that nature without due care, if I haven't created a kind of trust zone or a secure site where I can relax some of those controls, then perhaps I've made the problem worse than the complaint, right. So there's definitely monitoring. You need to be aware of what could go wrong, you need to be aware of what those threats look like. You need to be continually monitoring these information sources, the news, on what's going on out there. CISA is great at establishing key exposures and indicators that relate to what you should be doing and prioritizing.
You can certainly invest in tools like a security operations center and security incident and event management tools. Some of this is investing in protective controls, to minimize the harm and minimize your exposure to risk. But you also need to invest in reactive controls: the ability to respond quickly, a fast reaction force, a cyber SWAT team that can get ahead of it. But ultimately a lot of it comes down to humans. You need to be training those humans, exercising those muscles. You need to be having those dialogues with executives on how you're going to invest and how you're going to measure that return on investment, to your point. It can certainly involve significant investment, and the discussion with executives tends to be about how you can invest upfront. You don't really want to have to invest after the fact, when you've impacted patient care or there's data that's left your environment. That's the disaster scenario you're trying to avoid.

Scott Nelson  30:07  
Along the lines of the reactive controls, and thinking that it's not a matter of if something will happen but when: what if something does happen? What should a response look like?

Esmond Kane  30:17  
These are tough questions, Scott, goodness gracious. There are certainly a lot of frameworks that speak to incident response right now, so there's a lot of tried and tested knowledge. If there's one thing health care and every other industry has learned the hard way, it's what good incident response looks like. You obviously need to contain, you need to eradicate the threat, and you need to make sure that your containment measures are being monitored, because the bad guys are very accustomed to making a lot of noise. Is that noise a distraction? Is there something low and slow going on behind the scenes while you're busy looking at all the noise? So you continually reevaluate your controls to make sure they're effective, and that the scope and nature of what you think happened is what actually happened. Beyond that point, you need to be communicating. You may need to start removing systems from service, you need to be engaging your disaster recovery plan, and in health care you may need to flip to what we call paper. So you may be working off of non-IT systems in a temporary location, for a temporary reason, or indeed, if it's bad enough, maybe you do have to relocate, go stage a hot site, and bring up some of your systems. You're also communicating to patients, and you might need to communicate with your press or marketing office as you learn more about it. And if it is a real issue, you need to get ahead of it. There are regulations, like CIRCIA, that are going to start requiring reporting within 72 hours. If you're a public entity there are now things associated with some of the SEC 10-K requirements as well. Ultimately, at this point, you're making sure that you've understood the issue, that you've communicated it, and that you've put, hopefully, the measures in place to limit the impact of that exploitation, that unusual activity. And at that point, you're doing your best to try and comply.
And once you get to the compliance sphere, there are things that HIPAA will require you to do within 60 days: you might need to notify your patient community, and you might need to work with a third party to make sure they have the ability to monitor whether any data did leave. In general, there's a well-exercised playbook. There's also NIST guidance, I think it's SP 800-61, that goes into what those steps look like. And you can distill it down to something like observe, orient, decide, and act, or you can go into the full seven-stage spectrum that I kind of outlined as well. But a lot of it comes down to communication: making sure you're communicating internally, and when necessary, you're communicating externally. There's also a huge element of engaging the ISACs or some of the threat intel centers, so that you might be communicating to colleagues at other peer institutions and providing them with indicators. That way you're lessening the ability of a threat actor to take what worked against one particular health care organization and turn around and use it against another one pretty quickly. Sometimes they just use one compromised entity as a bridgehead and start attacking another entity from yours. You may be just a staging site; you may not be the end goal for these actors. Does that kind of help?
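The notification timelines mentioned here, CIRCIA's 72-hour window and HIPAA's 60-day breach notification, can be sketched as a simple deadline tracker. This is an illustrative sketch for discussion, not legal or compliance guidance, and the dictionary keys and function names are invented for this example.

```python
from datetime import datetime, timedelta

# Illustrative notification windows from the episode; a sketch, not compliance advice.
NOTIFICATION_WINDOWS = {
    "CIRCIA (covered cyber incident)": timedelta(hours=72),
    "HIPAA (breach notification)": timedelta(days=60),
}

def notification_deadlines(incident_discovered: datetime) -> dict:
    """Map each framework to its notification deadline for one incident."""
    return {
        framework: incident_discovered + window
        for framework, window in NOTIFICATION_WINDOWS.items()
    }

# Example: an incident discovered the morning of March 1.
discovered = datetime(2024, 3, 1, 9, 0)
for framework, deadline in notification_deadlines(discovered).items():
    print(f"{framework}: notify by {deadline.isoformat()}")
```

Real obligations turn on legal definitions of discovery and materiality, so a tracker like this would only ever be a prompt for the compliance team, not a substitute for it.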

Scott Nelson  33:53  
Oh yeah, and it goes to my thinking of just how complex and complicated this is, kind of the spider web piece to it. Whether you are that large organization or that one-to-two-doctor practice, making the effort, even if it's informal, like the password pieces you talked about, doing something simple that's on the lower end of the cost spectrum, still helps. Because if something does happen, with everything that you just talked through, doing whatever you can up front makes everything so much better.

Esmond Kane  34:35  
I would agree. I mean, ultimately, what you're trying to do is measure that risk and make sure it maps to your organization's risk appetite. And you're trying to remediate: you're trying to take what could be a high risk and, after you put some effective controls in play, what you're left with from a residual perspective becomes much less of a risk. And there's absolutely guidance out there: HHS has guidance, and NIST is updating SP 800-66 right now for the first time in over a decade, which is guidance on applying NIST's guidelines to HIPAA. There are CISA's cybersecurity performance goals, there's HHS guidance, the American Medical Association has some guidance and newsletters, and there's the Health Information Sharing and Analysis Center. There's a wealth of information out there, and you're encouraged to take action now. As I mentioned earlier, increase the friction for the bad guy, but hopefully not at the expense of patient care and the ability to actually provide quality care.
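The residual-risk framing here, inherent risk reduced by effective controls and then compared against the organization's risk appetite, can be sketched in a few lines. The 0-10 scale, the numbers, and the function names are illustrative assumptions, not a real risk methodology.

```python
def residual_risk(inherent_risk: float, control_effectiveness: float) -> float:
    """Residual risk after controls, on an assumed 0-10 scale.

    control_effectiveness is the fraction of inherent risk the controls
    mitigate (0.0 = no effect, 1.0 = fully mitigated). A toy model for
    discussion, not a production risk framework.
    """
    if not 0.0 <= control_effectiveness <= 1.0:
        raise ValueError("control_effectiveness must be between 0 and 1")
    return inherent_risk * (1.0 - control_effectiveness)

def within_appetite(residual: float, risk_appetite: float) -> bool:
    """Does the residual risk map to the organization's risk appetite?"""
    return residual <= risk_appetite

# A high inherent risk (8/10) with strong controls (75% effective)
# leaves a much smaller residual risk (2/10), inside a 3/10 appetite.
risk = residual_risk(inherent_risk=8.0, control_effectiveness=0.75)
print(risk, within_appetite(risk, risk_appetite=3.0))
```

Real programs score likelihood and impact separately and qualitatively; the point of the sketch is only the shape of the comparison: controls shrink the risk, and the remainder is judged against appetite.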

Scott Nelson  35:50  
Looking backward in time, what was expected versus unexpected in information security, and what could have been done differently to decrease risk and vulnerabilities? And then looking forward, what do you see as trends and potential issues that individuals and organizations should anticipate and prepare for?

Esmond Kane  36:07  
So I'd love to have a crystal ball, or your Magic 8 Ball even, to look at these risks, and there are certainly lessons learned that we can forecast will become worse over time. I don't think that ransomware is going to let up; it's going to continue. There are a couple of things ransomware touches upon that I haven't spoken to today. I'd mentioned at a high level some of the issues associated with patching and maintenance and asset management, but there's certainly a lot of outdated IT infrastructure and legacy systems in need of redress. I'd love to see some assistance at the federal level, so that we don't just blame the victim but help the victims of some of these ransomware actors. There's a lot of fragmentation and complexity involved. I'm a big fan right now of Gene Kim's latest book, Wiring the Winning Organization, and he talks about defining that framework almost in the frame of slowification. Once you slowify something, the idea is to simplify and then amplify, so that once you break things into digestible, simple steps, you can automate them, and we can certainly move faster. The other thing, and it's kind of related to some of these medical devices and IoT, this Internet of Everything, and certainly care in the home, is the human element. It's the human firewall: phishing emails and phishing attacks are continuing to work because some of the audience need to pay a little bit more attention. And in particular with smaller health care providers, we're also dealing with the advent of something like double extortion, where the bad guys are threatening patients, not just the organizations that were compromised. There's a huge element there of developing the workforce and modernizing the infrastructure. That includes elements like standardization, adopting things like multi-factor authentication and passwordless authentication.
There's also working toward prioritization: adopting one of these frameworks and realizing that Rome wasn't built in a day, that you can introduce these controls over time. And AI is wonderful, but we don't want the cure to be worse than the complaint. We don't want it to start over-sharing information. We don't want it to start behaving like some of our less well-educated workforce. We also need to figure out, I think, across all of the US and indeed internationally, how to neuter the financial benefit to these threat actors. And I certainly tell my leadership, and you'll hear this from a lot of health care leaders and CISOs, that we don't want to see ransoms being paid. So how can we work with our colleagues in the financial sector so that the bad guys aren't able to monetize, aren't able to gain from compromising these systems? And the last thing I'd recommend: I'm a big fan of this approach I talked about earlier called zero trust. It's this journey toward continually evaluating risk, continually evaluating threat, and only providing access when necessary and revoking that access when it's not necessary. I think it's mislabeled, because I do think it's more toward never trust, always verify. We need to evolve to see what our colleagues in the military space and in the more monetized financial space are doing; they're doing better, I believe, than some of the health care organizations, so there are lessons learned there. As we adopt telehealth, AI, care in the home, this hospital in the home, this smart hospital, you know, we can't adopt AI without having some of this groundwork in play, and that's kind of a collective defense. We need good vendors, good suppliers, we need our patients to be well-educated, we need our workforce to be educated, we need our leadership to be well-educated. It's a spectrum of things, Scott, that we can do. Otherwise, we're just going to continue to see ransomware reap the benefit.
And as they pivot to AI and deepfakes, they're just going to continue to create privacy concerns and impact patients.
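The zero-trust pattern described above, continually evaluating risk and granting access only while it's needed, can be sketched as a per-request policy check. The signals, field names, and threshold below are hypothetical, chosen just to show the shape of "never trust, always verify."

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    """Signals evaluated on every request; all fields are illustrative."""
    user_authenticated_mfa: bool     # did the user pass multi-factor auth?
    device_compliant: bool           # is the device patched and managed?
    resource_needed_for_role: bool   # least privilege: does the role need this?
    session_risk_score: float        # 0.0 (benign) to 1.0 (high risk), assumed scale

RISK_THRESHOLD = 0.5  # assumed cutoff for this sketch

def grant_access(req: AccessRequest) -> bool:
    """Zero-trust style decision: no implicit trust, every signal re-checked.

    Access holds only while all conditions hold; any failed check or
    elevated risk score means the request is denied (or access revoked
    on the next evaluation).
    """
    return (
        req.user_authenticated_mfa
        and req.device_compliant
        and req.resource_needed_for_role
        and req.session_risk_score < RISK_THRESHOLD
    )

ok = AccessRequest(True, True, True, 0.1)
risky = AccessRequest(True, True, True, 0.9)
print(grant_access(ok), grant_access(risky))
```

The key design point is that the check runs on every request, not once at login, so access that was granted a minute ago can be revoked the moment any signal degrades.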

Scott Nelson  40:48  
Well that's a great point to conclude our conversation. Esmond, thank you very much for your time and sharing your thoughts and experiences today. I really appreciate it.

Esmond Kane  40:55  
Thank you, Scott. If anybody has any questions or follow up, I'm sure we'll engage on your platform, or they can just find me on LinkedIn, more than willing to have a conversation.

Scott Nelson  41:04  
That sounds great. Thanks so much. 

Thank you for listening to The Risky Health Care Business Podcast. You can listen to all episodes from the resource center page of the SpringParker website, springparker.com, or click the Listen link in the show notes to listen and subscribe for free on your platform of choice. And remember, accelerating health care performance is achieved through creativity, not just productivity.
