medical devices. As the business world must evolve or die, so must software evolve within those paradigms. To that end, software security has often taken a back seat to other disciplines. There is an old joke, which has evolved since 1997, comparing the reliability of Windows to that of a car.41 The joke is much more involved than that, but it serves to illustrate the problems with software development even back then, not just from a security standpoint but from an overall operability standpoint.
Automobile engineers must meet high safety standards. People can die if, for example, a car's brakes fail. With software, flaws have historically been acceptable because, generally speaking, lives are not on the line. The discipline never developed the kind of rigor typically found in automobile engineering. In many cases, software development is more art than science because there are numerous ways to achieve the same objective. The end result is systems that are riddled with vulnerabilities. Windows 10, at the time of this writing, has 1,111 known vulnerabilities.42 The more complex a system is, the more likely there will be risks associated with those vulnerabilities.
What is almost as concerning as the number of vulnerabilities is how we got here. In 2016, GlobeNewswire published the results of a CloudPassage study showing that universities in the United States were failing at cybersecurity. The key findings of the study may be jaw-dropping for the uninitiated, but they are not surprising for those in the cybersecurity profession. The most startling finding was the almost complete absence of security coursework required by computer science programs for graduation. Of the top 36 computer science programs (according to U.S. News & World Report), only one required a cybersecurity course. Of Business Insider's top 50 programs, only three required cybersecurity coursework to graduate.43
When it comes to cybersecurity, quite often universities are not the place to get that education. People walk out of school barely cybersecurity literate, yet eager to start building IT systems. How secure do you think those systems will be if no one has taught their builders how to make them secure? While there are certainly exceptions, and some schools do offer degree programs in cybersecurity, the overall picture reveals an extreme deficit in education about the methods for protecting organizations. People who are interested in cybersecurity need to learn on the job, attend a very specialized school, or pursue cybersecurity certifications.
From a software development perspective, organizations need to supplement their workers' understanding to get them on board. Further, the lack of cybersecurity education contributes to a lack of understanding of cybersecurity within organizations. That, in turn, affects organizational culture and ultimately the organization's cybersecurity posture. Only companies with strong regulatory requirements, or those that have gone through a breach, feel that they need a team to get them up to speed. Owing to this cybersecurity illiteracy, some cybersecurity requirements may even appear bizarre.
We could stop here, but the story is really more complex than that. Todd Fitzgerald, in his book CISO Compass, brilliantly lays out the course of cybersecurity over the last 30 years. He points out that from the 1990s to 2000, security was seen as an IT problem: basically login security, passwords, antivirus, and the firewall. From 2000 to 2004, as regulations began kicking in, we started to see more regulation-focused security practitioners and an emphasis on the regulatory landscape. From 2004 to 2008, there was a turn toward a more risk-oriented approach to information security, and from 2008 to 2016 the focus shifted toward understanding threats and the cloud. After 2016, privacy and data awareness became the next stage of security's evolution.44
Having worked in IT and security for roughly 25 years, I have seen firsthand the evolution that Todd Fitzgerald is talking about. All of these are extremely valuable insights that demonstrate the growth and change within the information security landscape. Given that company culture can take up to five years to change45 (depending on the organization's size and the speed of cultural change), businesses are often well behind the curve of what they ought to be doing from a cybersecurity perspective. That said, sustained momentum is required to make the appropriate changes. Obviously, in areas of resistance or where there is a lack of knowledge, such as software development, changes can take quite a bit longer.
Another side of the equation is business perception of value. When IT was an up-and-coming phenomenon, many businesses perceived it as a cost center. They did not want to put the time and effort into supporting the men and women in that department. IT is now seen as a business enabler. In the eyes of some businesses, information security still has one foot in the cost-center arena. Higher-risk and more highly regulated businesses elevated security to business-enabler status more quickly, partially because it was one: various vendor risk programs required that security be heightened before business could commence. In heavily regulated environments, data breaches often cost more, so security teams were given more clout to get the work done. They were not merely a cost center; they were protecting the business, and they were seen as business enablers. In these environments, despite the lack of education, organizations are generally able to form stronger cybersecurity practices.
Microsoft spends over a billion dollars a year on security, and some of that is devoted to patching the vulnerabilities in its operating systems.46 Not all companies are as diligent as Microsoft, but that figure demonstrates that even security-conscious companies still find vulnerabilities in their code. In the IoMT world, not all companies will invest in legacy systems to the degree Microsoft does, which can leave vulnerabilities on those systems. Worse, many internet-connected medical devices cannot have security agents installed on them.47 On Microsoft Windows, there are hundreds if not thousands of agents that can be installed to help protect the system. That single fact alone makes protecting internet-connected medical devices more challenging; alternative strategies must often be utilized.
All of these factors influence the security of internet-connected medical devices, but they still do not tell the whole story. Obviously, internet-connected medical devices are shaped by the prevailing culture, but the security behind these devices has often lagged well behind the security of other software. This, of course, does not mean that other software isn't riddled with flaws, but it does mean that security requirements for medical devices have often taken a back seat, if security is recognized at all. Combined with the constant drive to innovate, this only exacerbates the security challenges.
To top it all off, manufacturers are doing what they can to limit their liability in case things go awry. In March 2019, Bethany Corbin brilliantly and clearly elucidated the legal challenges affecting internet-connected medical devices:
“Compounding this issue of non-regulation and insecure code is the lack of a comprehensive and workable liability framework for consumers to follow if their IoMT device malfunctions or is hacked. The application of tort principles to evolving IoMT devices is imperfect, creating challenges for plaintiffs seeking damages. In particular, the numerous actors in the IoMT supply chain make it difficult to apportion liability, with no clear boundaries establishing which party is at fault for a hack or breach. Similarly, defect-free software does not exist, which complicates the application of strict products liability. Further, end-user licensing agreements contractually limit manufacturer liability for defective devices, shifting the risk of harm to consumers and eliminating manufacturer incentives to comply with cybersecurity best practices.”48
In Summary
In the end, we have a tremendous