applications designed to make you healthier. What is interesting is that many fitness devices are eerily similar to IoMT devices: they collect many of the same types of data, in many cases using the same types of technology. By all considerations, many of these devices are collecting HIPAA-like data, but that data is not considered HIPAA data because it is not created by a covered entity. A covered entity, as defined in the HIPAA rules, is a health plan, healthcare clearinghouse, or healthcare provider. Covered entities are beholden to HIPAA and its strong privacy and cybersecurity requirements. Data from health devices, despite its similarity to health data, does not carry the same privacy or cybersecurity requirements. Data from health and fitness applications often includes a great deal of additional information about you, such as where you are, where you have been, and personal details such as your address. The price of these "free" applications is that you give up healthcare-like information about yourself.
A challenge with many of the health applications on the market is that some provide health advice without sufficient science behind them to back up their claims. The iTunes and Google Play stores host more than a hundred thousand health applications. There have been numerous fines against these companies, but given the relative ease of building apps and attracting downloads, keeping track of them all and determining which are legitimate becomes an almost impossible task. An unsubstantiated claim may ultimately harm some people. The FDA has made recommendations for companies and individuals who develop these applications, but not everyone follows those recommendations.
There are also a host of companies that map your family tree based on some personal information and your genetic information. Today, that information can reveal a tremendous amount about you. While not all genetic tests are equal, generally speaking, genetic testing can tell whether you have a genetic predisposition to specific diseases. The FDA prevents these companies from doing any kind of diagnostics, however.8 What these companies do is reference key information against publicly available databases, some of which contain incorrect information. In the end, from a disease standpoint, the tests have only a 40% efficacy rate.9
Like fitness devices and applications, direct-to-consumer genetic testing is not covered by HIPAA. In many cases the data is the same as HIPAA data, but because it does not come from a doctor or a hospital, it is not afforded the same protections. The data walks like a duck. It quacks like a duck. It is a duck, but it does not have the same security considerations as the other ducks because it did not come from a doctor or hospital.
It should be pointed out that just because the data is not HIPAA data does not mean the data is not sensitive. Additional information like name, address, and phone number is sensitive; it is considered personally identifiable information (PII). PII is essentially information that can help identify someone, such as a Social Security number. In the United States, PII must be protected, but the protection requirements for PII are much less stringent than those for Protected Health Information (PHI). PHI is the data protected under HIPAA. It includes PII, but also the health information created or received by covered entities. In the case of fitness devices and genetic ancestry testing (not performed under a covered entity), the data is PII, but it also contains health data that is not governed by HIPAA. Oftentimes, that means the data is less secure.
But the story of this non-HIPAA medical data does not end here.
Data Brokers
In today's data-centric world, data is the new gold. Perhaps nowhere is that truer than with data brokers. In the simplest terms, data brokers source and gather data and then resell the most valuable parts. Today, some extremely large and well-known companies, including Oracle, McKinsey, Accenture, and Experian, act as data brokers. It is a multibillion-dollar-a-year industry that is not just growing rapidly from a business perspective, but ballooning along with all the sources of data.
From a health perspective, a large percentage of the data that data brokers hold comes from hospitals, doctors' offices, ACOs, and so on. That data must be anonymized prior to sale. In fact, Health and Human Services (HHS) has a 32-page document stipulating the requirements around de-identifying information.10 Quite often, though, the byproduct of the anonymization process is that age, gender, partial ZIP codes, and doctors' names remain in the data.11 In most cases it then becomes relatively easy to match that data with other large data sources. In fact, that is a challenge for healthcare-adjacent organizations that work with the data: they can and do match unmatched data against alternate sources in order to accurately identify individuals, all in the name of helping people. Many of these legitimate companies do not permanently save that information. That is not to say that data brokers are illegal; they have simply found a way to profit off a loophole in privacy regulation, given more modern capabilities.
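To make that linkage concrete, here is a minimal Python sketch of quasi-identifier matching, using invented records and names; nothing here comes from a real dataset. The idea is that a "de-identified" record that still carries age, gender, and a partial ZIP code can be joined against public records sharing those fields, and a unique match ties a name back to a diagnosis.

```python
# Minimal sketch of quasi-identifier linkage: joining a "de-identified"
# health dataset with a public-records dataset on the fields that
# anonymization commonly leaves behind (age, gender, partial ZIP).
# All records below are invented for illustration.

deidentified_claims = [
    {"age": 54, "gender": "F", "zip3": "064", "diagnosis": "type 2 diabetes"},
    {"age": 31, "gender": "M", "zip3": "100", "diagnosis": "hypertension"},
]

public_records = [  # e.g., voter rolls, property records, people-search sites
    {"name": "Jane Doe", "age": 54, "gender": "F", "zip3": "064"},
    {"name": "John Roe", "age": 31, "gender": "M", "zip3": "100"},
    {"name": "Mary Poe", "age": 54, "gender": "F", "zip3": "104"},
]

def reidentify(claims, records):
    """Match each claim to any public record sharing the same quasi-identifiers."""
    for claim in claims:
        key = (claim["age"], claim["gender"], claim["zip3"])
        matches = [r["name"] for r in records
                   if (r["age"], r["gender"], r["zip3"]) == key]
        if len(matches) == 1:  # a unique match links a name to a diagnosis
            yield matches[0], claim["diagnosis"]

for name, diagnosis in reidentify(deidentified_claims, public_records):
    print(f"{name} -> {diagnosis}")
```

The point of the sketch is that no single field identifies anyone; it is the combination of leftover fields, intersected with an outside dataset, that does the work.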
We, as individuals, in most states do not have the right to prevent our data from going to data brokers, or to delete it once it is part of that data ecosystem. That means our data is often shared without our knowledge with an array of different brokers that can make use of the information in untold ways. In some cases the data really does move science forward, but in other cases it is used for advertising purposes. Each piece of data is valuable in different ways to different organizations.
We are just beginning to learn about that market, but Patientory estimates that healthcare data alone is a multibillion-dollar-a-year market.12 Putting together the full list of your doctor visits, blood tests, prescriptions, IoMT readings, and so on is extraordinarily valuable. But so far we have been talking about data from covered entities. Data brokers also pull data from health applications, fitness watches, and genetic testing. Tie these data sources together and it is a cornucopia of valuable data. In the end, data brokers can gather thousands of data points on any given person. They quietly sell your personal information without any of us being the wiser.
If we sidestep the conversation about HIPAA data and look more keenly at some of the uses of other kinds of data, the story takes an interesting turn. A tremendous amount of the available data is location data drawn from a variety of sources, including loyalty cards, public records, social media posts, cell phone data, and browser behaviors, all combined to build as complete a view as possible of consumers, meaning you and me.13
There are some concerns about the accuracy of the data, however. Harvard Business Review ran a test, gathering data from multiple data providers to assess how accurate it was. The results left much to be desired. For example, identifying whether someone was male was only 42.5% accurate. Age was a little better, at 77% accuracy.14 But what does it mean when inaccurate information is mixed in with personal details such as name, address, and phone number? On the innocent side, you will see advertisements you may or may not be interested in. On the other side, false information can prevent you from getting a job, or worse.
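As a rough illustration of how such an accuracy test can be scored, here is a toy Python sketch that compares broker records against a known ground-truth sample, field by field. All names, fields, and values are invented; the HBR study's actual methodology is not described here.

```python
# Toy sketch of a broker-data accuracy audit: for each field, count how
# often the broker's value matches a ground-truth sample.
# All data below is invented for illustration.

ground_truth = {
    "alice": {"gender": "F", "age_band": "25-34"},
    "bob":   {"gender": "M", "age_band": "35-44"},
    "carol": {"gender": "F", "age_band": "45-54"},
}

broker_data = {
    "alice": {"gender": "M", "age_band": "25-34"},  # gender wrong
    "bob":   {"gender": "M", "age_band": "35-44"},  # both right
    "carol": {"gender": "F", "age_band": "55-64"},  # age wrong
}

def field_accuracy(field):
    """Fraction of people for whom the broker's value matches reality."""
    hits = sum(1 for person, truth in ground_truth.items()
               if broker_data[person][field] == truth[field])
    return hits / len(ground_truth)

for field in ("gender", "age_band"):
    print(f"{field}: {field_accuracy(field):.1%} accurate")
```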
To address this, the Federal Trade Commission (FTC) is studying the harm that can befall consumers as a result of bad data. Some of the harms it has noted include predatory pricing and racial profiling. Many key companies change their prices depending on individuals' locations or their racial and ethnic backgrounds.15 In another case the FTC cites, Google allowed illegal pharmacies to target users in its search engine, for which it paid a $500 million civil forfeiture.16 The FTC even goes so far as to say that big data helped facilitate the subprime mortgage crisis in the mid-2000s.