Hackers Steal 600K Records from Health Care Firms – Could Your Wearable Device Be Next?

Security firm InfoArmor published a report in late July 2016 stating that a group of attackers infiltrated American health care institutions, stole at least 600,000 patient records, and attempted to sell more than 3 terabytes of associated data.  In an interview with eWeek, InfoArmor chief intelligence officer Andrew Komarov noted that the hackers he investigated were able to compromise a range of health care institutions, including private clinics, vendors of medical equipment, and suppliers.  Once inside the compromised systems, the hackers were able to take personally identifiable information and medical data, including imaging data (an example of which was provided in InfoArmor’s July 2016 report).

Komarov’s research should come as no surprise in view of a May 2016 report by the Brookings Institution finding that 23% of all data breaches occur in the healthcare industry.  In fact, nearly 90% of healthcare organizations had some sort of data breach between 2013 and 2015, costing the healthcare industry nearly $6.2 billion.

According to a report by Bloomberg BNA, while a number of legal mandates exist (e.g., the Health Insurance Portability and Accountability Act (HIPAA), the Health Information Technology Certification Program, and the Food and Drug Administration’s (FDA) premarket review), the existing guidelines are limited.  Furthermore, medical devices face certain unique cybersecurity pitfalls.  For example, while HIPAA protects health information regardless of where it is stored, protected health information residing on medical devices that have been disposed of or are no longer functional can be overlooked.

Connected medical devices (i.e., medical devices that can transmit information through the internet or a networked system) also pose unexpected risks and challenges.  For example, the ability for hackers to remotely access connected medical devices can hypothetically result in significant threats to patient health and safety.  A 2012 episode of the television show Homeland featured a character hacking into and manipulating the pacemaker of the fictional vice president.  While such situations seem far-fetched, in an interview on “60 Minutes,” it was revealed that Vice President Dick Cheney’s doctor had actually disabled the wireless functionality of his heart implant, fearing that it might be hacked in an assassination attempt.

While such fears may seem fueled by paranoia, recent studies have shown that such security threats may be a real concern.  Bloomberg Businessweek reported in November 2015 that the Mayo Clinic engaged a number of high-profile “white hat” hackers to conduct a study of cybersecurity vulnerabilities in its medical devices.  These “white hat” hackers worked on a number of different medical devices, including cardiac monitors, infusion pumps, and hospital beds.  In one alarming example, a hacker was able to gain control of an infusion pump (the Hospira Symbiq Infusion System) and remotely cause it to deliver a potentially lethal dose of medication.  Shortly thereafter, the FDA issued a safety notice recommending that health care facilities stop using the pump and transition to alternative infusion systems.

With increasing concerns about cybersecurity, as discussed on this blog previously, the FDA is currently seeking comment on proposed guidelines that outline when software changes to medical devices would require manufacturers to submit a premarket notification.

Before a medical device can be sold on the open market, manufacturers are required to demonstrate to the FDA that the product is safe and effective.  However, subsequent software updates may introduce new cybersecurity risks or alter the device’s functionality.

The current draft provides an infographic flowchart of six questions, as well as a series of examples, to help manufacturers and FDA staff determine whether a notification is required.  Criteria that would trigger a new submission include, for example:

  • changes that introduce a potentially hazardous situation
  • changes that modify or create a risk control measure
  • changes that affect the functionality or intended use of the device.

But even with the FDA guidelines in place, studies have shown that simply bolstering cybersecurity on medical devices does not eliminate all concerns regarding the protection of medical data.

In an increasingly wireless world, an individual’s biological fingerprint can be taken without the individual’s knowledge.  Slate.com recently reported on the potential data-mining implications of wearable devices such as the Fitbit or Jawbone.  While much of the data appears innocuous at first glance, when aggregated, such data could be used in unexpected, and potentially concerning, ways.

For example, a study recently conducted in Sweden and published in 2015 correlated a low resting heart rate with the propensity for violence.  While the conclusions of that study have not been validated, if, hypothetically speaking, a connection is confirmed in the future, such innocuous data could have serious implications.  As noted by John Chuang, a professor at Berkeley’s School of Information and the director of its BioSense lab, this data could subsequently be cross-indexed and introduced into algorithms to profile or convict individuals.

Furthermore, given society’s entrenched social interpretations of what a “faster heart rate” means, incorrect assumptions could be made that an individual is lying or nervous.  In one recent study currently undergoing peer review, participants in a trust game were found to be less likely to cooperate with a partner who had an elevated heart rate and more likely to attribute a negative mood to that individual.

Such wearable data is also not protected by HIPAA or regulated by the FTC.  According to Slate.com, while wearable companies assure users that they are doing everything possible to protect accumulated data, encryption and protective algorithms cannot ensure the security of data transmitted over Wi-Fi.

Research has also shown that, in the near future, companies may be able to collect biosensing data from a user without any wearable device at all.  For example, researchers at MIT have found that they can detect heart rate and breathing information remotely with 99% accuracy simply by reflecting a Wi-Fi signal off an individual’s body.

As technology continues to develop to benefit our health and make our lives more convenient, we will be faced with new and unique privacy and safety concerns that raise the question of what we are willing to give up in the name of technological development.

As Professor Chuang notes, “In the future, could stores capture heart rate to show how it changes when you see a new gadget inside a store?  These may be things that you as a consumer may not be able to opt out of.”

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Knobbe Martens | Attorney Advertising

Written by:

Knobbe Martens
