With adoption rates accelerating, tech companies are increasingly deploying new solutions for wearable technology. Some are learning, however, that privacy considerations may be lurking in unexpected places.
When deploying Epic's MyChart app for Apple Watch, for example, Nebraska Medicine recently learned that Siri’s voice capability had to be deactivated to avoid potential HIPAA violations, according to an interview with Healthcare Info Security.
Another privacy-related concern with wearable technology is the ability of technology companies to buy and sell sensitive user information. Because companies that make wearable technology are often not "covered entities" under HIPAA regulations, they are not bound by the same rules against disclosure as traditional health care entities and their business associates. As a result, companies such as FitBit may use data however they wish, so long as the use is covered by the user agreement, which consumers often disregard. According to SearchHealthIT, a report by Privacy Rights Clearinghouse found that more than half of the surveyed fitness apps shared highly sensitive health data with third-party analytics services, which may link health data to other identifying information about the user.
Wearables may also transmit user-generated data without encryption, putting consumers at risk for identity theft, profiling, stalking, or even extortion. Recently, Symantec, a tech security company, found that one out of five tracking apps transmitted unencrypted user-generated data, according to USA Today.
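For developers, the unencrypted-transmission problem above is avoidable at the application layer by refusing to send data over plaintext HTTP at all. The sketch below is a minimal illustration, not any vendor's actual code; the function name `send_health_metrics` and its parameters are hypothetical.

```python
import ssl
import urllib.parse
import urllib.request


def send_health_metrics(endpoint: str, payload: bytes) -> None:
    """Transmit wearable data only over an encrypted (HTTPS) channel.

    Hypothetical example: rejects plaintext HTTP endpoints outright and
    verifies the server's certificate before sending anything.
    """
    scheme = urllib.parse.urlparse(endpoint).scheme
    if scheme != "https":
        raise ValueError(
            f"refusing to send health data over {scheme!r}; HTTPS required"
        )

    # ssl.create_default_context() enables certificate and hostname
    # verification, so the payload is encrypted in transit and the
    # server's identity is checked.
    context = ssl.create_default_context()
    request = urllib.request.Request(endpoint, data=payload, method="POST")
    urllib.request.urlopen(request, context=context)
```

A hard failure on non-HTTPS endpoints is deliberate: silently falling back to plaintext is exactly the behavior Symantec's finding describes.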
When deploying new technology, remember to consider functionality that may extend beyond the bounds of your own product or service, and how that might affect privacy and data security.