Posts Tagged ‘Academics’

Dartmouth Study Finds P2P Networks Hemorrhaging Sensitive Data

While peer-to-peer may be a good metaphor for human interaction – social networking comes to mind – it is not always the best model for sharing sensitive information. Your medical history, for instance, shouldn’t be shared with others on a P2P network. Is this happening? Absolutely. A study presented this week by Professor Eric Johnson of Dartmouth’s Tuck School of Business describes how researchers found mounds of sensitive medical data on popular P2P networks: medical histories, contact information, insurance details, treatment data, diagnoses and psychiatric evaluations – all mixed in with the song and movie downloads that make up most of the traffic on these networks.

So how is this sensitive medical data getting onto P2P networks in the first place? Primarily through an employee’s computer: the employee downloads a P2P application onto her work machine, then uses that same machine to process sensitive medical data. Sometimes the employee takes work home, editing a spreadsheet on her home computer (yes, a hospital-generated spreadsheet containing SSNs and other personally identifiable information for employees was among the documents the Dartmouth researchers found). In both cases, the user configures the P2P application incorrectly, making all of her personal files visible to other users on the network. Once that happens, the data is a prime target for cybercriminals and fraudsters who engage in identity theft. Sensitive medical data is a particularly lucrative prize. As Professor Johnson put it: “For criminals to profit, they don’t need to ‘steal’ an identity, but only to borrow it for a few days, while they bill the insurance carrier thousands of dollars for fabricated medical bills.”
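The misconfiguration described above usually comes down to one thing: the folder the P2P client shares sweeps in far more than the user intended. A minimal sketch of the problem, with entirely hypothetical paths and folder names (real P2P clients each store their shared-folder lists in their own formats):

```python
from pathlib import PurePosixPath

def is_exposed(file_path: str, shared_roots: list[str]) -> bool:
    """Return True if file_path falls under any folder the P2P client shares.

    Illustrative only: the shared-roots list here is a stand-in for whatever
    a real client's configuration actually contains.
    """
    f = PurePosixPath(file_path)
    for root in shared_roots:
        try:
            f.relative_to(PurePosixPath(root))
            return True   # the file sits inside a shared folder
        except ValueError:
            continue      # not under this root; keep checking
    return False

# Hypothetical misconfiguration: the user shared her entire home directory,
# not just a downloads folder, so work documents ride along.
shared = ["/home/alice"]
print(is_exposed("/home/alice/work/patient_records.xls", shared))  # True
print(is_exposed("/media/secure/patient_records.xls", shared))     # False
```

The point the sketch makes is that nothing in the file itself marks it as sensitive; exposure is purely a function of where it lives relative to the shared folder, which is why a spreadsheet of SSNs edited at home can end up visible to an entire P2P network.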

Arguably, this is an area of concern for companies that are covered by HIPAA and deal with sensitive medical data online. But while HIPAA and the FTC’s Health Breach Notification Rule set out requirements for what companies must do after a “breach” of sensitive medical data, they give little guidance on the internal policies companies could implement to prevent such breaches in the first place. Some may view this as a nod to self-regulation, but the truth is there are “best practices” that both HHS and the FTC could endorse. A simple one, addressing the “data hemorrhaging” that Professor Johnson describes in his study, would be an internal policy against the use of P2P applications on machines that also handle sensitive medical data. Another: companies that deal with this type of data should consider partnering with regulators and health care providers to educate patients on the importance of securing their medical data – and on how certain file-sharing technologies, when configured incorrectly, can enable medical ID theft. There is already collateral for such an effort – the FTC’s tips to deter medical ID theft, which could be required patient reading (along with those HIPAA notices).


Academic Survey Shows Public Discomfort with Targeted Ads

September 30, 2009

Evidence of how the average consumer views online privacy is usually absent from the heated debates on this multi-layered issue (for more, see my very first post to the Balancing Act). That may start to change after today. A group of professors from the University of Pennsylvania and the University of California, Berkeley has published the results of a survey of 1,000 adult Internet users, finding that most respondents were not comfortable with Internet marketers gathering data and then using that data to deliver tailored ads – a practice known as behavioral advertising.

At least two of the survey’s results deserve a moment’s pause. First, while over 66% of respondents said they were not comfortable with tailored ads, that number jumped to 86% once respondents learned how marketers gather the data used to serve those ads. Second, even in the era of living life online via Facebook or MySpace, 55% of the young adults (18–24) surveyed were not comfortable with tailored ads – meaning that younger, Internet-savvy users do believe in some notion of privacy online. These two findings alone should concern Internet companies that collect data and use it to deliver tailored ads and content. But the survey should also concern regulators, as the findings signal widespread unease with how products and services are marketed on the Internet today.

Discussing the survey in today’s New York Times, the authoring professors stated that it is “the first independent, nationally representative telephone survey on behavioral advertising.” Hopefully, it will remind lawmakers of the importance of empirical evidence in evaluating the policy issues surrounding targeted ads. And the timing could not be better, as a perfect storm brews around the online privacy issue – federal legislation to be introduced by Rep. Rick Boucher, the FTC’s upcoming privacy roundtables, and recent comments by David Vladeck, the newly appointed head of the FTC’s Bureau of Consumer Protection, that could “upset the online advertising ecosystem.”

Indeed, the price of free web content is often advertising, including behavioral advertising. Will consumers be willing to give up the variety of free content currently available on the web in exchange for websites that do not track their Internet behavior? Will consumers be satisfied with disclosures, or perhaps opt-ins, or is online privacy non-negotiable? Perhaps the next study or survey will shed more light on this Gordian knot of an issue. Let the questions begin.