Monday, March 26, 2012

Lessons from female crash test dummies

The Washington Post had an interesting article yesterday about the use of female crash test dummies in safety testing of 2011 cars. These dummies, which are lighter and smaller than the male dummies used in the past, provide very different safety data. Some cars tested with the female dummies now have lower safety ratings than they had with male dummies ("Female dummy makes her mark on male-dominated crash tests").

Women drive less than men, but because they tend to be smaller than men, they are more vulnerable to injury in car crashes, writer Katherine Shaver points out. They are also more likely to be in the passenger seat than in the driver's seat when traveling with others, which increases their risk of injury and death.

Despite these gender differences, the National Highway Traffic Safety Administration has in the past used only male dummies to create crash test ratings for cars. The story about the use of the new female dummies in the 2011 ratings is worth reading to better understand the true risks of driving for women, risks the auto industry can address with improved safety features.

Thursday, March 15, 2012

How mobile phones improve public health

Smartphones and cell phones are increasingly playing a role in public health. Combine a smartphone with crowdsourcing and infectious diseases, for example, and you end up with a novel concept called participatory epidemiology. The Outbreaks Near Me app for iPhone and Android, released during the 2009 swine flu epidemic, collects data from users and from the media to track the flu and other diseases in every part of the country. Install the app and you can see what disease might be tearing through your city right now (in my neck of the woods, it's the highly contagious Norovirus).

Mobile phones are also used to manage the response after disasters and disease outbreaks. An interesting article about participatory epidemiology, written by the creator of Outbreaks Near Me, cites applications such as Ushahidi, used to manage the response after the Haitian earthquake in 2010, and GeoChat, used to track diseases in Southeast Asia (you can read about the pros and cons of participatory epidemiology in the article "Participatory Epidemiology: Use of Mobile Phones for Community-Based Health Reporting").

For health care providers in resource-poor countries, tracking data on a smartphone is actually easier than tracking it on paper, according to a study released by the CDC this week ("Smartphones more accurate, faster, cheaper for disease surveillance"). In the study, providers collected data for questionnaires about influenza in Kenya using either smartphones or paper forms. The smartphone data was more thorough and, since it was digital, quicker to access in a database than data recorded on paper. What's more, the two-year cost of implementing smartphones to fill out these questionnaires was lower than the cost of collecting the data on paper.

More widespread collection of health care data can serve the public good. Perhaps that's how health care data should be seen: as a public asset that we can all contribute to and that we all benefit from. As more health care data is digitized for easier access by public health officials and other entities, however, it becomes increasingly important to protect the privacy of each contributor.

Friday, March 2, 2012

Should you check your DNA?

In medicine, aggregated personal health data, including genetic information, is extremely valuable, even lifesaving. Researchers can analyze data from electronic medical records to find patterns of disease, see how well treatments work across broad populations, and pinpoint risk factors for a variety of ills (identifying details for each patient, such as name and address, are blocked to protect the patient's identity).

Some organizations also gather and analyze genetic information to combat specific diseases as a public service. One such organization is the bone marrow donation site DKMS Americas, which collects genetic information from volunteers to create a database of potential donors for cancer patients. This service has saved and improved many lives over the years.

But what happens when the analysis of genetic information becomes a consumer product? The testing company 23andMe, for example, collects a DNA sample from you and analyzes it to tell you your genetic heritage and your risk for various diseases. The wellness company GeneMe collects DNA to create personalized supplements to improve your health. And the list goes on. These companies encourage customers to use their knowledge of disease risks to make healthy lifestyle choices that might reduce those risks.

In an "Ethicist" column in last Sunday's New York Times about DNA testing companies ("Sizing Up the Family Gene Pool"), however, Ariel Kaminer explains that consumers don't always have control over or knowledge about how their genetic information is used:
Many of these companies do use customers' data for medical research or commercial applications, or they sell it to third parties whose interests you might never know. Legally they can't do that without your consent, but the fine print on those consent forms goes by so quickly that it can be hard to follow.
Kaminer also points out another problem: the results of these tests can be wrong. "A Government Accountability Office investigation into so-called direct-to-consumer genetic testing found inaccurate results and exaggerated claims about how much those results could really tell you," she writes.

Your genetic information is legally protected by the Genetic Information Nondiscrimination Act of 2008 (GINA). The law prohibits employers and health insurers from discriminating against individuals based on their genetic information, which includes the results of genetic tests and the presence of genetic diseases in that person or their family members.

But consumer genetic testing is new territory. It's true that this genetic information might be useful to the consumer, and that the consumer is legally protected against discrimination. But what if the information is leaked or hacked, or misused in unforeseen ways or by new, yet-to-be-invented technologies? The recent furor over Google's privacy policy demonstrates that we don't always understand what we're giving up when we use new technologies.