23andMe, the genetics testing company, is in a state of constant evolution, as you’d expect of any 12-year-old company. But that also means customers need to be aware of how the company is using data they may have consented to share long before anticipating its newer initiatives.
One new tie-up was a particular point of interest here at TechCrunch’s massive Disrupt show, taking place this week in San Francisco. Specifically, CEO and co-founder Anne Wojcicki was asked a series of questions about 23andMe’s pact with pharmaceutical giant GlaxoSmithKline, which announced in July that it had acquired a $300 million stake in 23andMe. As part of the four-year deal, GSK gains exclusive rights to mine 23andMe’s customer data to more quickly and efficiently develop drug targets. Said Wojcicki of the partnership: “If we start with genetics, will we have a higher success rate” when it comes to drug development? (She clearly thinks so.)
As Wired reported last month, this isn’t entirely new terrain for the company. 23andMe has for the last three-and-a-half years been sharing insights gleaned from many of the more than 5 million people who’ve sent their spit to 23andMe. It just used to be that it shared that information with GSK and six other pharmaceutical and biotechnology firms. Now, GSK alone will be able to access what Wojcicki describes as aggregated and wholly anonymized customer information.
Even so, in an age when privacy leaks are rampant, 23andMe customers have expressed some chagrin about the deal, and Wojcicki’s chat today might not assuage them. The reason, as she underscored: 23andMe customers aren’t being asked to opt in to this data-sharing agreement; rather, they are being told they can opt out via email. The move assumes that someone who bought a 23andMe kit years ago will respond to an email from 23andMe that gives them this option, when, let’s face it, many may never even see the email, let alone open it.
What consumers may well like better is the future that Wojcicki imagines for 23andMe, one that focuses not so much on drug development as, perhaps even predominantly, on prevention.
The idea, said Wojcicki, is to rely on 23andMe’s “community” of customers who tell the company “all kinds of things” about themselves — and then potentially figure out connections between these disparate things. “Some people have Crohn’s disease. Some have heart issues. People have everything,” and sometimes all at once, she suggested. Meanwhile, 23andMe is uniquely positioned to help identify links that siloed research cannot.
Relatedly, the company hopes to do more to help its customers manage conditions that they may be prone to develop. If someone appears to have a heightened risk of macular degeneration, for example, 23andMe might suggest that the customer wear sunglasses and take vitamins and get tested as soon as possible. If someone appears to have a heightened risk of developing Parkinson’s disease, the progressive nervous system disorder that 60,000 Americans are diagnosed with each year, 23andMe hopes eventually to be helpful in preventing or mitigating the outcome of the disease, she said.
What 23andMe will never do, said Wojcicki, is work with police departments to help them identify any of its customers. As she explained it, 23andMe requires “a lot of saliva from customers specifically for privacy issues.” (She noted that a smaller amount — such as drool that might escape the mouth of someone who’s asleep — isn’t sufficient.) 23andMe also prevents people from uploading data from outside sources in order to try to make connections, as happened in the case of the so-called Golden State Killer, wherein investigators used an open-source genetic database, GEDmatch, to explore family trees and see whether any contained matches to DNA samples from the crime scenes they studied.
It worked. The killer was caught. But 23andMe has a moral obligation only to its customers, she said. When law enforcement knocks, said Wojcicki, “We say no.”
You can check out the full discussion here.