Sea turtles: Nature’s smartphone

The recent publication of the green sea turtle genome brought back memories of my first encounter with baby sea turtles, and of one of the most fun and astonishing examples of hardwired behavior I know.  I had the good fortune to travel to Heron Island in the Great Barrier Reef during my undergraduate days.  We were there in March and April, the Down-Under fall, when the sea turtles hatch and begin their journey to–well, to be eaten, frankly, although a few percent do make it to adulthood.

The first nest that I saw hatch emerged in the afternoon on a rainy day.  Normally sea turtles come out of the nest and make their way to the water at night, using cooling temperatures as a cue.  Rain cools the sand and fools them into thinking night has fallen, so this hatch was making its break in broad daylight.  We watched the baby turtles–each no bigger than a large kiwi fruit–flop awkwardly towards the water, alternating their flippers and dragging themselves forward.  We shooed away the seagulls, who watched us with petulant expressions, since the normal fate of a baby sea turtle hatching during the day is a short trip down a seagull’s throat.

But none of this is the behavior I was thinking of.  Rather, the amazing thing is what happens when you pick up a baby sea turtle.  Once the pressure of the sand beneath its belly is gone, the motion of its flippers magically switches from alternating to synchronized, both flippers flapping in unison like the oars of a sculler on Lake Union.  Even more remarkable is what happens when you tilt the turtle from side to side, or front downward or upward.  The rear fins, useless on land, become rudders, turning in just such a way as would correct the turtle’s posture in the sea so as to keep it level and moving straight ahead.

Tilt the turtle head down, and both rear fins bend up.  Tilt it to the left, and the left rear fin bends down and the right fin bends up, while one front flipper stops moving and the other churns frantically.  The newly hatched sea turtle needs to get out to sea fast, to avoid the predators in the reef along the way, and its coloring (dark on top, light on bottom) is optimized to make it difficult to see as long as it remains level.

This is what evolution does.  It takes random variation and selects for those combinations that lead to reproductive success.  If those variations have their root in genes, those genes get passed along.  I’m curious to see if anyone tackles the question of how the turtle keeps steady, now that we have the tool of its genome to help.  I’d love to learn what goes into hardwiring this kind of behavior.  And who knows?  Maybe nature’s figured out some tricks that the cellphone makers don’t know.

I-5 as a metaphor for targets in drug development

All opinions are my own and not necessarily those of my employer.

Yesterday a FedEx truck heading south on I-5 jackknifed and crashed as it passed Seattle’s downtown.  Traffic backups spread for miles north, and spilled onto all the other routes leading from the north of the city.  Tens and maybe hundreds of thousands of people had their days disrupted.  My commute, which should have taken 15 minutes, took most of an hour.  It occurred to me, as I sighed and listened to KUOW, that the situation was a nice metaphor for how I’ve been thinking about drug development lately.

We work to discover targets that will have a substantial effect on human biology, hopefully in the direction of improved health.  But health is a complex phenotype, arising from a network of interactions at all kinds of levels–molecular, cellular, physiological.  One thing we know from network theory is that interconnected networks are stable and redundant.  Seattle, or any city, is also a network of networks, and for the most part damaging one part of the network (pothole repair in Ballard, say) might cause some local effects but no real change to the overall phenotype. But there are a few nodes, like I-5, that can have a substantial effect on the whole when something happens to affect them.

I’m starting to think of drug targets this way.  We talk about finding better targets with fewer side effects, but I wonder if that’s possible.  It’s kind of a yin-yang thing.  Any gene whose disruption has a large enough effect on the networks, and subsequently the phenotype, will by its nature have multiple effects.  I’m probably wrong, but it will be interesting to see what new drugs and new approaches come out in the years ahead.

Greenhouse Gas Is Changing Ocean Ecosystems


This article originally appeared in Real Change in 2005.  I wrote it about one of my main concerns regarding climate change.   


The oceans have buffered the effects of man-made carbon dioxide in the atmosphere, but at a potential cost to the organisms living in the oceans’ upper layers. Scientists at the Pacific Marine Environmental Laboratory on Sand Point Way are part of an international team that discovered that half of the carbon dioxide produced by human industry has ended up in the oceans instead of remaining in the air. They reported their findings last year in the journal Science.

Their research represents the culmination of a 15-year effort to measure and interpret the role of the ocean in the global carbon cycle.

Over the past two centuries, although the amount of man-made carbon dioxide in the atmosphere has steadily increased, only about half of the expected increase was seen. Where the other half went was unknown.

These studies represent “the first time we’ve taken direct measurements to show that the oceans take up man-made carbon dioxide,” says Dr. Chris Sabine, one of the primary authors on the reports. Dr. Richard Feely, another of the lead authors, adds: “These numbers are used to constrain the global carbon cycle models. We need to have these constraints to know if the models are working properly.”

Building accurate models for the movement of carbon dioxide is of particular importance because carbon dioxide acts as a greenhouse gas. As the amount of carbon dioxide in the atmosphere increases, more of the heat from the sun is trapped near the earth’s surface, potentially leading to an increase in average temperatures around the world.

This research also demonstrated the effects of carbon dioxide uptake on the oceans themselves. “People seem to have very strong feelings about global warming,” says Sabine. “But whether you believe in global warming or not, we are adding huge amounts of carbon dioxide to the atmosphere, and that is measurably changing the chemistry of the oceans.”

Carbon dioxide’s potential to affect the environment was recognized over a century ago by the Swedish chemist Svante Arrhenius in 1896. Since then, scientists have struggled to accurately measure and model the global carbon cycle—the movement of carbon dioxide into and out of the many components of the environment such as the forests and the oceans, as well as the man-made inputs from burning fossil fuels, cutting down forests, and producing cement.

Carbon dioxide also deserves particular attention because it has an extremely long retention time in the atmosphere. Once it is released through a process like deforestation, it takes thousands of years for an ecosystem to re-absorb it.

The international team, a coalition between two consortia—the World Ocean Circulation Experiment (WOCE) and the Joint Global Ocean Flux Study (JGOFS)—measured carbon dioxide levels in ocean waters across the globe and at several depths. The compiled measurements demonstrated that the surface waters of the oceans show a net uptake of about 118 billion metric tons of carbon from the air over the past 200 years.

“The surprise was not that the carbon dioxide was there, but how much,” says Feely.

At the same time, the absorbed carbon dioxide is changing the chemistry of ocean waters and jeopardizing some inhabitants’ survival.

As carbon dioxide is absorbed by the upper layers of the ocean, it causes a drop in ocean pH. As this happens, “all organisms that form calcium carbonate shells and skeletons, from [some types of plankton] to the coral reefs—all of these species will have a harder time producing calcium carbonate,” says Feely.

Several studies under controlled laboratory conditions have demonstrated how, when ocean conditions change due to carbon dioxide uptake, marine organisms produce less shell or skeleton material. These experiments suggest the potential for large effects on marine ecosystems.

The actual pH changes are small; according to Sabine, the ocean surface pH has dropped about 0.1 pH unit.
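To put that 0.1-unit drop in perspective (my own arithmetic, not the researchers’): pH is a logarithmic scale, so even a small drop corresponds to a substantial rise in hydrogen ion concentration.

```python
def h_ion_increase(delta_ph: float) -> float:
    """Fractional increase in hydrogen ion concentration for a pH drop of delta_ph.

    pH = -log10([H+]), so a drop of d units multiplies [H+] by 10**d.
    """
    return 10 ** delta_ph - 1

# A 0.1-unit drop in surface ocean pH:
increase = h_ion_increase(0.1)
print(f"{increase:.0%}")  # prints: 26%
```

In other words, a drop that looks tiny on the pH scale means roughly a quarter more hydrogen ions in surface waters.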

“If anyone’s ever had a fish tank, you know pH is very important to control,” says Sabine. “If you let pH get out of control, all the fish will die.”

While the changes in the oceans’ chemistry should not be that extreme, many of the species that may be affected by reduced pH are at the bottom of the food chain. Changes in these populations could therefore have far-reaching effects.

Feely sees greater problems in the future, given predictions that the amount of carbon dioxide in the atmosphere will likely more than double to 700 to 800 parts per million by the end of the century if changes are not made.

“You would see very significant changes to surface ocean chemistry,” he says.

Some policy makers appear to have noticed. Senator John McCain invited Feely to testify before the Senate Committee on Commerce, Science and Transportation in September of 2004. Sabine described the senators as “very interested” but was unsure about the policy impact the testimony would have.

Feely believes there is a need to decide what to do about man-made carbon dioxide emissions as soon as possible. Carbon dioxide will remain in the atmosphere for millennia, even with a decrease in man-made emissions. “What we do over the next hundred years will affect man over the next several thousand.”

Some thoughts in response to an article on why drug development is so hard

This comment was originally posted in response to an article by David Shaywitz at Forbes.  I hope to eventually expand on these ideas in a later piece.

Thanks for the interesting take on the pharmaceutical industry and the problems of finding truly new and innovative drugs. The reasons you put forward are, I think, a large part of the problem. Biology is hard, and we’re discovering just how hard it really is. Just as an example, something like ENCODE comes out, one learns about the vast amount of transcription going on in the genome across the many different cell types profiled, and one realizes there is no clear way to make sense of all of it, or even to know how much of it actually means something and how much of it is noise. As other examples of how hard it can be, some of the other commenters have pointed out the inherent unpredictability of biology, including the lack of translation from controlled, in vitro results in a dish to organismal biology, and also the complexity of a system with billions of moving parts.

While I’m still optimistic about Systems Biology approaches, I don’t have much faith in top-down engineering models employing circuit diagrams and differential equations. Systems-level measurements have shown how much variability there is among individuals in things like transcripts, proteins, metabolic rate, etc. And yet at the same time organisms generally function despite undergoing what amounts to a complete rewiring every generation due to genetic recombination and gamete fusion. It may be that a better understanding will come from studying not the specifics but the generalities of systems that allow them to remain stable despite the diversity of all the parts: how evolution has solved the problem of balancing stability with variation.

Another aspect of the problem facing pharma today I think has to do with being victims of our own success. Many of the biological approaches of the last century, including the phenotypic screening mentioned earlier, helped illuminate many of the major pathways and a lot of the key regulators in human biology. That wave of information helped inform the highly successful drugs developed in the 80s and 90s. However, once you have a decent drug, it’s difficult to go one better. Improving on a statin is hard. Anything obvious, with a big effect on biology (and, it must be noted, big side effects) has probably already been found.

I’m reminded of the metaphor of the adaptive landscape from evolutionary biology. The concept is that a given phenotype of a species occupies some position on a landscape consisting of peaks and valleys, with peaks representing local maxima for fitness, and valleys representing poor fitness. In evolution, favorable mutations allow some phenotypes to move up the side of a peak, approaching ideal fitness. I sometimes think of drugs today as occupying a similar fitness landscape with peaks representing diseases and many of our existing drugs positioned near their respective therapeutic peaks. Once you start moving up a given peak, it’s progressively harder to make a change that will move you closer to the top as opposed to sideways or backwards. So you get Zaltrap and Avastin.

The way out is to change the landscape itself. To stretch a tortured analogy further, your comment, David, about possible new therapeutic modalities could be likened to the development of the first feathered, gliding dinosaur. Suddenly the adaptive landscape changes, shuffling the peaks and valleys so that they can now be exploited in different ways. Maybe a radically different kind of drug, like siRNAs once were thought to be, could completely revamp the drug development space. Fundamental insights into biology might do the same.

Disclaimer: all opinions are my own and do not necessarily reflect those of Novo Nordisk.

Sequencing Seattle: An Idea for the Future of the Northwest

This post originally appeared on April 3, 2013 in Xconomy.  The views presented are solely those of the author and do not necessarily reflect those of Novo Nordisk.



Let me make a modest proposal:  Seattle should commit to sequencing and interpreting the genomes of every willing member of its population and should do it within the next five years.  This program would elevate Seattle to the forefront of personalized genomic medicine, leverage many advantages unique to our area, and create a vibrant and sustainable economic engine that will drive our region for decades.


I can hear the howls of protest now, especially from the halls of the City Council and the mayor’s office. Seattle faces a lot of challenges.  Like most cities, Seattle is staring at revenue shortfalls, relatively flat growth projections, and a host of problems in environment, public health, education, social services, and infrastructure.  These are all pressing, immediate needs.

But at the same time, focusing too narrowly on the short term and not investing in infrastructure leaves a city’s continued relevance and growth at risk.  And while the planned waterfront tunnel is an example of a commitment to physical infrastructure, I’m talking about a similar commitment to infrastructure that supports, enhances and grows human capital, which I think will be at least as important for Seattle’s future.  Committing to genome sequencing is that kind of infrastructure, and when combined with the business, research and health synergies that would come out of such a commitment, this path makes a lot of sense.

Next-generation DNA sequencing is a paradigm-shifting technology.  It’s analogous to how the creation of the integrated circuit in 1958 led to the personal computer revolution via exponential growth in processing power.  Biomedical research today is undergoing radical change due to next-generation sequencing platforms that started with Roche’s 454 in 2005 and continue today with platforms like Illumina’s HiSeq, Life Technologies’ Ion Proton, and others.

Over the past five years the cost of DNA sequencing has dropped at a rate that outpaces Moore’s law.  The original human genome draft sequence cost roughly $2.8 billion and was announced in 2001 after a decade of work.  Today, a human genome sequence can be had for about $5,000 and will arrive in as little as a couple of weeks.  Most experts in the field expect the cost of a genome sequence will drop below $1,000 in the next 3-5 years, and there’s no reason for the drop in cost to end there.  The $100 genome will happen.
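Just how much faster than Moore’s law?  Here’s a quick sketch of the arithmetic, using the figures above and assuming (a simplification on my part) a steady exponential decline from the draft genome to today:

```python
import math

def halving_time_months(cost_start: float, cost_end: float, years: float) -> float:
    """Average time for cost to halve, assuming a steady exponential decline."""
    halvings = math.log2(cost_start / cost_end)
    return years * 12 / halvings

# From the ~$2.8B draft genome (2001) to a ~$5,000 genome twelve years later:
t = halving_time_months(2.8e9, 5e3, 12)
print(f"cost halves roughly every {t:.1f} months")  # prints: cost halves roughly every 7.5 months
```

Compare that to Moore’s law, where transistor counts double (that is, cost per transistor halves) roughly every 18-24 months.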

By committing to sequencing everyone, Seattle will realize several key benefits.  On an immediate level, a large part of the cost can go right back into our community.  I’ll admit $100 per genome is a bit disingenuous.  The real cost of genome sequencing is not the outlay for the sequences themselves but rather the cost of storing, analyzing, annotating and communicating the results of the raw data.  A commitment to sequence everyone in Seattle would require an additional, greater outlay of funds for analysis, and that money could funnel into our local institutions.

Few places in this country can match the UW Genome Sciences Department for skill in genome analysis, and none is better.  The department would be the logical and (I expect) willing partner for building and maintaining the analysis pipeline.  Likewise, the Seattle area is home to several companies, including Microsoft and Amazon, that would make natural and highly interested parties for storing data, assisting in the annotation, and providing platforms to help standardize and democratize giving data back to Seattle’s residents in easy-to-use, easy-to-understand, and portable online tools.  Imagine the cocktail party chatter when you whip out your genome app.

A commitment to sequencing Seattle would also provide an irresistible draw to all kinds of smart, innovative, ambitious professionals, such as bioinformaticists, genetic counselors, database and internet security specialists, clinicians, project managers, biomedical researchers, and public health administrators, as well as people who will create the occupations we don’t even know of yet, that will be needed to deal with this kind of data and infrastructure.  Economies aren’t driven solely by commodities and resources.  Intellectual and human capital will only grow in importance.

Seattle is already one of the more popular destinations for people looking nationally for a place to live and build a career, and it’s not just because of our great weather.  They come because Seattle has a reputation as a forward-looking, progressive city with a keen technological edge and an investment in the future.  Sequencing Seattle will enhance our reputation and strengthen our draw for the kinds of innovative, risk-taking, technologically savvy people who keep a city and a culture from becoming as stale as a week-old baguette.

And let’s not overlook the financial implications of a sequenced population. The incredible resource of a large population of sequenced individuals will draw in public and private funding from the government, non-profit agencies, and pharmaceutical companies.  Applications by our researchers to the NIH will have an incredible advantage because the groundwork will already have been done for cutting-edge genomic research.  Given the push for evidence-based medical treatments following healthcare reform, pharmaceutical companies are under increasing pressure to tailor their new drugs towards defined responding subpopulations.  They could pay to have every person in their clinical trials sequenced.  Or they could choose to hold their clinical trials in a place like Seattle, where that hurdle will have already been cleared.

Here’s another benefit:  cost savings for diagnostic and preventive medicine.  No one needs to be told how health care costs are rising.  Seattle can be on the forefront of taking genomic information and doing prospective studies on how that information can help guide medication, treatment, diagnosis, and prognosis. We have one of the strongest concentrations of public health expertise in the nation and can leverage it to learn how genome information will make people healthier in the coming century.  In addition, in just one area, oncology, we are already seeing the benefits of rapid tumor genome sequencing to identify cancer-promoting mutations.  In this form of treatment, the baseline genome is compared to the tumor genome to look for changes.  Having the baseline genome already in the database speeds analysis and amortizes the cost.

Seattle can also be on the forefront of figuring out how genome data can best be used ethically and effectively.  This year the Presidential Commission for the Study of Bioethical Issues delivered its report on “Privacy and Progress in Whole Genome Sequencing.”  We can play a key role in leading and shaping the ethics of genome sequencing.

Last, I want to touch on what could possibly be the most amazing outcome:  citizen science, in which people are able to create their own research programs on the fly to answer questions.  Imagine that all participants are in a database with their own preferences on what kinds of queries (health, phenotype, behavior, etc.) they’d be willing to participate in.  Someone comes up with a query and broadcasts it to the group:  “Is there a genetic element to coffee preferences?”  Anyone interested gets a text, chooses yes or no to participate, provides their feeling about vanilla lattes, and immediately the cohort is assembled, curated and QC’d by hard-coded heuristics developed by UW Genome Sciences and hosted on Amazon’s Web Services.

Within minutes, specific findings are reported back in whatever way you’ve selected.  At the same time, ghostwriting software automatically generates an academic paper that is immediately submitted to an open-source journal and posted online.  Maybe everyone contributes a dollar when they join in, to defray processing costs.  And that’s how you empower people to use this infrastructure to do novel science.
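To make the idea concrete, here’s a minimal, entirely hypothetical sketch of the consent-and-cohort step.  Every name and field below is invented for illustration; it implies nothing about how UW Genome Sciences or Amazon would actually build such a system.

```python
# Hypothetical cohort assembly: filter to people who pre-consented to the
# query's topic and then opted in when the text message arrived.
from dataclasses import dataclass, field

@dataclass
class Participant:
    participant_id: str
    consented_topics: set = field(default_factory=set)  # e.g. {"behavior", "health"}
    opted_in: bool = False  # set True when they answer the broadcast

def assemble_cohort(participants, topic):
    """Return participants who pre-consented to this topic and opted in."""
    return [p for p in participants
            if topic in p.consented_topics and p.opted_in]

people = [
    Participant("p1", {"behavior"}, opted_in=True),
    Participant("p2", {"health"}, opted_in=True),   # consented, but not to this topic
    Participant("p3", {"behavior"}, opted_in=False),  # consented, but didn't answer
]
cohort = assemble_cohort(people, "behavior")
print([p.participant_id for p in cohort])  # prints: ['p1']
```

The real curation and QC heuristics would of course be far richer than a two-condition filter, but the key design point survives even in this toy: consent is recorded per query category, before any query is run.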


These might seem like great benefits, but I’ve already pointed out the real and immediate demands on Seattle’s budget.  How can a multimillion dollar project like this get off the ground? Let’s break it down.

As I mentioned, I fully believe the sequencing of a human genome will drop to between $1,000 and $100 over the next few years.  For purposes of making some back-of-the-envelope calculations, let’s take a midpoint and say the average genome over the next five years will cost $500 per individual.  Seattle has a population of approximately 616,000 people.  But we’re sequencing only those who want to have their genomes sequenced, so let’s say initial uptake is 25 percent. That means roughly 150,000 people, multiplied by $500, for a total cost of $75 million, just for the sequencing.  The entire city budget for 2013 is $951 million.  Seems like a non-starter.
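Spelling out that arithmetic, with the same assumed numbers as above (before rounding the cohort down to 150,000):

```python
population = 616_000       # approximate Seattle population
uptake = 0.25              # assumed fraction who opt in
cost_per_genome = 500      # assumed average price over five years

participants = population * uptake
sequencing_cost = participants * cost_per_genome
print(f"{participants:,.0f} genomes -> ${sequencing_cost / 1e6:.0f}M")
# prints: 154,000 genomes -> $77M
```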

But.  This is where creative thinking comes in.  Today globally we have a few major providers of genome sequencing services.  Ask them:  what can you do for us?  Remember, prices are going down faster than Moore’s law.  BGI or Illumina might be willing to cut a favorable deal, not just for the business, but for the privilege of being involved.  Or maybe a new player gets involved. The promise of having hundreds of thousands of genomes to process might spur a Covance or other outfit to make a huge investment in these technologies.

What if the genome sequencing is backloaded like a bad A-Rod contract, so the bulk happens in years 3-5, with the first couple of years devoted to pilots and infrastructure?  Maybe the budget for the first year is just a few million to assess the possibilities, find vendors, build a business case and line up stakeholders.  Maybe the second year is just $5-10M, which by then will buy 10,000 genomes to start.

Go to Amazon and Microsoft and Google and pitch them on being involved.  They know how to handle and analyze data, and might be willing to cut a deal in order to work with this kind of resource.  Prioritize diversity and outreach in the initial cohorts and use that as a lever to attract the Bill & Melinda Gates Foundation and PATH and Seattle Biomed and other public health institutions that are trying to help people in underdeveloped countries.  Information we learn about the impact of different ethnic backgrounds on health can be applied back to the populations from which our local groups originated.  Indeed, one of the main drawbacks of current genomics research is the overwhelming emphasis on European-derived (*cough* white *cough*) populations.

To Sum It Up

Think of this as laying fiberoptic cables in India.  As Thomas Friedman described in The World is Flat, the vast investment in infrastructure in developing countries during the dotcom boom is why you can now outsource reading your X-rays or putting together that patent application to professionals in India for a fraction of the cost of doing it locally.  Sequencing in and of itself is not the point; building a vast reservoir of data and encouraging a culture of innovation to do something with it is.  And doing it now.  This idea is already being taken up by larger organizations, such as the United Kingdom’s NHS (even if they may not be going about it the right way).  Seattle has the opportunity to do it the right way, and if we wait until everyone else is also doing it, there’s not much advantage to that.

Think of this also as a call to arms.  I see sequencing as the logical tool to keep our area vibrant and growing, but that’s because I’m a genome scientist.  Ask someone in 3D printing and she might say Seattle should buy everyone a 3D printer (hey!…)  The point is, the future of a city, more than ever, depends not on its natural resources and history, but rather on the quality and creativity of the thinking that happens there.  What gets quality and creativity going are visions of the future that inspire.

Drug Development: Let’s Play!

This post appeared originally in Xconomy on March 21, 2013.  The views expressed are my own and do not necessarily reflect those of Novo Nordisk.

My son is addicted to “Let’s Play” videos on YouTube.  You watch videos that take you through level after level of a video game, giving you a preview of what’s to come, or, in my case, a peek at a level I’ll never be skilled enough to reach.  Stupid Bowser.  But anyway.  This is just a small example of how games have permeated our lives.

Here’s another: late last year Boehringer Ingelheim made a splash by releasing a Facebook game, Syrum.  In the game, which combines aspects of trading card games and building games like Farmville, players try to develop drugs.  They can compete or collaborate with friends as they try to get their drugs to market.  A big question Boehringer Ingelheim faced, though, was Why?  It seemed incongruous for a pharma company to put out a game. What was in it for them?

John Pugh, Boehringer’s Director of Digital, says it’s more than just PR and sees it as a platform that can expand beyond the initial iteration; it could create a new venue for conversation between Boehringer and its stakeholders.  He’s also suggested that “it becomes a problem solving platform, an educational platform and an engagement platform.”  Therein, I think, lies the real hope for Boehringer:  that by engaging enough people in the game, the players will discover the strategy or strategies that will help pharma survive in an increasingly difficult and competitive business environment.

How would this work?  Take a step back and ask what games and game-like elements in the workplace are good for.  It’s already recognized that adding game-like elements to mundane tasks like training can increase participation, engagement and retention.  I just went through the most enjoyable health and safety training of my research career, in which our trainer framed the exercise as a round of Jeopardy.  But people involved in Serious Games know there are more potential payoffs for adding game-like elements to a wide variety of industries.

Beyond training, there are three areas where I see games aiding drug discovery.  The first, and one that’s gotten a fair amount of attention over the past few years, is the use of research games like Foldit, eteRNA, and Phylo for biological discovery.  These games tap into the interests of tens of thousands of players to tackle real-life problems like protein- and RNA-folding, and DNA alignments.  They utilize elements like leaderboards, forums, feedback and a sense of purpose.  You can get bragging rights over your friends and help cure HIV!  These games are solving difficult problems in biology without the need for formal scientific training among their participants.  It’s not hard to see how companies facing problems like solving the structure of a potential target or optimizing the fold of a therapeutic siRNA could benefit from a collaboration with these research game designers.

The second area for games relates to Syrum–or what I suspect it’s being used for, anyway.  The information about how people play games may turn out to be an extremely rich vein of creativity and innovation.  As Andrew Phelps at the Rochester Institute of Technology has described, watching people play games demonstrates just how innovative people can get when faced with a constrained environment but a strong desire to accomplish an objective.  They’ll do things like repeatedly killing themselves in an adventure game so they can lay their bodies out to spell short messages to their friends when normal writing materials aren’t available (bodies often take a while to disappear, and you often re-enter a game at the same place you died).

I haven’t played Syrum yet–it’s Europe only right now, and also I’ve not yet fallen into the Facebook vortex. But given Syrum’s reported complexity, it sounds like Boehringer has added a lot of elements that reflect real challenges in drug development, from discovery through launch.  I suspect Boehringer is storing every move made by every player–every alliance, every virtual hire, every step forward, sideways and backwards–and will mine that data continuously for strategies on how the process of drug development could be done better.  They’ll track the best players, and maybe even offer them jobs.  They’ll also continue tweaking the parameters.  Boehringer has said they want to launch different versions for different parts of the world.  I would bet some of the key variations will reflect the very different regulatory environments faced in different countries. Winners in one area may end up with very different strategies from winners in another.  So by mining the data, Boehringer also prepares itself for different scenarios.

There’s a reason the military invests heavily in various kinds of games and simulations. Military history is a stark reminder of the uncertainties of combat (after all, all it takes is one nail).  War games have been around for centuries.  Now, in an increasingly complex world, it’s even more important to simulate as many possibilities as is reasonable, to increase the odds that when the unplanned happens (and it will happen), the commander or soldier or chief executive or manager will have seen something like it before.  Drug developers (or any industry, really) are also subject to uncertainties and forces outside their control, and would benefit from a greater exploration of possibilities–the proverbial Black Swans–and how to react to them.  As an example, a recent article in the Financial Times describes some nice examples of how online adventure games are providing useful venues for observing and testing economic theories.

The last area where I see games as useful for drug development has to do more with behavioral psychology and the environment we live in.  Drug development benefits from the large number of scientists involved.  Not to generalize too much, but many of us are Geeks.  And, as described by Ken Denmead in his great Geek Dad books, a Geek is at that perfect intersection between Knowledgeability, Obsessiveness and (some) Social Skills.  Because of this, scientists tend to be smart, engaged in their work, and often willing to work far beyond normal working hours because it’s all just so darn interesting!  But still.  Having a laser-like focus on work takes a lot of mental energy.  Games can make that easier.

Many people are familiar with the concept of Flow, proposed by  Mihály Csíkszentmihályi.  The characteristics of Flow–engagement, satisfaction, positivity, optimal performance–coincidentally are many of the same characteristics one sees in people playing great games.  I would argue that by incorporating more games and game-like elements into our research, we will tap into a more efficient, engaged and productive workforce.

I can’t stress the engagement part enough.  We live in an age of endless distraction.  People are never out of internet contact.  Ever.  If they tell you they are, they’re lying.  Attention has become one of the most valuable commodities in the workplace.  Creating an environment that increases engagement through incorporating game-like elements raises a bulwark against distractions and makes a more efficient, focused and effective workforce.

And now if you’ll excuse me, I have to get my son to help–I mean, help my son get Bowser out of that castle…



Finding parallels between baseball and drug development

This piece was originally posted on Xconomy on March 1, 2013.  The views in it are my own and do not necessarily reflect the views of Novo Nordisk.

Consider a candidate.  Selecting that candidate takes thousands of hours of time and research–checking background, verifying data, assessing probabilities, projecting futures.  Once selected, more years of development follow, during which time the odds of success are less than 10%.  And if that candidate finally does make it, there’s just a small window of exclusivity before protection expires and that candidate goes out to the broader market.

I’m talking, of course, about baseball players.

So I’m a Mariners fan.  Have been since about 1999 (I moved to Seattle in 1996, so missed the big comeback year and it took me a little time to catch up).  And like all fans, I watch and hope, year after year, looking for signs of improvement, direction, some indication that there’s a plan. I keep looking towards some point in the not too distant future–let’s call it “next year” or even “year after next”–when I’ll once again be able to root for a winning team.

But while the Mariners may still be in what feels like eternal rebuilding, I’ve been able to find a silver lining in my fandom:  I’ve realized drug development seems to be learning from baseball.

Statistics, but the right ones.

There are a lot of parallels between the businesses of baseball and drug development.  Both involve long periods of development followed by limited periods of exclusivity for the product (drugs, players) being developed.  Resources (targets, talent) are rare.  Assets get traded or bought or sold.  There are the juggernauts and the mid-market and the small-market players.  And there’s the always-present need to keep doing more and finding better ways of winning, preferably with less.

One of the more fascinating developments in baseball has been the rise of a new statistical framework around the game.  Baseball has always been the most statistically conscious of sports, but it’s also been the most heavily invested in its own history and mythology.  Ken Burns is not making an 18½-hour documentary on Arena Football anytime soon.  That reverence for history means there has been a lot of resistance to new ideas.  For almost the entire modern era of baseball, certain statistics (ERA, W-L records, batting average, RBIs) have been the gold standard for performance.  Even though, when it comes down to it, they’re not really the best things to measure if you want to create a winning baseball team.

As Dave Cameron from Fangraphs has discussed on several occasions (like this one), statistics in baseball are how we figure out the answers to questions.  We might be asking who’s the Most Valuable Player (*cough*Trout*cough*) or what kind of pitcher or hitter a given team should be trying to get through free agency, or whether a player can be expected to sustain his level of performance.  Some statistics like RBIs, venerated for years, are actually not that useful since they partially reflect circumstances outside a hitter’s control but are treated as a direct proxy for ability.  Albert Pujols would have trouble cracking 70 RBIs a year if he were batting 9th.

But okay, drug development.  Better statistics are making their way into drug development, exemplified by the increasing emphasis on Big Data.  Collaborations are getting larger and drug companies are trying hard to capture as much data as possible, whether it’s clinical, metabolic, transcriptional, genomic, proteomic or any other flavor that becomes possible.  However, the key will be figuring out which statistics, which measurements, are really relevant to the main questions drug companies want to answer:  why do people get sick, and how can we figure out what a drug will actually do once it gets into the human body?  Are drug companies figuring this out?  (For the moment I’m leaving out biotech startups, since the bar to taking advantage of Big Data is still beyond the reach of most small companies.)

In my view the answer is maybe.  The move towards biomarkers throughout the drug development pipeline is reassuring as it shows a realization that we need to measure outcomes more clearly and quickly.  There is also a greater recognition that bioinformatics is a key element of a drug development pipeline.  More importantly, there needs to be a recognition that specific outcomes (the game-winning RBI, the successful PhIII trial) aren’t necessarily justifications for the decision-making that came before.


Giving Richie Sexson a multiyear contract to join the Mariners in 2005 was a bad decision.  Advanced metrics had pegged his skillset as a poor fit for the Mariners’ home stadium, and his likelihood of sustaining his performance at that point in his career was low.  As it happened, he did fade away after a couple of years, but the key point is that even if he had performed reasonably throughout his contract, it would still have been a bad decision based on what we knew given our best tools at the time.  Pharma needs to develop those tools not just to gather more data, but to figure out how to ask the right questions and trust what the data is saying.  However, it’s not clear that Pharma has reached its Moneyball moment.

Undervalued Assets

Which leads to another lesson from baseball: the under-appreciated asset.  Contrary to what some commentators have suggested, Moneyball wasn’t ultimately about Billy Beane deciding to draft only fat slow guys who could take a walk and get on base.  The real story was the concept of finding the market inefficiencies in Major League Baseball to get an edge.  Oakland plays in a lousy stadium with a putrid revenue stream and a snooty neighbor across the Bay who refuses to let Oakland move to San Jose.  In order to compete, their front office recognized it was necessary to find players and skill sets that were less valued by their competitors, even though those skill sets were just as important to winning baseball games as more conventional talents.

During the year chronicled in Moneyball,  on base percentage (OBP) was batting average’s country mouse cousin.  Oakland’s insight was that batting average is just a proxy for not making outs, and in that respect OBP is a lot more important.  A player with a batting average of .300 who never walks makes an out 70% of the time.  But a player with a batting average of .250 and who walks 15% of the time only makes an out in 60% of his plate appearances. In baseball the single most important commodity is outs, of which you have but 27.  Oakland exploited this market inefficiency to get inexpensive guys who may not have always hit the ball, but still got on base.  Here’s the thing:  as market inefficiencies have shifted, so has Oakland’s strategy in player acquisition.
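The arithmetic above can be sketched directly.  This is a minimal sketch using the post’s simplified model, in which batting average and walk rate are both treated as fractions of all plate appearances (real batting average is computed per at-bat, so these numbers are back-of-the-envelope, not official stats):

```python
def out_rate(batting_avg, walk_rate):
    # Simplified model: treat batting average and walk rate as fractions
    # of ALL plate appearances; whatever remains is an out.
    return 1.0 - batting_avg - walk_rate

# High-average player who never walks: out 70% of the time.
print(out_rate(0.300, 0.00))
# Lower average but patient hitter: out only 60% of the time.
print(out_rate(0.250, 0.15))
```

The patient .250 hitter burns through his team’s 27 outs more slowly than the free-swinging .300 hitter, which is exactly the inefficiency Oakland exploited.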

In drug development we can see some pharma searching for that kind of edge.  GSK and others are making a push into orphan diseases, turning an under-appreciated approach into what may become a glutted market for the latecomers.  How else might drug development search for an edge?  The logical answer is: every way it can.  As Jonah Keri described in his excellent book The Extra 2%, Tampa Bay, another small market team in a lousy stadium deal, has nevertheless managed to create a thriving, successful baseball club by taking advantage of every possible way it can compete, whether by taking on undervalued and risky assets or by employing probably the most experimental and forward-thinking manager in major league baseball.  If there is an advantage to be gained, Tampa Bay is exploring it.

Just recently, it’s been reported that Tampa Bay will face a reduction to their draft pool budget in 2013 because they spent too much money on International Free Agents.  This might seem like a problem for a team that needs to build via their farm system because of limited revenue.  But it may actually be a calculated risk that they can get more for their money by overstepping MLB rules rather than committing too heavily to what’s thought to be an overall poor US draft cohort this year.  Drug development companies would be well advised to take this kind of approach and encourage a broad exploration of every way in which efficiencies might be gained, whether it’s in discovery, manufacturing, patient recruitment, therapeutic areas or technology.  And most importantly, companies need to set up mechanisms to broadly communicate the results, good and bad, and support and laud all of them, not just the ones that succeed.

Randomness and the unlucky bounce

Which brings me to a last insight from baseball.  Baseball is a probabilistic game.  The best players in the world get a hit three times out of ten.  Random factors, or at least factors so uncontrollable as to essentially behave like random factors, can influence the outcome of a game.  Just ask the Chicago Cubs.  In Moneyball, Billy Beane is described as saying, “My s*** doesn’t work in the playoffs,” by which he meant that the team he built was designed to perform above average, on average, over the course of a 162 game baseball season.  But the playoffs are fickle and any team can beat any other team in a best of 5 or best of 7 series.
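Just how fickle a short series is can be quantified.  Here’s a rough sketch (my own illustration, not from Moneyball), assuming independent games and a fixed per-game win probability for the underdog:

```python
from math import comb

def series_win_prob(p, wins_needed=4):
    """Probability that a team with per-game win probability p takes a
    best-of-(2*wins_needed - 1) series (wins_needed=4 -> best of 7)."""
    # Sum over series lengths k: the team clinches in game k by winning
    # it, having already won wins_needed-1 of the previous k-1 games.
    return sum(
        comb(k - 1, wins_needed - 1)
        * p ** wins_needed
        * (1 - p) ** (k - wins_needed)
        for k in range(wins_needed, 2 * wins_needed)
    )

# A clearly weaker team that wins only 45% of its games still takes a
# best-of-7 series nearly 40% of the time.
print(round(series_win_prob(0.45), 3))
```

Over a 162-game season that 45% team sinks to the bottom of the standings, but over seven games it’s nearly a coin flip–which is what Beane meant.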

We don’t appreciate how much randomness affects everything.  In The Drunkard’s Walk, Leonard Mlodinow provides ample evidence of how little we really control the events around us, even though we might think we do.  He also shows what a poor grasp people have of probabilities.  In baseball the probabilities are made manifest in the statistics we track, and maybe that’s helped drive the adoption, finally, of better statistical tools.

Baseball is full of random happenings.  Adam Dunn hit 40 home runs a year (more or less) like a metronome for seven years, and then in 2011 was completely lost at the plate.  And then in 2012 he hit 41.  Drug development is full of randomness too.

Drug development sometimes seems to show a much more deterministic mindset.  I blame the successes of the 80s, when a whole raft of wonderful drugs entered the market and lulled people into a sense that this kind of productivity could go on forever.  Pipelines came to be viewed by companies and analysts alike as though they were treadmills steadily pushing new drugs forward, as though making drugs was like manufacturing widgets.  And yet even companies that create widgets (albeit very large and complex widgets) have problems meeting their deadlines and come up against unexpected issues.  How much more uncertain is drug development, which deals with trying to figure out how biology works?

What lesson can drug development companies take?  Here there is one important difference:  even a failing baseball team often makes money, whereas a failing pharma company faces being bought or imploding.  On the other hand, poorly performing franchises in the Major Leagues have been threatened with being shut down, or at least moved, so perhaps there are still some parallels there.  One key lesson is the value of stability.  Over-reaction to poor results can be deadly to the long-term health of a ballclub, or a company, as it can lead to the loss of talent due to mis-assignment of blame.  Another key point is diversification of revenue streams.  Some of the best positioned ballclubs are there because they have worked hard to increase revenue beyond box-office sales and the occasional t-shirt purchase.  Similarly, some of the best positioned Pharma companies are diversified players like Roche and Johnson & Johnson.

Maybe the most important lesson is to realize that in a random world there is no way to guarantee success in drug development, and therefore the goal is to set up the best processes, with clear measurements and benchmarks; to evaluate constantly but intervene rarely; and to work on increasing the probability of success.  The 2001 Mariners won 116 games and still didn’t even make the World Series.  And yet few question that they were by far the best team that year.  The goal for the Mariners after that season was to evaluate how they got there, try to separate luck from skill, and attempt to replicate those elements that were under the control of the players and the front office.  That could be the approach taken in drug development as well.

Baseball, or the Movie Industry, or Oil exploration, or…

I’d love to delve into other concepts, like Value Over Replacement Player (VORP) and how we might apply that to drugs and scientists, but that could be a thought for another day.  As the drug development industry continues its struggle with how to carve out its future (because, you know, eventually there won’t be any more companies left to buy), it seems potentially fruitful to try to learn from other industries that have faced similar challenges.