Science commentary, Jun 10, 2012; updated Dec 21, 2012
You might not be inclined to feel sympathetic toward the pharmaceutical industry these days, with their idiotic TV commercials about "cures" for improbable diseases, and all the bad press about new drugs and the unexpected side effects that they somehow missed during testing. But the pharma industry is a huge producer and consumer of biomedical knowledge. They create products that treat problems ranging from diaper rash to bad breath, and they're a major contributor to the country's economic well-being (or what's left of it).
For years the industry has been under attack by people who think they should give their products, some of which happen to be lifesaving drugs, to everyone for a nominal cost. This idea has spread to third-world countries, some of which have publicly stated their intention to steal the rights to drugs patented in the USA and Europe, undermining the industry still further. Once the big companies are gone, these countries can forget about new cures for the diseases their people still suffer from. But they don't care.
The industry's biggest problem is probably the patent system. The government issues patents that are valid for twenty years. At the same time, the same government requires exhaustive testing, FDA approval, and mountains of paperwork proving the drugs are safe. This can take ten years or longer, and all the while the patent clock keeps ticking. That adds hundreds of millions of dollars to the cost of developing a drug. When the patent runs out, the drug goes generic, and revenue slows to a trickle. So the industry raises prices to compensate, and incurs the wrath of consumer groups. The ticking clock also forces the industry to drop promising drugs that are taking too long or that don't benefit enough people.
As a result, several companies are facing what is called a "patent cliff" because patents for their existing drugs are due to expire, and they have nothing to replace them with. Love them or hate them, they produce a big part of America's wealth. But they're in trouble. Within a couple decades, we could lose one of our most important remaining industries.
Another problem is the industry itself. Companies are pouring billions into drugs that have little chance of curing anything. (I could name the drugs, but most people in the industry know what they are. Anyway, a long list of those bizarre names, which are actually selected by a special group called the USAN Council, would be pretty boring.)
This happens because corporate R&D is much more risk-averse than it needs to be. In industry, the costs of being wrong are higher than the benefits of being right. So companies spend years validating every little analytical test, proving that it works in every conceivable situation, while academics just plow forward. As a result, industry's hypotheses about how a particular disease works are often outdated. This leads to billions spent testing drugs that address a problem that science has shown is not the real cause of the disease.
I know this because part of my job as a biomedical researcher consists of discovering new drugs and new biochemical pathways (which the industry calls "targets"). In industry, more often than not, the targets are chosen by somebody up in management who thinks they understand how to cure a particular disease. For some companies, this works. For others, the strategy for success is just to do the exact same thing everybody else is doing, only push your employees harder. The real goal is not so much to cure a disease, or make a profit, but to avoid the risk of being wrong for following their own ideas. It is a classic case of what social psychologists call "loss aversion."
Often this happens because the manager is a political appointee who knows little or nothing about technology. The idea is that it doesn't matter whether the company is making DNA chips, computer chips, or potato chips: the employees are the ones who have all the knowledge. Unfortunately, executives who don't understand the technology have no way of knowing whether their ideas make any sense. An upper management that lacks research experience also lacks vision, and it will make decisions that can have catastrophic consequences.
Risk aversion also induces lemming-like fad-following behavior. Many of the latest fads are managerial, and most of them boil down to one basic idea: the best way to become profitable is to get rid of the staff. To a bean-counter, it is simple arithmetic. If a company has $2 billion in revenue and $1 billion in personnel expenses, it can double its profit by firing all the staff. Or it can replace the staff with yes-men, who can't think for themselves and are therefore easier to control. Alas, the record of diseases cured by yes-men, as enjoyable as their presence may be to the boss, is woefully short.
One fad was to rely on the universities to provide the innovation. Industry would just scoop up the discoveries in the research literature, convert them into products, and get rich. But this was a fundamental misunderstanding of the purpose of the research literature, which is full of false starts and missteps. Industry scientists blamed academics when many published findings turned out to be unreliable. But academics weren't to blame. They have their own bean-counters, lawyers, diversity managers, and an assortment of other parasitical beings to deal with. There's not one academic who likes being evaluated on the basis of the number of publications per year. But counting things is all the bean-counters in charge know how to do.
While they may be more flexible, the universities and small biotechs just don't have the resources to make big discoveries like they used to. Academics are suffering the most, now that grant funding rates are approaching ten percent. Government funding for academic research will only decrease as non-discretionary spending takes a bigger and bigger portion of the Federal budget.
There are also fewer small biotechs left these days, mostly because they've either been bought up already or run into the ground by their own managers. The venture capitalists that funded these biotechs finally realized that many of the people running them have absolutely no idea what they're doing. VC funding has become fiendishly hard to get.
Another industry fad was to cut R&D. To keep the pipeline flowing, they would just buy smaller drug companies and biotechs. Unfortunately, that didn't work either, because as soon as the new employees showed up on the payroll, the bean-counters saw them as a cost that could be cut to increase profits even more. So we saw a repeating pattern: a big pharma buys a smaller one, then fires all the employees of the smaller company. Then management wonders why they're falling off a patent cliff.
The latest fad is outsourcing R&D to China, as one big company has done. Others are considering it. It's bad enough to outsource your labor. When you outsource your R&D, you're also outsourcing your country's infrastructure, expertise, and entrepreneurial spirit. You're outsourcing your country's future.
The only demonstrable effect of these fads was to flood the job market with unemployed industry scientists, who could then be hired back at greatly reduced salaries. It almost makes economic sense, except for the fact that it may have cost $20–40 billion to acquire the smaller company in the first place. The only ones who became rich are the executives and the shareholders who had enough sense to short the company.
The pharmaceutical industry needs to find a way to abandon these fads and do more creative discovery-related basic research. To achieve this, companies have to treat their staff more like faculty than like employees. That means changing the corporate culture to encourage innovation, exploration, and risk-taking. They need to get rid of the yes-men and hire more troublemakers.*
It won't be easy. Brian D. Smith, the author of The Future of Pharma, believes that ever-increasing risk aversion reflects a long-term change in society in general, one that could ultimately lead to commoditization of the pharmaceutical industry. The blizzard of lawsuits against chemical and pharma companies that market products later found to cause health problems is a major deterrent to innovation. Smith quotes Sir Richard Sykes, former chairman of GlaxoSmithKline, as saying:
"Today, regulatory authorities are totally risk-averse and it is almost impossible to bring some molecules to market."
Now that universities and non-profits have started to patent everything that comes out of their labs, people have the right to ask how they're different from any other big corporation. If the primary goal of a university is to make a profit, why should it receive taxpayer money? There are good arguments on both sides of this issue. No matter what happens, as academic research continues to deteriorate, industry will need to pick up the slack. If it can't, it will eventually run out of new products altogether, and we will all be a little poorer—and a little less healthy.
*Update (Oct 22, 2012) Okay, why hire troublemakers? They're the ones who think outside the box. They can see when you're making a mistake, and aren't afraid to tell you, tactfulness be damned. If anyone is going to cure cancer or create a new blockbuster drug, it won't be an organization man. It will be someone who does things they're not supposed to do—someone who has strong opinions and is not afraid to take risks by expressing them.