Breakthroughs Might Mean the End of Animal Testing

Public support for animal testing has been in steady decline for decades, dropping from above 90 percent in 1949 to only 57 percent in 2013. And that number is likely to fall even further, with younger demographics opposing animal experimentation in greater numbers than previous generations.
 
For many laboratory scientists, this waning approval isn't cause for concern, because a near-identical trend is emerging within the research community itself: a growing number of U.S. labs are adopting innovative cross-disciplinary technologies to spare at least some of the 25 million animals used for research annually.
 
But while compassion and ethics are indeed factors, the new paradigm is actually driven by the striving for improvement that is a hallmark of the best science. In 2014, the limitations of animal testing appear to have caught up with research and development, leading many to question whether the practice is still relevant.
 
The most obvious problem is the fundamental biological difference between humans and the animals used in research. The inner workings of rats and humans may be similar, but they are by no means identical. When it comes to drug discovery and development, these limitations can jeopardize every stage of the pharmaceutical pipeline, from synthesis to prescription.
 
As a result, side effects are missed and millions of dollars are wasted. Even a new chemical entity deemed safe at the animal stage still has only an 8 percent chance of being approved for human use.