As a senior majoring in behavioral neuroscience, I am required to complete a scientific research project. This summer, I completed my research with an experiment looking at the capacity of a drug to treat traumatic brain injury. As necessary for such a project, I used rats as my research model. You guessed it: that involves giving rats brain injuries. When I describe this aspect of my research to others, the reactions range from the usual dismay on behalf of those poor rats to the occasional bemused inquiry as to where one gets brain-injured rats.
Animal research is (and has been for a long time) an essential part of the scientific process. If you want to study how a disease progresses, the effects of a brain injury, the effects of a drug, or even some aspect of behavior, you have two options: study it in animals or study it in humans.
Human trials are suitable for some things: drugs that have already been shown to be relatively safe, for instance, or noninvasive behavioral studies. But there is a lot you can’t do with humans, and they’ll put you in jail if you try. So for everything else, there’s animal research. And while it’s sometimes distasteful, it really is the best option we have.
Lest you think I am some cruel, animal-abusing taskmaster, think about it this way: by and large, animals in the laboratory have a much better life than animals in the wild. They live in a secure, temperature-controlled environment with consistent supplies of food and water. Animals in the wild live in natural states of fear and near-starvation, always searching for the next meal while attempting to evade predators. Laboratory animals arguably have it better than the vast majority of animals on the planet.
We scientists hardly have free rein to do whatever we want with them, either. All research institutions in the US that use animals for research are required to submit to the oversight of institutional review panels, which evaluate projects based on the expected suffering of the animals, the necessity of the procedures, and the scientific or medical benefit of the research. Here at Washington College, that’s the IACUC, the Institutional Animal Care and Use Committee.
Applicants for animal projects are required to search the scientific literature for alternatives to animal research, such as in vitro models using cell cultures. If such alternatives exist and are of sufficient quality to provide valid results, there is a very good chance that one’s application to use animals will be rejected. If the study is authorized, researchers must take steps to ameliorate any suffering the animals experience, as long as doing so would not interfere with the study.
Had alternatives to live animals existed, I would gladly have used them in my study. But the fact is, there is no system that can approximate the chemistry and interconnection of a living brain. In order for my data to have the slightest degree of applicability to the outside world, animal models were a requirement.
Besides, what happens when you don’t do animal testing? Look at the thalidomide crisis of the ’50s and ’60s. The drug was never adequately tested in pregnant animals; had it been, thousands of cases of birth defects might have been averted. Animals across the world live and die every day, and animals in the laboratory are no different. But they are different in one respect: they contribute to medical and scientific progress. It is no exaggeration to say that nearly every major medical breakthrough of the past century has depended in large part on animal research.
So what do I think? I think animal research is a necessary evil. It’s not an ideal tool, but it’s the best one we have, and it does more good than not. Plus, living without it pretty much means eschewing most of modern medicine, so good luck with that.