Transcript for Dr Phil Reeves

Regulatory implications of new methods for animal health drug approvals and risk assessments

This presentation was delivered at the APVMA’s science feature session on 15 October 2015. The full video is available on our YouTube channel.

My brief today is to speak about the regulatory implications of new methodologies, and basically what I want to do is run through the topics that I have up here. By way of background, I have fairly comprehensive experience in toxicology as a postdoctoral research fellow, but I have never actually worked in government regulation in the toxicology area, so I just want to make that clear up front. When I was preparing this presentation, what I particularly wanted to do was figure out what the drivers were that are advancing toxicology, and we are going to move through those and look at toxicology in the 21st Century. We will also look at some of the international projects, and we have already heard about many of these from what Jim spoke about just a moment ago, and at where the advances are being made. The other point is that emerging technologies are forever coming over the top, and we have already heard about nanotechnology, which is what Jim just talked about. Then finally I will take a look at the challenges and the regulatory implications.

The major vision for the 21st Century with respect to toxicology is really a move from an observational science to a predictive science. We heard Jim refer many times to predictions. This is very much the focus of regulatory toxicology into the future, but of course with that you have to look at the pros and cons: what can be achieved in a reasonable timeframe and what cannot. The image that I have here simply shows that a lot of the findings we get are actually taken from laboratory animal species and then extrapolated across to humans. Of course, there are various effects on tissues, whether those are discovered through visual observation or through microscopy, histology or biochemistry. What we are particularly interested in, and I hope you can see this, is where all of this is leading in the 21st Century.

What I plan to do is simply look at two major projects, one in the US and the other in the EU. You have already seen the diagram here; Jim had it up a moment ago on one of his slides. Basically, we can see front and centre this toxicity testing: toxicity pathways and targeted testing. Of course, it is all tied back to dose-response curves. What I plan to do is look at the approaches the US and the EU have taken and, from that, try to identify where some of these emerging issues are and some of the advances in regulatory toxicology.

I will start by looking at the Tox21 project, which is a US project. The goal of Tox21 is really to collect a whole lot of biochemical and cellular pathway information from humans. In the process there is a lot of design to be done and a huge number of endpoints and other things to be identified, so it is really a massive project. You can see from the partners involved, NTP, EPA, FDA, NCATS et cetera, that they are all playing different roles and all contributing to an enormous project. I have actually learnt, since we have been hosting Jim and Nancy's visit, that Jim played quite a significant role in some of the very early work on this.

Toxicology in the 21st Century

It is actually broken into the phases that I have shown here. The previous slide showed that NCATS was very much responsible for the high-throughput robotic screening system, which is less expensive and faster than traditional toxicology models. We have a couple of issues here. The first big issue, of course, is the use of in vivo studies and the use of laboratory animals for doing those studies. In 2013 the EU placed a ban on the use of in vivo studies for cosmetics toxicology research, and there is a concerted effort to reduce the amount of in vivo testing. Of course, there is always going to be a role for in vivo testing, but that comes after a screening process, which we are going to look at.

The second thing is sheer expense, and I will show you a little later how the new paradigm of toxicology testing, if you like, is much less expensive than the current approach. It also has much greater capacity, so we can actually run very large numbers of toxicity tests on chemicals. I should point out that I am not necessarily talking about small numbers of chemicals, such as may occur in some of the areas we deal with, particularly veterinary medicines. With environmental pesticides, for example, we are talking about thousands upon thousands of them. The question you have to ask is: if you were doing in vivo animal testing, just how long would that take? Is it plausible?

The other thing about Tox21 is that the results are actually submitted to the National Library of Medicine's PubChem website, so they are publicly available. The transparency and the ability to use those data in other projects is very good indeed.

One of the US agencies that I had on an earlier slide was the EPA. Their role, I suppose, in this Tox21 program relates very much to computational research. They are looking at advances in biology, biotechnology, chemistry and computer science and trying to integrate these things, draw them all together. They are then looking at the biological processes that can be disrupted by test chemicals, and at some of the biological disruptions that result at the doses expected with human exposure.

As far as the ToxCast program that EPA is running is concerned, it is really the two‑step process that I have up here: exposing cells and proteins to chemicals using automated screening technologies, and then screening the cells or proteins for changes in activity. As I have said, the numbers of chemicals are absolutely enormous.
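Since everything ties back to dose-response curves, it may help to see how a screening readout becomes a potency number. Below is a minimal sketch, assuming a standard Hill model and a crude half-maximal interpolation; the function names, concentrations and parameter values are my own illustration, not the actual ToxCast pipeline.

```python
import math

def hill(conc, top, ac50, n):
    """Hill model: response at a given concentration."""
    return top * conc**n / (ac50**n + conc**n)

def estimate_ac50(concs, responses):
    """Crude AC50 estimate: interpolate (on log-concentration) where the
    response first crosses half of the maximum observed response."""
    half = max(responses) / 2.0
    pts = list(zip(concs, responses))
    for (c0, r0), (c1, r1) in zip(pts, pts[1:]):
        if r0 < half <= r1:
            frac = (half - r0) / (r1 - r0)
            return 10 ** (math.log10(c0) + frac * (math.log10(c1) - math.log10(c0)))
    return None  # activity never reached half-maximal in the tested range

# Hypothetical 5-point concentration series (micromolar) for one assay.
concs = [0.01, 0.1, 1.0, 10.0, 100.0]
responses = [hill(c, top=100.0, ac50=2.5, n=1.0) for c in concs]
print(round(estimate_ac50(concs, responses), 2))  # about 2.5
```

In a real screening campaign this fit is repeated for every chemical-assay pair, which is exactly why the automation and the data handling matter so much.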

The second project is the very large project in the EU, a program that kicked off in about 2011 with a budget at that stage of €50 million. It comprises six components, and the first three of those are shown here. One is investigating the use of stem cells. This, of course, is also well entrenched in the US program, and I will come back to it in a moment. What it is doing is getting around this issue of trying to extrapolate information from laboratory animals to humans. Some of those extrapolations can be based on the very high doses administered to the animals, and the use of stem cells allows all of that to be brought back into the proper proportions.

Another project relates to hepatic microfluidic bioreactors. This, again, is very much about moving away from in vivo studies towards in vitro studies. What they are particularly doing is using stem cells to represent the liver, if you like, for liver metabolism. With this technology it is possible to actually work out the metabolism of various chemicals in an artificial, in vitro situation that is very representative of the in vivo situation.

A difficulty with these is certainly detecting endpoints, because, as I will show you on a slide in a moment, how many endpoints are there, and how do you distinguish them? How do you characterise them and validate them, and so forth? That is a really big point.

The EU program is also looking at these three projects. The in silico models project is very much about cosmetics; that is where the focus came from. It is really concerned with delivering an integrated suite of computational tools to predict some of the effects of long-term exposure, because a lot of these sorts of effects cannot currently be assessed without the use of animals, so what is happening is they are developing an in silico system to allow that to take place.
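To give a feel for what an in silico prediction tool does, here is a toy sketch in the QSAR style: molecular descriptors go in, a probability of activity comes out. The descriptors, weights and example values are all invented for illustration; a real tool would be trained and validated on curated toxicity data, which is exactly the hard part.

```python
import math

def predict_activity_probability(logp, mol_weight, aromatic_rings):
    """Toy QSAR-style logistic model: molecular descriptors in, probability
    of toxicological activity out. The weights below are invented for this
    example, not fitted values from any real model."""
    score = 0.8 * logp + 0.002 * mol_weight + 0.5 * aromatic_rings - 4.0
    return 1.0 / (1.0 + math.exp(-score))

# A hypothetical chemical: moderately lipophilic, two aromatic rings.
print(round(predict_activity_probability(logp=3.2, mol_weight=320.0, aromatic_rings=2), 2))
```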

The second dot point relates very much to developing better in vitro systems, and here we are talking about organotypic cultures. These come in many different forms, but basically they represent 3D structures. If you were to take something like skin, under the conventional system the skin cells in culture would quickly revert and lose a lot of their functionality. With an organotypic culture, with the 3D system, there is a proper matrix: skin can go on growing hair, whereas under the conventional system at the present time that would not happen.

Finally, the third dot point is ToxBank. It is very much about establishing a dedicated web‑based data warehouse, a database for the test compounds and a repository for the selected test compounds, and then setting it all up so that this in vitro toxicity testing becomes very transparent for everybody to follow.

Based on what I have said about the big projects in the US and the EU, we end up with a pretty good idea of what toxicology in the 21st Century is moving towards. The big advances would appear to be in cellular and molecular biology and in computational science. In cellular and molecular biology, the first point is really about the use of induced pluripotent stem cells. That comes back to the whole point of using human cells rather than laboratory animal cells, so that the metabolism and the various processes which lead to the toxicology are better characterised.

High-throughput screening is a very obvious advance, because we can have hundreds of thousands of these assays done in a day, whereas previously you could not achieve those sorts of numbers in a very long time. Something like 500 or 700 assays can be run on one particular chemical, and at something like 15 different concentrations. Because this is all robotic, basically using plates such as the 96‑well plates that we have always used, huge numbers can actually be pushed through the system.
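The quoted figures are easy to sanity-check. Below is a back-of-the-envelope sketch of the well counts they imply; the 1536-well plate density is my assumption (robotic platforms commonly use denser plates than the familiar 96-well format).

```python
import math

# Back-of-the-envelope arithmetic for the figures quoted above.
assays_per_chemical = 700   # upper figure quoted
concentrations = 15
wells = assays_per_chemical * concentrations
print(wells)                    # 10500 wells for a single chemical

# On the familiar 96-well plate that is over a hundred plates per chemical;
# denser 1536-well plates (an assumed format) bring it down to a handful.
print(math.ceil(wells / 96))    # 110 plates
print(math.ceil(wells / 1536))  # 7 plates
```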

Toxicity pathways, again: I have already mentioned that endpoints and biomarkers have to be very well characterised, and once everything has been characterised and is working well it is a very powerful approach. Omics technologies have been around, of course, for decades, and I know people have been using them for a long time; I will very quickly run through those in a moment. As for computational science, following Jim's talk I do not think I really have to say much. The bioinformatics is really about storing huge amounts of data. It is complex data, it has to be integrated with the biology, and modelling is involved, so it is a very big process that could never be achieved without modern computers.

Out of these points, the first one I wanted to touch on very briefly was induced pluripotent stem cells. We always thought that cell differentiation was a unidirectional process, until Gurdon came along in about 1960 and did experiments on frogs and tadpoles and showed that these cells could actually be reverted and then moved forward again. In other words, differentiation is not a unidirectional process. That was followed up by Yamanaka, who in about 2006 published for the first time that he was able to make this process here run. Basically, you can take skin fibroblasts and reprogram them; Yamanaka actually used four factors to do the reprogramming, two of which were oncogenes, which can of course cause cancer. There have since been refinements to change those factors, and quite a lot of progress has been made in that regard.

Once you have the induced pluripotent stem cell, it can then be differentiated into things like neurones or blood cells, or any other tissue in the body for that matter. Where this is really important is in regenerative medicine. If a patient is suffering from dementia, for example, it is not possible to administer some drugs and trial drugs in that person, whereas using this process all of those drugs can actually be trialled in vitro. Then, of course, the results can be applied in vivo in that particular patient.

I mentioned high‑throughput screening. This slide here is one that you will find on the web; it is from NCATS. This is the sort of robotic high‑throughput screening that can run thousands or even hundreds of thousands of toxicity tests in a day. The decreased costs and decreased animal usage are quite remarkable: we are moving away from the use of animals, and the focus is on perturbations of toxicity pathways in humans.

These are the toxicity pathways that I have been referring to. We can see that up here we have receptors, various pathway regulations and the different technologies used to understand what is actually going on, then cellular processes, and then coming right down to the tissue and organ, so that eventually you have your toxicity endpoint. Really, what toxicity pathways are all about is trying to identify what is going on up in these areas here. It is shown here: when you add the chemical, you can see that there is some perturbation, and that flows all the way through. The question I posed before was: how many of these toxicity pathways are there? In this diagram I have just shown a couple to get the point across, but the list could be extremely extensive. They are starting off by looking at things such as nuclear receptors and many of the more common pathways, but just how many are there? It is going to be a very long, ongoing process.

Genomics technologies

Of course, omics can refer to the DNA, the genetics, or to the messenger RNA level or the ribosomal level, and it can even go further down into the lipids, the carbohydrates and so forth, and certainly the metabolites, because that is a very important part of it. In terms of toxicogenomics, this is probably the simplest case you could find. What you have is the control cell and the test cell, with the chemical being added here. At the other end of the process you have prepared a DNA microarray, which has the probe DNA on it.

What is happening back at this level is that you extract the mRNA, the messenger RNA, use reverse transcriptase to bring it back to cDNA, and amplify it with the polymerase chain reaction. Then you fluorescently label the strands so that those that have come out of the control cell are labelled one colour and those out of the test cell a different colour. You combine equal amounts and hybridise them, just sort of link them up, with the probe DNA. What we see here is simply genetic expression. If there was no change and both were expressed at the same level, the reds and the greens would balance out and we would get a yellow spot on the DNA microarray. If, on the other hand, the chemical has caused down‑regulation, then obviously the spot will appear green, the control colour. Ditto if the chemical up‑regulates: the balance will be in favour of the test cDNA and the spot will appear red.
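The red/green logic reduces to a simple ratio test. Here is a minimal sketch with hypothetical fluorescence intensities (red for the test channel, green for the control, matching the labelling just described); the two-fold threshold is an assumption chosen for illustration.

```python
import math

def classify_spot(red, green, threshold=1.0):
    """Classify one microarray spot from its two channel intensities:
    red = test-cell cDNA, green = control-cell cDNA (the labelling
    described above). threshold is in log2 units, so 1.0 means a
    two-fold change is needed to call a gene up- or down-regulated."""
    log_ratio = math.log2(red / green)
    if log_ratio > threshold:
        return "up-regulated (red spot)"
    if log_ratio < -threshold:
        return "down-regulated (green spot)"
    return "unchanged (yellow spot)"

# Hypothetical intensities for three spots.
print(classify_spot(red=5200, green=5050))   # unchanged (yellow spot)
print(classify_spot(red=12000, green=2400))  # up-regulated (red spot)
print(classify_spot(red=900, green=4100))    # down-regulated (green spot)
```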

This slide on computational toxicology is the one that Jim showed a moment ago, or at least it is cut from a couple of his slides, with the coronas forming around a nanoparticle. He makes the very good point that the rate of formation of the corona has to be taken into account alongside the transport or distribution time, and various modelling has taken place there.

Bioinformatics

I have already mentioned that the whole purpose of bioinformatics is to deal with very large amounts of complex data. It can involve modelling, integration of the data analysis, and pathway‑based toxicity testing using some of these in vitro approaches. Clearly, if you have hundreds of thousands of these toxicity tests being done in a very short period of time, you need this sort of system.
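As a concrete illustration of the bookkeeping involved, here is a minimal sketch that folds per-well screening records into a chemical-by-assay summary. The record fields and assay names are assumptions for the example, not a real Tox21 or ToxCast schema.

```python
from collections import defaultdict

# Hypothetical per-well screening records; fields and assay names
# are illustrative only.
records = [
    {"chemical": "CHEM-001", "assay": "nuclear-receptor-A", "active": True},
    {"chemical": "CHEM-001", "assay": "stress-pathway-B",   "active": False},
    {"chemical": "CHEM-002", "assay": "nuclear-receptor-A", "active": False},
]

# Fold the records into a chemical-by-assay matrix.
matrix = defaultdict(dict)
for r in records:
    matrix[r["chemical"]][r["assay"]] = r["active"]

# Each row now summarises one chemical across every assay run on it.
for chemical, hits in sorted(matrix.items()):
    print(chemical, hits)
```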

Nanotechnology

Over the top of all this come some of the emerging technologies, and the one I have selected here is nanotechnology. There is a great deal that is not known about nanotoxicology; Jim has alluded to that. Nancy has edited this book on nanotoxicology, but there are so many questions we have to ask about nanotoxicology that may not have applied to conventional toxicology for conventional chemicals.

Andrew Maynard and his co‑workers have written this particular article, and the question he is asking, of course, is how we are going to regulate these sophisticated materials in 50 years' time. What we are seeing at the moment is the very beginning of nanotechnology, and it already throws up all these questions and many more, so what is going to happen when the materials become much more sophisticated?

I'll close off now. From the point of view of the challenges, the ones that really jump out are designing and developing a suite of toxicity pathway assays. We have just looked at the toxicity pathways; the questions are how many there are, how you identify them, how you validate them, whether they are meaningful, et cetera. Then there is using the new technologies in a targeted testing strategy: high-throughput screening, and the omics technologies we have talked about. Then you have to coordinate all of this. You can measure all sorts of things, but are they actually meaningful? You have to be able to relate what you have actually measured to its biological significance.

As for the regulatory implications, all of these things have regulatory implications. We are getting into a whole bunch of new technologies and systems. Toxicologists, whether they be in industry, research or regulatory agencies, all have to learn about these new toxicity‑testing paradigms. There are new tools and there are new methods.

Key messages

Just to finish off, the key messages as I see them are these. New approaches capable of revolutionising toxicology are being built, based particularly on advances in cellular and molecular biology but also in the computational sciences. The evaluation and validation of these new methods is going to take a long time and is going to be difficult for some of them. It is going to require interdisciplinary scientific interaction: we are talking here about things such as biology, computer science and statistics, and I have not touched on epidemiology, but there is a whole range of scientific disciplines that will need to be brought together and coordinated.

Errors and omissions excepted; check against delivery. 
