This content is current only at the time of printing. This document was printed on 28 May 2020. A current copy is located at https://apvma.gov.au/node/360
Preparing an efficacy and safety study report
Applicants planning and conducting efficacy and safety trials should consider how the trial data will be collected, documented and prepared for presentation to the APVMA to support an application to register or vary a product. This guideline provides basic advice about the presentation of study reports that may be included in an efficacy dossier. You may use this advice and accompanying template, but are also free to use any alternative format, as long as the report can be easily reviewed and understood by a reviewer.
This guideline should be read in conjunction with other APVMA guidelines, including any relevant product-specific guidelines and the guidelines on preparing an efficacy dossier and on experimental design and analysis. For example, particular features of a trial, such as sampling parameters and test protocols, may need to be included and reported for certain types of products. See ‘Demonstrating efficacy of pool and spa sanitisers’ for more information.
Further details about each section of the report template are also provided to assist applicants preparing reports.
Figure 1: Study report template

- Quality assurance statement
- Study summary/executive summary/abstract
- Methods and materials: study particulars; experimental design; treatments; assessment methods
- Conclusions and discussions
Guidance on report features
Include an identifying study or report number that is unique to the report.
Provide the study or report title and the names of the primary author(s), editors and any other people involved, and details of their roles in the study. Include the date that studies or trials were conducted and the date the report was finalised.
Study summary, executive summary or abstract
Provide a brief summary of the investigation, including a statement of the objective, general methods, results and conclusions. If the study was conducted for the purpose of registering a product in Australia, it should include reference to how the results support the product’s efficacy and proposed label instructions for use. Any constraints, such as economic, meteorological or animal welfare issues, should also be detailed.
Scope, introduction and background
The purpose of the study should be briefly stated to set the scene for the reviewer. If the study introduces new technology, methods, other unusual matters or different designs, include a brief explanation in the introduction. Alternatively, if the particular design has been used in previous APVMA applications, provide some background for the reviewer, who might not be aware of prior assessments. The efficacy reviewer can only assess what is in the dossier provided, so you should include all relevant aspects of the trial and any industry information that will be needed to understand the conduct of the trial and the proposed label claims.
Methods and materials
Explain how the trial was conducted. State if the trial followed protocols previously agreed, or published methodology such as APVMA or internationally accepted guidelines. Detailed or lengthy trial methodology can be included as an appendix to the main report.
Details of the research establishment, the study coordinator, other personnel, the exact location and the dates of commencement and completion of the trial must be recorded and documented in the submission. Provide scientific (Latin binomial) names for any crops, pests, weeds or other organisms discussed. Where necessary, provide relevant meteorological information (rainfall, temperature, humidity, etc.) and agronomic details (soil type, cultivars, crop rotations, etc.) for both local and international studies. These details could be presented in table format with each study or included as an appendix.
Full details of the experimental design (for example, small plot, replicated or randomised complete block design), numbers of replicates and plot sizes should be provided for each study. The following will provide useful information for the reviewer:
- definition of the null (for example, no difference between treatments or products) and alternative hypotheses (for example, the proposed new product is better than the existing product) and the appropriate level of statistical significance for rejecting or accepting the hypothesis (for example, p < 0.05)
- methodology for the statistical or biometrical analysis proposed, statistical justification for numbers of animals or groups used (including the need to replicate), statistical power and confidence level of the data
- standard operating procedures that are study-specific and other records, which may be appended to the submission
- details of controls used as comparisons in the statistical analysis (it is important for the reviewer to be assured that controls have been exposed to similar environmental conditions and study conditions).
For further details on experimental design, refer to the Efficacy experimental design and analysis guideline.
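As an illustration only, the hypothesis framework described above can be sketched in Python. The data below are hypothetical plot-level results invented for this sketch; in practice the test, significance level and data structure must match the study's agreed protocol:

```python
from scipy import stats

# Hypothetical plot-level results (e.g. % weed control) for two treatments
existing = [78.0, 81.5, 76.2, 80.1, 79.3, 77.8]
candidate = [85.2, 88.0, 84.1, 86.7, 87.3, 85.9]

# Null hypothesis: no difference between treatments.
# Alternative: the proposed new product is better (one-sided test).
t_stat, p_value = stats.ttest_ind(candidate, existing, alternative="greater")

alpha = 0.05  # pre-specified significance level for rejecting the null
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < alpha:
    print("Reject the null hypothesis at the 5% significance level.")
```

Note that the significance level and the direction of the alternative hypothesis should be specified before the trial is run, not chosen after inspecting the results.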
Give details of the test product(s) and of any standards or reference products (controls) included in the studies (such as the product name or experimental code, active constituent name and content). This should also include the active constituent concentration and/or formulation details for all products used in the trial.
Provide full details of all treatment applications. It is important that the treatments are consistent with current agricultural practice. Include information such as:
- application rate in terms of product per unit or active constituent per unit (including per unit canopy row)
- date of application
- frequency of applications and application intervals
- details of any other products used, either separately or in combination
- carrier volume (L/ha) and type (for example, diesel or water)
- type of application equipment
- other application details, such as nozzle type, spray quality, pressure (kPa, bar) and boom height
- crop growth stage at application and crop part treated
- pest population or developmental stage or infestation level at time of application.
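To illustrate the relationship between the two rate expressions in the first item above (product per unit versus active constituent per unit), a minimal sketch with hypothetical figures:

```python
def active_rate_g_ha(product_rate_l_ha: float, ai_conc_g_l: float) -> float:
    """Active constituent applied per hectare, from the product application
    rate (L/ha) and the active constituent concentration (g/L)."""
    return product_rate_l_ha * ai_conc_g_l

# Hypothetical example: 2.0 L/ha of a 250 g/L formulation
print(active_rate_g_ha(2.0, 250.0))  # 500.0 g ai/ha
```

Reporting both expressions lets the reviewer cross-check the treatment rates against the proposed label rates regardless of formulation.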
If you use an experimental code, explain how the coded product relates to the proposed product. If more than one formulation has been used, give full formulation details and, where relevant, details of bridging or bioequivalence studies.
All assessment methods should be documented, explained and, where relevant, appropriately referenced.
Include information such as:
- date or interval after application, or interval prior to harvest
- for crop uses, crop growth stage key (for example, Zadoks scale)
- efficacy assessment method (for example, European Weed Research Council rating scale, or number of individuals in the monitoring area counted per 30 seconds)
- sample size (such as 20 tillers per plot or 5 soil cores per plot)
- sample method (for example, random or fixed)
- harvest date
- harvest method (for example, small plot combine)
- harvested commodity measurements (such as weight and moisture content)
- expression of yield (for example, tonne per hectare at 85 per cent dry matter)
- explanation of any abbreviations used.
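As a sketch of the yield-expression item above (tonne per hectare at a standard dry-matter content), the following uses hypothetical plot figures; the conversion simply scales the fresh yield by the ratio of measured to standard dry matter:

```python
def yield_at_standard_dm(plot_weight_kg: float, plot_area_m2: float,
                         moisture_pct: float, standard_dm_pct: float = 85.0) -> float:
    """Convert a plot harvest weight to t/ha at a standard dry-matter content."""
    fresh_t_ha = (plot_weight_kg / plot_area_m2) * 10.0  # kg/m2 -> t/ha
    actual_dm_pct = 100.0 - moisture_pct
    return fresh_t_ha * actual_dm_pct / standard_dm_pct

# Hypothetical example: 14.4 kg harvested from a 24 m2 plot at 12% moisture
print(round(yield_at_standard_dm(14.4, 24.0, 12.0), 2))  # t/ha at 85% DM
```

Stating the standard dry-matter (or moisture) basis alongside every yield figure allows results from different trials to be compared directly.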
For crop protection products, you should provide yield data for all treatments (including standards and controls) to demonstrate the effect of the product on yield. A justification should be provided if yield data are not included.
All relevant data should be presented and reported. Providing results from a ‘typical sample’ is not acceptable. When only means, percentages or other summaries of results are given in the results section of a report, all ‘raw’ data should be provided in an appendix.
If negative or unusual results have been recorded, they must be included together with a discussion (below) about how or why they may have occurred.
You should conduct a statistical analysis of the results whenever relevant or if you are required to demonstrate differences or equivalence between treatments.
Provide full details of the statistical methods used, including a justification or validation of the method chosen, and explain how the underlying assumptions of that method have been met. If you have not carried out a statistical analysis, give your reasons. The results of the statistical analysis (for example, degrees of freedom, F-values and p-values) should be presented in table format with each study; if there are many such analyses, they could be provided as an appendix. Novel statistical analyses submitted in support of experimental data should be accompanied by the raw data and the published literature that describes the statistical technique.
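The degrees of freedom, F-values and p-values mentioned above can be produced with any standard statistical package. A minimal one-way ANOVA sketch in Python, using hypothetical replicate yields (an analysis of a randomised complete block design would additionally fit a block effect):

```python
from scipy import stats

# Hypothetical replicate yields (t/ha) for three treatments
control   = [3.1, 3.3, 2.9, 3.2]
product_a = [3.8, 4.0, 3.7, 3.9]
product_b = [3.6, 3.5, 3.8, 3.4]

f_stat, p_value = stats.f_oneway(control, product_a, product_b)
df_between, df_within = 2, 9  # k - 1 and N - k for 3 groups of 4 replicates
print(f"F({df_between},{df_within}) = {f_stat:.2f}, p = {p_value:.4f}")
```

Reporting the test statistic together with its degrees of freedom, rather than the p-value alone, lets the reviewer verify the analysis.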
If a study produces many separate results (for example, from different treatments and assessment times), the data are best presented in a table or matrix format. This allows a quick and easy comparison of results.
If graphs or other methods of presentation are used, they should be appropriately labelled with measurement details, including the relevant units. The type of presentation of results should be similar to that expected in a peer-reviewed journal. Original or raw data should be included and may be submitted as an appendix to the report.
Any abbreviations or indications of statistically significant results used in a table must be explained as notes to the table. Examples of useful table formats are shown below in Table 1 and Table 2.
Table 1: Example results table

|Treatment|Result prior to treatment (unit of measurement)|Result after treatment (unit of measurement)|
|---|---|---|
|Product A, rate 1|11a|5b|
|Product A, rate 2|13a|1a|
|No treatment control|12a|11c|

Note: Results with different letters within a column denote a statistically significant difference.
Table 2: Example results table showing a measurement of two related results (for example, % control/g weight loss) at each assessment time

|Treatment|Day 0|1 DAT|7 DAT|28 DAT|
|---|---|---|---|---|

Note: DAT = days after treatment.
For further examples of how to tabulate results, refer to Regulatory Directive DIR 2003-04, Efficacy guidelines for plant protection products, Pest Management Regulatory Authority, Health Canada.
The report should include text that highlights the important results or significant outliers, including any significant differences that are important to testing the study hypothesis. Descriptions of results should be concise, and the interpretation of the results should be left to the discussion and conclusions section.
Discussion and conclusions
Each trial should be appropriately analysed, the results interpreted and a relevant conclusion about the purpose and hypotheses of the study stated. Conclusions must be clear, specific and, wherever possible, related to the proposed use of the product as instructed on the product label (for example: ‘The results of the trial support a label use rate of x g/100 L when applied via ground application equipment; another application may be required after 14 days if monitoring indicates pest numbers will exceed thresholds, etc.’). Do not present conclusions that are not relevant to the proposed label claims.
Discuss any unusual or unexpected results and, if possible, explain how they occurred. It is not unusual for an efficacy dossier to include trial reports in which there is considerable variability in trial results. Issues such as unusually low or high pest abundance or other confounding factors (for example, weather effects or soil types) should be discussed.
Variability of results may lead to label instructions that advise users of the possibility of variable results and how to avoid them. This section could also discuss the integration of the proposed product with current pest management practices, including economic action thresholds for the proposed labelled pest or pathogen, to evaluate the likelihood of repeated use.
Appendixes can include raw data, detailed statistical analyses and any other details that are important in supporting the report but that are not needed in the body of the report.
Additional information and references
Full copies of any reports, studies or other supporting information referenced in the report should be provided if they are used to justify a claim. They should be logically ordered in terms of their purpose and listed in a table of contents for easy reference.