Preparing a study report (Efficacy and safety)

Applicants planning and conducting efficacy and crop safety trials should consider how the trial data will be collected, documented and prepared for presentation to the APVMA to support an application to register or vary a product. This guideline provides basic advice about the presentation of study reports that may be included in an efficacy dossier. You may use this advice and accompanying template, but are also free to use any alternative format. If you use an alternative format, you are responsible for ensuring that your template addresses all the features described below so that the report can be easily reviewed and understood by a reviewer.

This guideline should be read in conjunction with other APVMA guidelines, including any relevant product-specific guidelines and the guidelines on preparing an efficacy dossier and on experimental design and analysis. For example, particular features of a trial, such as sampling parameters and test protocols, may need to be included and reported for certain types of products. See ‘Demonstrating efficacy of pool and spa sanitisers’ for more information.

Further details about each section of the report template are also provided to assist applicants preparing reports.

Figure 1: Study report template

Study/Trial identification
Quality assurance statement
Study summary/Executive summary/Abstract
Scope, introduction and background
Methods and materials
  • Study particulars
  • Experimental design
  • Treatments
  • Assessment methods
Results
  • Raw data
  • Statistical analysis
Discussion and conclusions
Appendixes
Additional information/References

Guidance on report features

Study identification

Include an identifying study or report number that is unique to the report. This is especially important when there are a number of trials following the same protocol or with similar titles, as they can be easily confused, particularly during the creation of the data list and over the life of the application.

Provide the study or report title and the names of the primary author(s), editors and any other people involved, and details of their roles in the study. Include the date the report was finalised (for example, approved for release by the person responsible).

Study summary, executive summary or abstract

Provide a brief summary of the investigation, including a statement of the objective, general methods, results and conclusions. If the study was conducted for the purpose of registering a product in Australia, it should include reference to how the results support the product’s efficacy and proposed label instructions for use. Any constraints, such as economic, meteorological or animal welfare issues, should also be detailed.

Scope, introduction and background

The purpose of the study should be briefly stated to make it clear and set the scene for the reviewer. If the study is introducing new technology, methods, other unusual matters or different designs, including a brief explanation in the introduction and providing the details in the relevant sections would be appropriate. Alternatively, if the particular design has been used successfully in previous APVMA applications, it may be useful to provide some background on this for the reviewer, who might not be aware of past assessments. It may be of benefit to include the previous reviewer’s report as an attachment. The efficacy reviewer can only assess what is in the dossier provided, so you should include all relevant aspects of the trial and any ‘industry’ information that will be needed to understand the conduct of the trial and the particular use pattern.

Methods and materials


Trial results and the conclusions drawn from them have little value if the report does not adequately explain how the trial was conducted. If the trial followed protocols previously agreed with us, or published methodology such as APVMA or internationally accepted guidelines, this should be stated and explained where necessary. If the trial methodology is detailed or lengthy, it can be included as an appendix to the main report; provide only the main points in the body of the report.

Study particulars

Details of the research establishment, the study coordinator, other personnel, the exact location and the dates of commencement and completion of the trial must be recorded and documented in the submission. Provide scientific (Latin binomial) names for any crops, pests, weeds or other organisms discussed (common names, though useful, are imprecise and too variable to be used by themselves). Where necessary, provide relevant meteorological information (rainfall, temperature, humidity, etc.) and agronomic details (soil type, cultivars, crop rotations, etc.) for both local and overseas studies. These details could be presented in table format with each study and may be included as an appendix.

Experimental design

Full details of the experimental design (for example, small plot, replicated, randomised complete block design), numbers of replicates and plot sizes should be provided for each study. The following will provide useful information for the reviewer:

  • definition of the null hypothesis (for example, no difference between treatments or products) and the alternative hypothesis (for example, the proposed new product is better than the existing product), together with the level of statistical significance required to reject the null hypothesis (for example, p < 0.05)
  • methodology for the statistical or biometrical analysis proposed, statistical justification for numbers of animals or groups used (including the need to replicate), statistical power and confidence level of the data
  • standard operating procedures that are study-specific and other records, which may be appended to the submission
  • details of controls used as comparisons in the statistical analysis (it is important for the reviewer to be assured that controls have been exposed to similar environmental conditions and study conditions).

For further details on experimental design, refer to the Efficacy experimental design and analysis guideline.
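As a sketch of the hypothesis-testing approach described above, the following Python fragment analyses a small randomised complete block design. The treatment names, yield figures and the 0.05 significance level are illustrative assumptions, not data from any real trial.

```python
# Illustrative analysis of a small randomised complete block design (RCBD).
# All yield figures below are made-up example numbers.
import numpy as np
from scipy import stats

# rows = treatments (e.g. Product A rate 1, Product A rate 2, Product B, untreated control)
# columns = blocks (replicates)
yields = np.array([
    [5.1, 4.8, 5.3],
    [5.6, 5.2, 5.9],
    [4.9, 4.7, 5.0],
    [3.8, 3.5, 3.9],
])
t, b = yields.shape                      # number of treatments, number of blocks
grand = yields.mean()

# Partition the total sum of squares into treatment, block and error components
ss_total = ((yields - grand) ** 2).sum()
ss_treat = b * ((yields.mean(axis=1) - grand) ** 2).sum()
ss_block = t * ((yields.mean(axis=0) - grand) ** 2).sum()
ss_error = ss_total - ss_treat - ss_block

df_treat, df_block = t - 1, b - 1
df_error = df_treat * df_block

ms_treat = ss_treat / df_treat
ms_error = ss_error / df_error
f_value = ms_treat / ms_error
p_value = stats.f.sf(f_value, df_treat, df_error)

print(f"F({df_treat}, {df_error}) = {f_value:.2f}, p = {p_value:.4f}")
# Reject the null hypothesis (no difference between treatments) when p < 0.05.
```

A report would present the resulting degrees of freedom, F-value and p-value in the study's results table, alongside the justification for the design and the chosen significance level.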


Treatments

Give details of the test product(s) and of any standards or reference products (controls) included in the studies (such as the product name or experimental code, active constituent name and content). This should also include the active constituent level and/or formulation details of the reference product and any other products used in the trial.

Provide full details of all treatment applications. It is important that the treatments are consistent with current agricultural practice. This could include information such as:

  • application rate in terms of product per unit or active constituent per unit (including per unit canopy row)
  • frequency of applications and application intervals
  • details of any other products used, either separately or in combination
  • carrier volume (L/ha) and type (for example, diesel or water)
  • type of application equipment
  • other application details, such as nozzle type, spray quality, pressure (kPa, bar) and boom height
  • date of application
  • crop growth stage at application and crop part treated
  • pest population or developmental stage or infestation level at time of application.
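As a minimal illustration of the first bullet, a product rate can be expressed as an active-constituent rate when the product's active content is known. The function name, rate and concentration below are hypothetical figures for the sketch, not values from any registered product.

```python
# Hypothetical conversion between product rate and active-constituent rate.

def active_g_per_ha(product_rate_l_ha: float, active_g_per_l: float) -> float:
    """Active constituent applied (g/ha) for a given product rate (L/ha)."""
    return product_rate_l_ha * active_g_per_l

# 2.0 L/ha of a product containing 250 g/L active constituent:
print(active_g_per_ha(2.0, 250.0))  # prints 500.0 (g active/ha)
```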

If you use an experimental code, explain how the coded product relates to the proposed product. If more than one formulation has been used in development studies, give full formulation details and, where relevant, details of bridging or bioequivalence studies. If earlier formulations of the product or other products containing the same active constituent(s) are cited in supporting evidence, explain the relevance of this evidence to the current formulation.

Assessment methods

All assessment methods should be documented, explained and, where relevant, appropriately referenced.

Examples of critical information that could be documented include:

  • date or interval after application, or interval prior to harvest
  • for crop uses, crop growth stage key (for example, Zadoks scale)
  • efficacy assessment method (for example, European Weed Research Council rating scale, or number of individuals in the monitoring area counted per 30 seconds)
  • sample size (such as 20 tillers per plot or 5 soil cores per plot)
  • sample method (for example, random or fixed)
  • harvest date for crop applications
  • harvest method (for example, small plot combine)
  • harvested commodity measurements (such as weight and moisture content)
  • expression of yield (for example, tonne per hectare at 85 per cent dry matter)
  • explanation of any abbreviations used.
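As one illustration of expressing yield at a standard dry-matter content (for example, tonnes per hectare at 85 per cent dry matter), the following hypothetical helper performs the conversion; the function name and the sample figures are assumptions for the sketch.

```python
# Hypothetical helper: standardise fresh plot yields to a common dry-matter basis.

def yield_at_dry_matter(fresh_t_ha: float, moisture_pct: float,
                        target_dm_pct: float = 85.0) -> float:
    """Convert a fresh yield (t/ha) at a measured moisture content to the
    equivalent yield at the target dry-matter percentage."""
    dry_matter_t_ha = fresh_t_ha * (100.0 - moisture_pct) / 100.0
    return dry_matter_t_ha / (target_dm_pct / 100.0)

# A 6.0 t/ha fresh sample harvested at 14 per cent moisture:
adjusted = yield_at_dry_matter(6.0, 14.0)
print(f"{adjusted:.2f} t/ha at 85% dry matter")  # prints: 6.07 t/ha at 85% dry matter
```

Reporting all yields on the same dry-matter basis lets the reviewer compare treatments directly, regardless of moisture content at harvest.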

For crop protection products, you should provide yield data for all treatments (including standards and controls) to demonstrate the effect of the product on yield. An acceptable argument or explanation should be provided if yield data are not included.



Results

All relevant data should be presented and reported. Providing results from a ‘typical sample’ is not acceptable. When only means, percentages or other presentations of results are given in the results section of a report, all ‘raw’ data should be provided in an appendix.

If negative or unusual results have been recorded, they must be included together with a discussion (below) about how or why they may have occurred. This information can help in determining optimal application conditions.

Statistical analysis

You should conduct a statistical analysis of the results whenever relevant or if you are required to demonstrate differences or equivalence between treatments.

Provide full details of the statistical methods used, including a justification or validation for the method chosen and an explanation of how the underlying assumptions of the method have been met. If no statistical analysis has been carried out, give the reasons. The results of the statistical analysis (for example, degrees of freedom, F-values and p-values) should be presented in table format with each study; if there are many such analyses, they could be provided as an appendix. Any novel statistical analysis submitted in support of experimental data should be accompanied by the raw data and the published literature describing the statistical technique.


Presentation of results

If a study produces many separate results (for example, from different treatments and assessment times), the data are best presented in a table or matrix format. This allows a quick and easy comparison of results.

If graphs or other methods of presentation are used, they should be appropriately labelled with measurement details, including the relevant units. The type of presentation of results should be similar to that expected in a peer-reviewed journal. Original or raw data should be included and may be submitted as an appendix to the report.

Any abbreviations or indications of statistically significant results used in a table must be explained as notes to the table. Examples of useful table formats are shown below in Table 1 and Table 2.

Table 1: Example of a simple comparison of before and after treatments
Treatment               Result prior to treatment    Result after treatment
                        (unit of measurement)        (unit of measurement)
Product A, rate 1       11a                          5b
Product A, rate 2       13a                          1a
Product B               12a                          2a
Product C               13a                          1a
No treatment control    12a                          11c

Note: Results with different letters within a column denote a statistically significant difference.

Table 2: Example of a more complex comparison of multiple results per treatment
Treatment       Measurement of two related results
                (for example, % control / g weight loss)
                Day 0       1 DAT       7 DAT       28 DAT
A1              %  g        %  g        %  g        %  g
A2              %  g        %  g        %  g        %  g
B               %  g        %  g        %  g        %  g
C               %  g        %  g        %  g        %  g
Nil control     %  g        %  g        %  g        %  g

DAT = days after treatment.

For further examples of how to tabulate results, refer to Regulatory Directive DIR 2003-04, Efficacy guidelines for plant protection products, Pest Management Regulatory Authority, Health Canada.
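A matrix like Table 2 can be generated from long-format raw data. The sketch below uses pandas to pivot such data into a treatment-by-assessment-time table; the treatment names and values are made up for illustration.

```python
# Sketch: pivot long-format raw results into a treatment-by-assessment-time
# matrix, as in Table 2. All values are invented example data.
import pandas as pd

raw = pd.DataFrame({
    "treatment": ["A1", "A1", "B", "B", "Nil control", "Nil control"],
    "days_after_treatment": [7, 28, 7, 28, 7, 28],
    "percent_control": [82.0, 75.0, 64.0, 51.0, 0.0, 0.0],
})

# One row per treatment, one column per assessment time
matrix = raw.pivot(index="treatment",
                   columns="days_after_treatment",
                   values="percent_control")
print(matrix)
```

Keeping the raw data in long format (one row per observation) in the appendix, and pivoting it for presentation, makes both the raw data and the summary tables easy to check against each other.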

The report should include text that highlights the important results or significant outliers, including any significant differences that are important to testing the study hypothesis (or the lack of them). Ideally, this should be located close to the relevant table and include a reference to the table. Descriptions of results should be concise, and the discussion or interpretation of the results should be left to the discussion and conclusions section (see Discussion and conclusions).

Discussion and conclusions

Each trial should be appropriately analysed, the results interpreted and a relevant conclusion about the purpose and hypotheses of the study stated. Conclusions must be clear, specific and, wherever possible, related to the proposed use of the product as instructed on the product label (for example: ‘The results of the trial support a label use rate of x g/100 L when applied via ground application equipment; another application may be required after 14 days if monitoring indicates pest numbers will exceed thresholds, etc.’). Do not present conclusions that are not relevant to the purpose of the study.

This section is where any unusual or unexpected results should also be discussed. If possible, explain how they occurred. It is not unusual for an efficacy dossier to include trial reports in which there is considerable variability in trial results. Issues such as unusually low or high pest abundance or other confounding factors (for example, weather effects or soil types) may need to be discussed to allow a proper interpretation of the results.

We might not consider variability of results negatively if the variability can be adequately explained. Variability of results may lead to label instructions that advise users of the possibility of variable results and how they may avoid them. This section could also discuss the integration of the proposed product with current pest management practices. Economic action thresholds for specific crops should be detailed to allow the assessor to evaluate the likelihood of repeated use.



Appendixes

Appendixes can include raw data, detailed statistical analyses and any other details that are important in supporting the report but that are not needed in the body of the report.

Additional information and references

Full copies of any reports, studies or other supporting information referenced in the report should be provided if they are used to justify a claim. They should be logically ordered in terms of their purpose and listed in a table of contents for easy reference.
