Australia implemented one of the world’s first health technology assessment (HTA) programs in 1982.1-3 The government body now responsible for managing Australia’s HTA program is the Medical Services Advisory Committee (MSAC). The MSAC comprises 22 members drawn from specialist health care, health administration, consumer representation and academic health disciplines (epidemiologists, health economists, etc), who work together to fulfil the MSAC’s terms of reference:4
- Advise the federal Minister for Health and Ageing on the strength of evidence pertaining to new or emerging medical technologies and procedures in relation to their safety, effectiveness (clinical), and cost-effectiveness, and under what circumstances public funding should be supported;
- Advise the Minister on which new medical technologies and procedures should be funded on an interim basis, to allow further data to be assembled to determine their safety, effectiveness and cost-effectiveness;
- Advise the Minister on references relating to new or existing medical technologies and procedures; and
- Undertake HTA work referred by the Australian Health Ministers’ Advisory Council.
Further details including current membership and administrative arrangements are available on the MSAC website (http://www.msac.gov.au).
Although there is an expanding body of literature examining evaluation methods or the results of their application,3,5 little is known about the methods used to produce the evidence that informs policy when new medical technologies are introduced to the Medicare Benefits Schedule (MBS), or about the quality of the methods used to undertake HTAs for the MSAC. Here, we examine the methods used to produce the section of MSAC HTA reports that reviews the effectiveness of the technology or procedure.
An HTA commissioned by the MSAC takes the form of a systematic review. The review itself is undertaken by an independent group contracted by the MSAC under the guidance of an advisory panel formed by the MSAC, whose membership may comprise any combination of clinical experts (nominated by relevant clinical colleges), consumers of the technology, epidemiologists and health economists. On completion of the review, the MSAC can make one of three recommendations to the Minister for Health and Ageing: to fund the technology or procedure without restriction; to provide interim funding and have further evaluation take place at some future point; or to not fund the technology or procedure. This process provides a formal means by which new and emerging technologies and procedures may gain entry to the MBS. Once a technology or procedure has been listed on the MBS, a proportion of its cost will be met by government reimbursement.3
The Australian National Health and Medical Research Council (NHMRC) has provided much of the methodology for evaluating new technologies, with formalised levels of evidence that reflect susceptibility to bias within particular study designs and by which the validity of primary studies can be assessed.6
To be eligible for inclusion in this evaluation, HTAs had to:
- be accessible in full text on the MSAC website;
- have a published recommendation regarding the decision to either fund or not fund the technology or procedure through the MBS;
- review the effectiveness of a treatment; and
- be based on an application.
HTAs were excluded if:
- the full text of the HTA was not available on the MSAC website;
- the HTA contained commercial-in-confidence information that prevented publication in full on the MSAC website;
- the HTA did not review the effectiveness of a treatment; or
- the HTA was based on a reference from the Department of Health and Ageing rather than an application.
Data were extracted from the HTA reports by two investigators using a standardised data extraction form. The extracted covariates were predominantly based on the criteria developed by Busse et al to assess the quality of HTA reports;7 however, we restricted our data extraction to the section of each HTA report dealing with effectiveness of the technology or procedure. Details of the data extracted are shown in Box 1. For most of the covariates, we simply assessed whether the variable had been reported in the HTA (yes, no or unclear). Disagreements between the reviewers were resolved by consensus, with a third reviewer available (although not required) for adjudication.
Of the 56 available HTA reports of applications to the MSAC, 31 met the inclusion criteria. Box 2 summarises the selection process for HTAs and the reasons for exclusion. Of the 31 included HTAs, six were published in 1999, five in 2000, six in 2001, six in 2002, six in 2003 and two in 2005. Results of the data extraction are shown in Box 1. Sixteen of the 31 technologies or procedures assessed in these HTAs were recommended for funding, either on a permanent or interim basis.
In most reports, the type of evidence included in the effectiveness section was classified according to a modified version of the NHMRC designation of levels of evidence.6 Level I evidence (a systematic review) was included in eight HTAs. Level II evidence (randomised controlled trials) was included in 12 HTAs, with the number of randomised controlled trials in each of these ranging from one to 16. Level III-1 evidence (pseudo-randomised controlled trials) was included in only three HTAs, and level III-2 evidence (comparative studies with concurrent controls and allocation not randomised, cohort studies, case–control studies, or interrupted time series with a control group) was similarly rare, being included in only four HTAs. Level IV evidence (case series, either post-test or pre-test/post-test) was included in half of all HTAs (16/31). Notably, the level of evidence of some included studies was not reported in five HTAs.
Eighteen HTAs provided a validity assessment of included studies with explicit details of the methods used. The methods used were unclear in nine HTAs, and for the remaining reports we assumed that no validity assessment had been undertaken. The most commonly reported method of assessing the validity of included studies, used in five HTAs, was that of the Cochrane handbook for systematic reviews of interventions;8 other methods used in at least two reports were the Centre for Reviews and Dissemination handbook,9 the Quality of Reporting of Meta-analyses (QUOROM) checklist10 and the assessment methods proposed by Schulz et al11 and Greenhalgh.12 In total, at least seven different validity appraisal methods were used across these reports. In other cases, it was difficult to determine whether formal validity assessments had been undertaken: although there were elements of validity assessment in the reports, the exact methods used were not stated. Both the descriptive elements and the results of included studies were presented in tables and text in 30 HTAs. Owing to a reported lack of appropriate studies, meta-analysis was performed in only three reports.
Conflict of interest is of concern because of the perception that it could lead to unreasonable bias in an HTA report.13 Beyond the recording of such information by the MSAC,9,14 neither the published HTA reports nor any publicly available documentation details how conflicts of interest are handled. Other international HTA agencies have provided greater detail of how conflicts of interest are both identified and recorded.15,16
In undertaking our study we used the criteria of Busse et al, which were published in 2002.7 Most of the HTAs we evaluated were conducted before these criteria were published, and indeed there is some evidence that reporting is improving, with an increase in the number of reports including details of the methods used over the time period examined. Optimal search strategies for primary studies to be included in the effectiveness and cost-effectiveness sections of HTAs have been suggested and may prove useful in future reports as a minimum standard to which report authors could adhere.17 Problems with reporting of systematic reviews and HTAs have been noted previously. A study by Olsen et al found that 29% of new Cochrane systematic reviews published in 1998 had major problems,18 and the authors specifically highlighted three areas of concern: that the evidence did not support the conclusions; that the conduct or reporting of reviews was unsatisfactory; and that reviews had stylistic concerns. A study conducted in Canada, which evaluated the conduct of reports from four HTA agencies in that country, found similar results to our study in that almost half of all reports failed to specify the methods used.19
The MSAC has produced its own guidance for evaluators undertaking HTAs,14 and this has recently been updated to include information on undertaking reviews of diagnostic and screening technologies.20 Other organisations such as the Cochrane Collaboration and the QUOROM group have produced guidance on the undertaking and reporting of systematic reviews and meta-analyses.8,10
Box 1: Data extracted from 31 health technology assessment (HTA) reports produced for the Medical Services Advisory Committee, 1999–2005, based on criteria developed by Busse et al7 (table not reproduced)
- Emily S Petherick1
- Elmer V Villanueva2
- Jo Dumville1
- Emma J Bryan3
- Shyamali Dharmage4
- 1 Department of Health Sciences, University of York, York, UK.
- 2 Department of Rural and Indigenous Health, Monash University, Moe, VIC.
- 3 Monash Institute of Health Services Research, Monash University, Melbourne, VIC.
- 4 School of Population Health, University of Melbourne, Melbourne, VIC.
We thank Ms Alexandra Raulli, Dr Alison Orrell and Professor Nicky Cullum for their positive encouragement and feedback on the manuscript.
Emily Petherick, Elmer Villanueva and Emma Bryan were authors on several of the MSAC HTAs evaluated in this study. No funding was received to carry out this study.
- 1. Goodman C. An introduction to health technology assessment. Falls Church, Va: The Lewin Group, 1998.
- 2. Hailey D. Health care technology in Australia. Health Policy 1994; 30: 23-72.
- 3. Hailey DM. Health technology assessment in Australia: a need to re-focus. J Qual Clin Pract 1996; 16: 123-129.
- 4. Medical Services Advisory Committee. What is MSAC? http://www.msac.gov.au/internet/msac/publishing.nsf/Content/what-is-1 (accessed Mar 2007).
- 5. May C, Mort M, Williams T, et al. Health technology assessment in its local contexts: studies of telehealthcare. Soc Sci Med 2003; 57: 697-710.
- 6. National Health and Medical Research Council. How to use the evidence: assessment and application of scientific evidence. Canberra: NHMRC, 2000.
- 7. Busse R, Velasco M, Perleth M, et al. Best practice in undertaking and reporting health technology assessments. Working group 4 report. Int J Technol Assess Health Care 2002; 18: 361-422.
- 8. Higgins J, Green S, editors. Cochrane handbook for systematic reviews of interventions. Version 4.2.6. The Cochrane Library, Issue 4, 2006. Chichester, UK: John Wiley & Sons, Ltd.
- 9. Centre for Reviews and Dissemination. Undertaking systematic review of research on effectiveness: CRD’s guidance for those carrying out or commissioning reviews. York: CRD, 2001. http://www.york.ac.uk/inst/crd/report4.htm (accessed Jul 2007).
- 10. Moher D, Cook DJ, Eastwood S, et al. Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement. Quality of Reporting of Meta-analyses. Lancet 1999; 354: 1896-1900.
- 11. Schulz KF, Chalmers I, Hayes RJ, Altman DG. Empirical evidence of bias. Dimensions of methodological quality associated with estimates of treatment effects in controlled trials. JAMA 1995; 273: 408-412.
- 12. Greenhalgh T. Assessing the methodological quality of published papers. BMJ 1997; 315: 305-308.
- 13. Hailey D. Towards transparency in health technology assessment: a checklist for HTA reports. Int J Technol Assess Health Care 2003; 19: 1-7.
- 14. Medical Services Advisory Committee. Funding for new medical technologies and procedures: application and assessment guidelines. Canberra: MSAC, 2000.
- 15. Medical Advisory Secretariat, Ontario. Technologies for osteoarthritis of the knee. Integrated health technology policy assessment. October 2005. https://ospace.scholarsportal.info/bitstream/1873/1875/1/259170.pdf (accessed Jul 2007).
- 16. NHS Research & Development. The HTA programme. The principles underlying the work of the National Coordinating Centre for Health Technology Assessment. March 2007. http://www.ncchta.org/sundry/probity.pdf (accessed Jul 2007).
- 17. Centre for Reviews and Dissemination. Finding studies for systematic reviews: a checklist for researchers. York: CRD, 2006. http://www.york.ac.uk/inst/crd/revsrch.doc (accessed Jul 2007).
- 18. Olsen O, Middleton P, Ezzo J, et al. Quality of Cochrane reviews: assessment of sample from 1998. BMJ 2001; 323: 829-832.
- 19. Menon D, Topfer LA. Health technology assessment in Canada. A decade in review. Int J Technol Assess Health Care 2000; 16: 896-902.
- 20. Medical Services Advisory Committee. Guidelines for the assessment of diagnostic technologies. Canberra: MSAC, 2005.
Abstract
Objective: To examine the methods used in health technology assessments (HTAs) produced for the Medical Services Advisory Committee (MSAC) reviewing the effectiveness of a technology or procedure.
Design and setting: Data were extracted from the effectiveness section of HTA application assessment reports published between 1 January 1998 and 17 July 2006 and available on the MSAC website. Only HTAs reviewing the effectiveness of interventions were examined, as the methods used to undertake such reviews are well established.
Main outcome measures: Variables reflecting methods used in the HTAs to evaluate the effectiveness of health technologies or procedures.
Results: Of 56 MSAC HTA reports available, 31 met the inclusion criteria. Considerable variability existed across the HTAs in both quality indicators and the methods used. Reports did not describe potential conflicts of interest of participants. The majority of reports (19/31) did not formally state the research question the assessment was attempting to answer. Just over half of the reports (18/31) provided details of validity assessment of the included studies.
Conclusions: Minimum, consistent standards of methodology and reporting, based on international best-practice recommendations, are required in Australian HTAs to increase the transparency and applicability of these reports.