The recent pandemic (H1N1) 2009 influenza outbreak has highlighted the importance of timely surveillance data for monitoring epidemiological trends and guiding public health control measures.1 High-quality surveillance data are needed to gauge the timing and peak of the influenza season, and collecting them is an important pandemic preparedness activity.
Laboratory-confirmed influenza became notifiable in most Australian states and territories in 2001,2 and is now nationally notifiable. In theory, it should now be possible to compare influenza activity across the country. States and territories also conduct surveillance for influenza-like illness (ILI) during the influenza season using sentinel sites. Results from one such system have provided important early findings about pandemic (H1N1) 2009 influenza.1 However, the type of data and the way they are collected vary throughout the nation, resulting in a fragmented surveillance system.3 Comprehensive sentinel systems require committed general practitioners and concerted efforts to establish and maintain. Australia should aspire to having a uniform, national sentinel surveillance system, although funding and long-term maintenance issues would need to be addressed. Alternative methods of capturing influenza information include the online survey “FluTracking”4 and Google’s “Flu Trends”.5 The former currently lacks national coverage and neither system incorporates laboratory confirmation, meaning “false alarms” caused by other respiratory viruses may occur.
According to national surveillance data of laboratory-confirmed influenza, Queensland has had the most severe influenza seasons of all Australian states and territories in recent years (Box, A).7,8 Australian data available at Google’s Flu Trends do not support differences among states in ILI activity,5 and there is no clear reason why Queensland should consistently experience influenza effects disproportionate to those in other states. It is likely, therefore, that this finding reflects information bias.
Over time, all three laboratories increased influenza testing (Box, B), but with different patterns. Queensland had the highest number of tests each year, with a consistent increase over the 5 years, while both Victoria and WA showed slower growth but stepwise increases in 2007. A severe influenza season in 2007 saw deaths reported in healthy children across the country,9 including from Queensland10 and three early season deaths of young children from WA.11 In that year, all three laboratories reported increased numbers of laboratory-confirmed cases of influenza (Box, A). The consistently higher and increasing test numbers in Queensland may be due to several factors, including active promotion by public health authorities of influenza testing by GPs,12 increased use of point-of-care testing, and widespread availability of highly sensitive molecular testing, with rapid turnaround, at both public and private laboratories.
Each state’s data show a concordance between the amount of testing performed and the number of positive results (Box, A and B). One method to compensate for the impact of testing behaviour is to calculate the proportion of positive results, reducing the influence of testing volume on absolute counts. Regular calculation of this value shows a remarkable correlation between the timing and peak of the season at each of the laboratories (Box, C); the correlation is independent of variations in testing (Box, B). Source of notification — inpatient, outpatient, or sentinel surveillance — was not available for each laboratory, but where it was, removing sentinel specimens made no difference to the conclusions drawn (data not shown).
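The calculation described above is deliberately simple: positives divided by total tests for each reporting period. A minimal sketch, using hypothetical monthly counts (the figures below are illustrative only, not the laboratories' actual data):

```python
# Hypothetical monthly counts: month -> (influenza-positive tests, total tests).
monthly_counts = {
    "2007-06": (40, 400),
    "2007-07": (150, 600),
    "2007-08": (90, 500),
}

def proportion_positive(positive: int, total: int) -> float:
    """Proportion of tests that were positive for influenza.

    Returns 0.0 when no tests were performed, to avoid division by zero.
    """
    return positive / total if total else 0.0

for month, (pos, total) in monthly_counts.items():
    print(f"{month}: {proportion_positive(pos, total):.1%}")
```

Because the denominator (total tests) rises and falls with the numerator when testing behaviour changes, the ratio is far less sensitive to testing volume than raw notification counts are.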
There were different patterns in the proportion of annual state notifications that each laboratory provided (Box, D).13 QH laboratories had the highest increase in testing, but their contribution to state notifications was low and flat during the period, consistent with substantial and increasing testing in private laboratories. VIDRL’s contribution was initially high but fell quickly, then stepwise, throughout the 5 years, suggesting an increasing role of other public and private providers. PathWest provided about half of the WA notifications early in the period, but contributed more than half in 2007, probably due to increased testing related to high influenza activity and community concern fuelled by the childhood deaths.
The proportion of positive test results from sentinel practice surveillance samples has been used for monitoring overseas, including by the United Kingdom’s Health Protection Agency14 and the European Influenza Surveillance Network.15 However, such data are available only where a sentinel surveillance system is in place. The proportion of positive test results was used recently in Victoria to describe the influenza season for the pandemic A(H1N1) 2009 outbreak,1 and in the United States to examine influenza vaccine effectiveness for preventing death in older people, using a 10% cut-off to define the season.16 Further, a Canadian paper defined periods of peak influenza activity as those months when the percentage of positive test results exceeded 7%,6 giving a mean influenza season of 3 months. Using this measure, duration of annual influenza seasons in Queensland, Victoria and WA ranged from 2 to 4 months (Box, C).
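The threshold approach described above reduces to counting the months whose proportion positive exceeds a cut-off. A sketch of this, using the 7% cut-off from the Canadian paper and hypothetical monthly proportions (the values below are illustrative, not data from any of the three laboratories):

```python
# Threshold-based season definition: months where the proportion of
# positive tests exceeds the cut-off are counted as the influenza season.
THRESHOLD = 0.07  # 7% cut-off, per the Canadian definition cited in the text

# Hypothetical monthly proportions of positive influenza tests.
monthly_proportions = {
    "May": 0.02, "Jun": 0.05, "Jul": 0.12,
    "Aug": 0.20, "Sep": 0.09, "Oct": 0.04,
}

season_months = [m for m, p in monthly_proportions.items() if p > THRESHOLD]
print(season_months)       # → ['Jul', 'Aug', 'Sep']
print(len(season_months))  # → 3 (season duration in months)
```

Changing `THRESHOLD` to 0.10 would reproduce the stricter 10% cut-off used in the US vaccine-effectiveness study; the sensitivity of the derived season length to this choice is one reason the cut-off should be stated whenever the measure is reported.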
Calculating the proportion of test results that are positive would reduce bias caused by variation in the number of tests performed, but this value may not be completely free of bias itself. For example, in many laboratories, including our own,17,18 specimens submitted for any respiratory virus polymerase chain reaction (PCR) test are subjected to a panel of assays. This means that during an outbreak of another respiratory virus, such as respiratory syncytial virus, influenza testing (along with testing for all viruses on the panel) may increase, inflating the denominator and lowering the proportion of influenza-positive results. This reduction, caused by a testing artefact, would be misleading, but such a bias, we argue, would need to be persistent, strong and go unrecognised to cause problems as serious as those caused by interpreting notification data that lack negative test counts.
Complex models have been suggested for monitoring influenza surveillance data in real time.14 There are myriad ways influenza surveillance could be improved in Australia, such as implementing a uniform, nationwide, laboratory-supported sentinel surveillance scheme based in general practices and hospitals, and reporting influenza-related mortality in key age groups. We strongly support such proposals, but implementing negative-test result reporting should not be deferred while other reforms are being considered.
Influenza testing by laboratory, 2004–2008
A: monthly number of positive tests; B: monthly number of tests performed; C: monthly proportion of positive tests; D: proportion of annual state notifications provided by each laboratory.
- Stephen B Lambert1,2
- Cassandra E Faux1,2
- Kristina A Grant3
- Simon H Williams4
- Cheryl Bletchly5
- Michael G Catton6
- David W Smith4
- Heath A Kelly3
- 1 Queensland Paediatric Infectious Diseases Laboratory, Royal Children’s Hospital, Brisbane, QLD.
- 2 Clinical Medical Virology Centre, Sir Albert Sakzewski Virus Research Centre, University of Queensland, Brisbane, QLD.
- 3 Epidemiology Unit, Victorian Infectious Diseases Reference Laboratory, Melbourne, VIC.
- 4 PathWest Laboratory Medicine WA, Perth, WA.
- 5 Molecular Diagnostic Unit/Virology, Clinical and Statewide Services Division, Queensland Health, Brisbane, QLD.
- 6 Victorian Infectious Diseases Reference Laboratory, Melbourne, VIC.
Competing interests: None identified.
- 1. Kelly H, Grant K. Interim analysis of pandemic influenza (H1N1) 2009 in Australia: surveillance trends, age of infection and effectiveness of seasonal vaccination. Euro Surveill 2009; 14: pii:19288.
- 2. Blumer C, Roche P, Spencer J, et al. Australia’s notifiable diseases status, 2001: annual report of the National Notifiable Diseases Surveillance System. Commun Dis Intell 2003; 27: 1-78.
- 3. Watts C, Kelly H. Fragmentation of influenza surveillance in Australia. Commun Dis Intell 2002; 26: 8-13.
- 4. FluTracking.net [website]. http://www.flutracking.net/ (accessed May 2010).
- 5. Explore flu trends — Australia [website]. http://www.google.org/flutrends/intl/en_au/ (accessed May 2010).
- 6. Kwong JC, Maaten S, Upshur RE, et al. The effect of universal influenza immunization on antibiotic prescriptions: an ecological study. Clin Infect Dis 2009; 49: 750-756.
- 7. Chilcott T. Influenza cases high in Queensland. Courier Mail (Brisbane) 2008; 11 Sep. http://www.news.com.au/couriermail/story/0,23739,24326123-5003426,00.html (accessed May 2010).
- 8. Pirani C. Influenza season could be worst in years. The Australian 2007; 11 Aug. http://www.theaustralian.news.com.au/story/0,25197,22225113-23289,00.html (accessed Dec 2009).
- 9. Edwards M. Experts urge calm after influenza deaths. ABC News 2007; 14 Aug. http://www.abc.net.au/news/stories/2007/08/14/2004956.htm (accessed May 2010).
- 10. Boy, 4, dies from influenza. Brisbane Times 2007; 3 Aug. http://www.brisbanetimes.com.au/articles/2007/08/02/1185648049715.html (accessed May 2010).
- 11. Western Australia Department of Health. Childhood influenza frequently asked questions. http://www.public.health.wa.gov.au/2/687/2/questions_and_answers__childhood_flu_vaccine.pm (accessed May 2010).
- 12. Begg K, Roche P, Owen R, et al. Australia’s notifiable diseases status, 2006: annual report of the National Notifiable Diseases Surveillance System. Commun Dis Intell 2008; 32: 139-207.
- 13. Department of Health and Ageing. National Notifiable Diseases Surveillance System. Notifications for all diseases by state and territory and year. http://www9.health.gov.au/cda/Source/Rpt_2_sel.cfm (accessed May 2010).
- 14. Health Protection Agency (UK). HPA national influenza weekly reports 2008/09. http://www.hpa.org.uk/HPA/Topics/InfectiousDiseases/InfectionsAZ/1191942171484/ (accessed May 2010).
- 15. European Influenza Surveillance Network. Weekly influenza surveillance overview. http://www.ecdc.europa.eu/en/activities/surveillance/EISN/Pages/home.aspx (accessed May 2010).
- 16. Fireman B, Lee J, Lewis N, et al. Influenza vaccination and mortality: differentiating vaccine effects from bias. Am J Epidemiol 2009; 170: 650-656.
- 17. Druce J, Tran T, Kelly H, et al. Laboratory diagnosis and surveillance of human respiratory viruses by PCR in Victoria, Australia, 2002–2003. J Med Virol 2005; 75: 122-129.
- 18. Syrmis MW, Whiley DM, Thomas M, et al. A sensitive, specific, and cost-effective multiplex reverse transcriptase-PCR assay for the detection of seven common respiratory viruses in respiratory samples. J Mol Diagn 2004; 6: 125-131.
Abstract
Laboratory-confirmed influenza is a nationally notifiable disease in Australia. According to notification data, Queensland has experienced more severe influenza seasons than other states and territories. However, this method ignores available denominator data: the number of laboratory tests performed.
We propose that negative results of laboratory tests for influenza should be made notifiable, alongside laboratory-confirmed disease, and used to calculate the proportion of positive test results in real time.
Using data from the public health pathology services of three Australian states — Queensland Health laboratories, the Victorian Infectious Diseases Reference Laboratory and Western Australia’s PathWest — for 2004 to 2008, we show that incorporating laboratory-negative test data into national surveillance data would add to and improve our understanding of influenza epidemiology.