Featured Publications



Adam Akullian, Michelle Morrison, Geoffrey P Garnett, Zandile Mnisi, Nomthandazo Lukhele, Daniel Bridenbecker, Anna Bershteyn



The rapid scale-up of antiretroviral therapy (ART) towards the UNAIDS 90-90-90 goals over the last decade has sparked considerable debate as to whether universal test and treat can end the HIV-1 epidemic in sub-Saharan Africa. We aimed to develop a network transmission model, calibrated to capture age-specific and sex-specific gaps in the scale-up of ART, to estimate the historical and future effect of attaining and surpassing the UNAIDS 90-90-90 treatment targets on HIV-1 incidence and mortality, and to assess whether these interventions will be enough to achieve epidemic control (incidence of 1 infection per 1000 person-years) by 2030.
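
The arithmetic behind the 90-90-90 targets is worth making explicit: multiplying the three cascade steps gives the fraction of all people living with HIV who end up virally suppressed. A minimal illustration (our own sketch, not part of the study's transmission model):

```python
def cascade_coverage(diagnosed=0.9, on_art=0.9, suppressed=0.9):
    """Fraction of all people living with HIV who are virally suppressed
    under a 90-90-90-style care cascade (product of the three steps)."""
    return diagnosed * on_art * suppressed
```

So even full attainment of 90-90-90 corresponds to roughly 0.9 × 0.9 × 0.9 ≈ 73% of people living with HIV being virally suppressed, which is why the study also considers scenarios that surpass the targets.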


We used eSwatini (formerly Swaziland) as a case study to develop our model. We used data on HIV prevalence by 5-year age bins, sex, and year from the 2007 Swaziland Demographic Health Survey (SDHS), the 2011 Swaziland HIV Incidence Measurement Survey, and the 2016 Swaziland Population Health Impact Assessment (PHIA) survey. We estimated the point prevalence of ART coverage among all HIV-infected individuals by age, sex, and year. Age-specific data on the prevalence of male circumcision from the SDHS and PHIA surveys were used as model inputs for traditional male circumcision and scale-up of voluntary medical male circumcision (VMMC). We calibrated our model using publicly available data on demographics; HIV prevalence by 5-year age bins, sex, and year; and ART coverage by age, sex, and year. We modelled the effects of five scenarios (historical scale-up of ART and VMMC [status quo], no ART or VMMC, no ART, age-targeted 90-90-90, and 100% ART initiation) to quantify the contribution of ART scale-up to declines in HIV incidence and mortality in individuals aged 15–49 by 2016, 2030, and 2050.


Between 2010 and 2016, status-quo ART scale-up among adults (aged 15–49 years) in eSwatini (from 34·0% in 2010 to 74·1% in 2016) reduced HIV incidence by 43·57% (95% credible interval 39·71 to 46·36) and HIV mortality by 56·17% (54·06 to 58·92) among individuals aged 15–49 years, with larger reductions in incidence among men and mortality among women. If 2016 ART coverage levels by age and sex were held constant, adult HIV incidence by 2030 would fall to 1·09 (0·87 to 1·29) per 100 person-years overall: 1·42 (1·13 to 1·71) among women and 0·79 (0·63 to 0·94) among men. Achieving the 90-90-90 targets evenly by age and sex would further reduce incidence beyond status-quo ART, primarily among individuals aged 15–24 years (an additional 17·37% [7·33 to 26·12] reduction between 2016 and 2030), with only modest additional incidence reductions in adults aged 35–49 years (1·99% [–5·09 to 7·74]). Achieving 100% ART initiation among all people living with HIV within an average of 6 months from infection (an upper bound of plausible treatment effect) would reduce adult HIV incidence to 0·73 infections (0·55 to 0·92) per 100 person-years by 2030 and 0·46 (0·33 to 0·59) per 100 person-years by 2050.


Scale-up of ART over the last decade has already contributed to substantial reductions in HIV-1 incidence and mortality in eSwatini. Focused ART targeting would further reduce incidence, especially in younger individuals, but even the most aggressive treatment campaigns would be insufficient to end the epidemic in high-burden settings without a renewed focus on expanding preventive measures.


Global Good Fund and the Bill & Melinda Gates Foundation.


Vector control has been a key component in the fight against malaria for decades, and chemical insecticides are critical to the success of vector control programs worldwide. However, increasing resistance to insecticides threatens to undermine these efforts. Understanding the evolution and propagation of resistance is thus imperative to mitigating loss of intervention effectiveness. Additionally, accelerated research and development of new tools that can be deployed alongside existing vector control strategies is key to eradicating malaria in the near future. Methods such as gene drives, which aim to genetically modify large mosquito populations in the wild to either render them refractory to malaria or impair their reproduction, may prove invaluable tools. Mathematical models of gene flow in populations can offer valuable insight into the behavior and potential impact of gene drives as well as the spread of insecticide resistance in the wild. Here, we present the first multi-locus, agent-based model of vector genetics that accounts for mutations and many-to-many mappings of genotypes to phenotypes to investigate gene flow and the propagation of gene drives in Anopheline populations. This model is embedded within a large-scale individual-based model of malaria transmission representative of a high-burden, high-transmission setting characteristic of the Sahel. Results are presented for the selection of insecticide-resistant vectors and the spread of resistance through repeated deployment of insecticide-treated nets (ITNs), in addition to scenarios where gene drives act in concert with existing vector control tools such as ITNs. We also explore the roles of seasonality, the spatial distribution of vector habitat and feed sites, and existing vector control in propagating alleles that, via gene drives, confer phenotypic traits resulting in reduced transmission.
The ability to model a spectrum of vector species with different genotypes and phenotypes in the context of malaria transmission allows us to test deployment strategies for existing interventions that reduce the deleterious effects of resistance and allows exploration of the impact of new tools being proposed or developed.
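
The gene-drive dynamics the model investigates can be illustrated with a far simpler deterministic recursion for the drive allele frequency under random mating. This sketch is purely illustrative (the function, its parameters, and the single-locus fitness assumptions are ours, not the paper's agent-based model):

```python
def drive_frequency_trajectory(q0, homing=0.9, cost=0.2, generations=20):
    """Deterministic single-locus recursion for a homing gene drive.

    Illustrative assumptions: random mating; heterozygotes transmit the
    drive allele at rate (1 + homing) / 2 because homing converts the
    wild-type allele; drive homozygotes pay a relative fitness cost.
    """
    q = q0                      # drive allele frequency
    trajectory = [q]
    for _ in range(generations):
        p = 1.0 - q
        w_dd, w_dw, w_ww = 1.0 - cost, 1.0, 1.0   # genotype fitnesses
        f_dd = q * q * w_dd                       # drive homozygotes
        f_dw = 2.0 * p * q * w_dw                 # heterozygotes
        f_ww = p * p * w_ww                       # wild-type homozygotes
        mean_w = f_dd + f_dw + f_ww
        # next-generation drive frequency, with super-Mendelian inheritance
        q = (f_dd + f_dw * (1.0 + homing) / 2.0) / mean_w
        trajectory.append(q)
    return trajectory
```

With these illustrative parameters the drive spreads toward fixation even from a low release frequency; the agent-based model in the paper layers mutation, multiple loci, and vector ecology on top of dynamics of this kind.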


Since the prequalification of the typhoid conjugate vaccine (TCV) by the WHO and the subsequent position paper published in 2018, strategies for roll-out of the vaccine have been under discussion [1]. The 2018 position paper recommends that the introduction of TCV be prioritized in countries with the highest burden of typhoid disease or a high burden of antimicrobial-resistant S. Typhi [1]. The paper further suggests that “Decisions on the age of TCV administration, target population and delivery strategy for routine and catch-up vaccination should be based on the local epidemiology of typhoid fever…”. However, the local epidemiology of typhoid fever is often poorly documented, owing to the paucity of diagnostic facilities in many high-incidence settings. Most low- and middle-income countries (LMICs) rely on ad hoc reporting of typhoid fever, and very few have data from more than one city in the country. There have been substantial efforts aimed at strengthening blood culture surveillance for typhoid fever in Africa [2], yet there are still only 13 sentinel sites in 10 countries; a similar initiative in Asia covers only four countries [3]. Data sets that are utilized to estimate global burden are therefore limited by the lack of surveillance [4][5][6][7]. Given the prohibitive costs [2] and effort required to strengthen blood culture surveillance in LMICs, expansion of these efforts to capture both national and sub-national trends of typhoid on a global scale is not likely on a time scale relevant to vaccine roll-out.

Incidence mapping using statistical models can aid in predicting incidence in areas without surveillance, using spatial covariates relevant to risk of disease, and has been used for diseases such as malaria [8]. This approach has been attempted for typhoid through global burden models [4], but out-of-sample validation, though accurate in some areas, was not reliable, indicating a lack of useful indicators that can be consistently used to predict typhoid incidence. Further, the current breadth of data is heavily biased by reporting from a handful of well-funded sites, so predicting sub-national incidence across large regions is a challenge. A country’s ability to roll out TCV in accordance with the WHO’s position paper is therefore hindered by a lack of knowledge of local epidemiology of the disease. Additionally, Gavi, The Vaccine Alliance, recommends that countries requesting TCV funding should submit epidemiological data from within-country whenever possible, though this is not strictly a requirement.

Alternative tools are needed for planning TCV strategies in the absence of blood culture surveillance. Of particular interest is environmental surveillance (ES), in which, instead of relying on clinical detection of the disease, catchments in the environment such as water or sewage systems are surveilled. This approach has been used successfully in the polio eradication campaign [9]. Though case-based surveillance for polio is widespread, the disease is known to undergo sub-clinical (silent) transmission. ES has enabled detection when there is no known outbreak and has been demonstrated to be a useful tool in program decision making [10][11]. Since typhoid and polio share similarities with regard to transmission routes and sub-clinical disease, it is possible that the approach and the network of laboratories developed for polio could be adapted for typhoid.

There remain significant technical challenges to implementing typhoid environmental surveillance (ES): optimal sampling strategies and detection methods, and their reliability as indicators of ongoing transmission, remain unclear. Historically, Moore swabs have been used to isolate S. Typhi from sewage [12][13]; however, present-day ES initiatives have focused more on molecular approaches, specifically polymerase chain reaction (PCR)-based detection of S. Typhi [14][15][16].

Economic analyses have largely supported the cost-effectiveness of the roll-out of TCV in high- and medium-incidence areas, particularly when routine vaccination strategies are paired with catch-up campaigns [17][18]; however, there is more uncertainty around cost-effectiveness in low-incidence areas [19]. In this study, we examine the use of a hypothetical environmental surveillance program as a method for quickly gathering evidence on which an introduction decision can be based. This is especially relevant in places with inadequate burden estimates, or in which a national introduction may not be affordable due to funding constraints or competing priorities. Specifically, we evaluate the value of environmental sampling as a means of detecting circulating typhoid in order to guide local or national targeting of catch-up vaccination campaigns. We aim to determine the most cost-effective sampling and roll-out strategies, given the limited information and substantial uncertainty about the true underlying prevalence of typhoid.

Michelle A Bulterys, Bradley Wagner, Mael Redard-Jacot, Anita Suresh, Nira R. Pollock, Emmanuel Moreau, Claudia M. Denkinger, Paul K. Drain, Tobias Broger 


Most diagnostic tests for tuberculosis (TB) rely on sputum samples, which are difficult to obtain and have low sensitivity in immunocompromised patients, patients with disseminated TB, and children, delaying treatment initiation. The World Health Organization (WHO) calls for the development of a rapid, biomarker-based, non-sputum test capable of detecting all forms of TB at the point of care to enable immediate treatment initiation. Lipoarabinomannan (LAM) is the only WHO-endorsed TB biomarker that can be detected in urine, an easily collected sample. This status update discusses the characteristics of LAM as a biomarker, describes the performance of first-generation urine LAM tests and the reasons for their slow uptake, and presents considerations for developing the next generation of more sensitive and impactful tests. Next-generation urine LAM tests have the potential to reach adult and pediatric patients regardless of HIV status or site of infection and to facilitate global TB control. Implementation and scale-up of existing LAM tests and development of next-generation assays should be prioritized.



Ambitious global goals have been established to provide universal access to affordable modern contraceptive methods. To measure progress toward such goals in populous countries like Nigeria, it is essential to characterize the current levels and trends of family planning (FP) indicators such as unmet need and the modern contraceptive prevalence rate (mCPR). Moreover, the substantial heterogeneity across Nigeria and the scale of programmatic implementation require a sub-national resolution of these FP indicators. The aim of this study is to estimate the levels and trends of FP indicators at a subnational scale in Nigeria, utilizing all available data and accounting for survey design and uncertainty.


We utilized all available cross-sectional survey data from Nigeria, including the Demographic and Health Surveys, Multiple Indicator Cluster Surveys, National Nutrition and Health Surveys, and Performance Monitoring and Accountability 2020 (PMA2020) surveys. We developed a hierarchical Bayesian model that incorporates the individual-level data from each survey instrument, accounts for survey uncertainty, leverages spatio-temporal smoothing, and produces probabilistic estimates with uncertainty intervals.
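
The partial pooling such a hierarchical model performs can be illustrated in miniature: noisy state-level proportions are shrunk toward the national mean, most strongly where samples are small. This empirical-Bayes sketch with a beta-binomial prior is our own simplification, not the paper's model, and `prior_strength` is an illustrative parameter:

```python
import numpy as np

def partial_pool(successes, trials, prior_strength=50.0):
    """Empirical-Bayes partial pooling of state-level proportions.

    Each state's estimate is shrunk toward the national mean via a
    Beta(a, b) prior centred on that mean; shrinkage is strongest
    where the state's sample is small. `prior_strength` is a + b,
    expressed in pseudo-observations.
    """
    successes = np.asarray(successes, dtype=float)
    trials = np.asarray(trials, dtype=float)
    national = successes.sum() / trials.sum()
    a = national * prior_strength
    b = (1.0 - national) * prior_strength
    # posterior mean of each state's proportion under the shared prior
    return (successes + a) / (trials + a + b)
```

A state reporting 5/20 users is pulled well toward the national mean, while one reporting 100/1000 barely moves; the paper's full model additionally smooths over space and time.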


We estimate that overall rates and trends of mCPR and unmet need have remained low in Nigeria: the average annual rate of change in mCPR by state is 0.5% (0.4%, 0.6%) from 2012 to 2017. Unmet need by age-parity demographic group varied significantly across Nigeria; parous women express much higher rates of unmet need than nulliparous women.


Understanding the estimates and trends of FP indicators at a subnational resolution in Nigeria is integral to inform programmatic decision-making. We identify age-parity-state subgroups with large rates of unmet need. We also find conflicting trends by survey instrument across a number of states. Our model-based estimates highlight these inconsistencies, attempt to reconcile the direct survey estimates, and provide uncertainty intervals to enable interpretation of model and survey estimates for decision-making.

Alain Vandormael, Adam Akullian, Mark Siedner, Tulio de Oliveira, Till Bärnighausen and Frank Tanser


Over the past decade, there has been a massive scale-up of primary and secondary prevention services to reduce the population-wide incidence of HIV. However, the impact of these services on HIV incidence has not been demonstrated using a prospectively followed, population-based cohort from South Africa—the country with the world’s highest rate of new infections. To quantify HIV incidence trends in a hyperendemic population, we tested a cohort of 22,239 uninfected participants over 92,877 person-years of observation. We report a 43% decline in the overall incidence rate between 2012 and 2017, from 4.0 to 2.3 seroconversion events per 100 person-years. Men experienced an earlier and larger incidence decline than women (59% vs. 37% reduction), which is consistent with male circumcision scale-up and higher levels of female antiretroviral therapy coverage. Additional efforts are needed to get more men onto consistent, suppressive treatment so that new HIV infections can be reduced among women.
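
The incidence figures reported here follow directly from event counts and person-time at risk. A minimal sketch (function names are ours):

```python
def incidence_per_100py(events, person_years):
    """Incidence rate per 100 person-years of observation."""
    return 100.0 * events / person_years

def percent_decline(rate_start, rate_end):
    """Relative decline between two rates, in percent."""
    return 100.0 * (1.0 - rate_end / rate_start)
```

For example, `percent_decline(4.0, 2.3)` gives about 42.5%, consistent with the reported 43% decline once the underlying rates are rounded.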

Emmanuel P. Mwanga, Elihaika G. Minja, Emmanuel Mrimi, Mario González Jiménez, Johnson K. Swai, Said Abbasi, Halfan S. Ngowo, Doreen J. Siria, Salum Mapua, Caleb Stica, Marta F. Maia, Ally Olotu, Maggy T. Sikulu-Lord, Francesco Baldini, Heather M. Ferguson, Klaas Wynne, Prashanth Selvaraj, Simon A. Babayan & Fredros O. Okumu



Epidemiological surveys of malaria currently rely on microscopy, polymerase chain reaction (PCR) assays, or rapid diagnostic tests (RDTs) to detect Plasmodium infections. This study investigated whether mid-infrared (MIR) spectroscopy coupled with supervised machine learning could constitute an alternative method for rapid malaria screening, directly from dried human blood spots.


Filter papers containing dried blood spots (DBS) were obtained from a cross-sectional malaria survey in 12 wards in southeastern Tanzania in 2018/19. The DBS were scanned using an attenuated total reflection-Fourier transform infrared (ATR-FTIR) spectrometer to obtain high-resolution MIR spectra in the range 4000 cm−1 to 500 cm−1. The spectra were cleaned to compensate for atmospheric water vapour and CO2 interference bands and used to train different classification algorithms to distinguish between malaria-positive and malaria-negative DBS papers, with PCR test results as the reference. The analysis considered 296 individuals, including 123 PCR-confirmed malaria positives and 173 negatives. Model training was done using 80% of the dataset, after which the best-fitting model was optimized by bootstrapping of 80/20 train/test-stratified splits. The trained models were evaluated by predicting Plasmodium falciparum positivity in the 20% validation set of DBS.


Logistic regression was the best-performing model. With PCR as the reference, the models attained overall accuracies of 92% for predicting P. falciparum infections (specificity = 91.7%; sensitivity = 92.8%) and 85% for predicting mixed infections of P. falciparum and Plasmodium ovale (specificity = 85%, sensitivity = 85%) in the field-collected specimens.
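
Sensitivity and specificity of the kind reported above follow directly from a confusion matrix against the PCR reference. A minimal sketch (our own illustration, not the study's evaluation code):

```python
def screening_metrics(y_true, y_pred):
    """Sensitivity, specificity, and accuracy from binary predictions,
    with the reference labels (1 = malaria-positive by PCR) in y_true."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "sensitivity": tp / (tp + fn),          # true-positive rate
        "specificity": tn / (tn + fp),          # true-negative rate
        "accuracy": (tp + tn) / len(y_true),
    }
```

Reporting sensitivity and specificity alongside accuracy matters here because the validation set is class-imbalanced (123 positives vs 173 negatives), so accuracy alone could mask poor detection of one class.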


These results demonstrate that mid-infrared spectroscopy coupled with supervised machine learning (MIR-ML) could be used to screen for malaria parasites in human DBS. The approach has potential for rapid, high-throughput screening of Plasmodium in both non-clinical settings (e.g., field surveys) and clinical settings (diagnosis to aid case management). Before the approach can be deployed, however, additional field validation is needed in other study sites with different parasite populations, along with in-depth evaluation of the biological basis of the MIR signals. Improving the classification algorithms and training models on larger datasets could further improve specificity and sensitivity. The MIR-ML spectroscopy system is physically robust, low-cost, and requires minimal maintenance.


The compartmental modeling software (CMS) is an open-source computational framework that can simulate discrete, stochastic reaction models, which are often used to describe complex systems in epidemiology and systems biology. In this article, we report the computational requirements, the novel input model language, the available numerical solvers, and the output file format of CMS. In addition, the CMS code repository includes a library of example model files, unit and regression tests, and documentation. Two examples, one from systems biology and the other from computational epidemiology, are included to highlight the functionality of CMS. We believe that computational frameworks such as CMS will advance our scientific understanding of complex systems and encourage collaborative efforts for code development and knowledge sharing.
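
Discrete stochastic reaction models of the kind CMS simulates are commonly solved with Gillespie's stochastic simulation algorithm. A minimal SIR example in plain Python (this is an illustrative sketch of the algorithm class, not CMS's model language or solver API):

```python
import random

def gillespie_sir(s, i, r, beta, gamma, t_max, seed=1):
    """Gillespie stochastic simulation of a two-reaction SIR model:
    infection (S + I -> 2I) and recovery (I -> R)."""
    rng = random.Random(seed)
    t = 0.0
    n = s + i + r
    while t < t_max and i > 0:
        rate_inf = beta * s * i / n   # propensity of S + I -> 2I
        rate_rec = gamma * i          # propensity of I -> R
        total = rate_inf + rate_rec
        t += rng.expovariate(total)   # exponential waiting time to next event
        # pick which reaction fires, proportional to its propensity
        if rng.random() * total < rate_inf:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return s, i, r
```

Each iteration draws an exponentially distributed waiting time from the total propensity, then selects a reaction in proportion to its rate, so population counts change by whole individuals rather than continuously as in an ODE model.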


This paper is concerned with the design of an augmented state feedback controller for finite-dimensional linear systems with nonlinear observation dynamics. Most theoretical results in the area of (optimal) feedback design are based on the assumption that the state is available for measurement. In this paper, we focus on finding a feedback control that avoids state trajectories with undesirable observability properties. In particular, we introduce an optimal control problem that explicitly incorporates an index of observability into the control synthesis. The resulting cost functional is a combination of LQR-like quadratic terms and an index of observability. The main contribution of the paper is a control synthesis procedure that, on the one hand, provides closed-loop asymptotic stability and, on the other, addresses the observability of the system as a transient performance criterion.
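
Schematically, a cost functional of the kind described, combining LQR-like quadratic terms with an index of observability, might be written as follows (the notation is illustrative, not the paper's):

```latex
J(u) \;=\; \int_{0}^{\infty} \left( x^{\top} Q x \;+\; u^{\top} R u \;-\; \alpha\, \mathcal{O}(x) \right) dt
```

where \(Q \succeq 0\) and \(R \succ 0\) are the usual LQR state and input weights, \(\mathcal{O}(x)\) is an index of observability along the trajectory (for instance, a measure derived from an observability Gramian), and \(\alpha > 0\) trades off regulation performance against maintaining observable trajectories.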



While bed nets and insecticide spraying have had significant impact on malaria burden in many endemic regions, outdoor vector feeding and insecticide resistance may ultimately limit their contribution to elimination and control campaigns. Complementary vector control methods such as endectocides or systemic insecticides, where humans or animals are treated with drugs that kill mosquitoes upon ingestion via blood meal, are therefore generating much interest. This work explores the conditions under which long-lasting systemic insecticides would have a substantial impact on transmission and burden.


Hypothetical long-lasting systemic insecticides with effective durations ranging from 14 to 90 days are simulated using an individual-based mathematical model of malaria transmission. The impact of systemic insecticides when used to complement existing vector control and drug campaigns is evaluated in three settings—a highly seasonal high-transmission setting, a near-elimination setting with seasonal travel to a high-risk area, and a near-elimination setting in southern Africa.


At 60% coverage, a single round of long-lasting systemic insecticide with effective duration of at least 60 days, distributed at the start of the season alongside a seasonal malaria chemoprevention campaign in a high-transmission setting, results in further burden reduction of 30–90% depending on the sub-populations targeted. In a near-elimination setting where transmission is sustained by seasonal travel to a high-risk area, targeting high-risk travellers with systemic insecticide with effective duration of at least 30 days can result in likely elimination even if intervention coverage is as low as 50%. In near-elimination settings with robust vector control, the addition of a 14-day systemic insecticide alongside an anti-malarial in mass drug administration (MDA) campaigns can decrease the necessary MDA coverage from about 85% to the more easily achievable 65%.


While further research into the safety profile of systemic insecticides is necessary before deployment, models predict that long-lasting systemic insecticides can play a critical role in reducing burden or eliminating malaria in a range of contexts with different target populations, existing malaria control methods, and transmission intensities. Continued investment in lengthening the duration of systemic insecticides and improving their safety profile is needed for this intervention to achieve its fullest potential.