Looking beyond the numbers: quality assurance procedures in the Global Network for Women’s and Children’s Health Research Maternal Newborn Health Registry

Abstract

Background

Quality assurance (QA) is a process that should be an integral part of research to protect the rights and safety of study participants and to reduce the likelihood that the results are affected by bias in data collection. Most QA plans include processes related to study preparation and regulatory compliance, data collection, data analysis and publication of study results. However, little detailed information is available on the specific procedures associated with QA processes to ensure high-quality data in multi-site studies.

Methods

The Global Network for Women’s and Children’s Health Research Maternal Newborn Health Registry (MNHR) is a prospective, population-based registry of pregnancies and deliveries carried out at eight international sites. Since its inception, QA procedures have been used to ensure the quality of the data. More recently, a training and certification process was developed to ensure that standardized, scientifically accurate clinical definitions are used consistently across sites. Staff complete a web-based training module that reviews the MNHR study protocol, study forms and clinical definitions developed by MNHR investigators, and are certified through a multiple-choice examination prior to initiating study activities and every six months thereafter. A standardized procedure for supervision and evaluation of field staff is carried out to ensure that research activities are conducted according to the protocol across all MNHR sites.

Conclusions

We developed standardized QA processes for training, certification and supervision of the MNHR, a multisite research registry. It is expected that these activities, together with ongoing QA processes, will help to further optimize data quality for this protocol.

Plain English summary

All research studies should have quality assurance, as this protects the rights and safety of study participants. It also improves the quality of data collection and the likelihood of obtaining true results. Most quality assurance plans cover a variety of topics, including preparation prior to the study, ethics issues, analysis of data and publication of study results. There is limited information on the procedures that different studies use to ensure high-quality data.

The procedures that we report here were carried out as part of the Global Network for Women’s and Children’s Health Research Maternal Newborn Health Registry (MNHR), a registry of pregnancies and deliveries conducted at eight international sites. The MNHR has used quality assurance procedures since its beginning to ensure the quality of the data. More recently, we developed a training and certification process to ensure that the same clinical definitions are used across these eight sites. The study staff are trained on these definitions with a web-based training module which covers the study protocol, study forms and clinical definitions. They are then certified before initiating study activities and every six months thereafter. The MNHR also carries out a procedure for supervision and evaluation of field staff to ensure that research activities are conducted according to the protocol across all sites. We expect that all of these activities will help to further optimize data quality for this protocol.

Background

Quality assurance (QA) is a process that should be carried out throughout all phases of research to protect the rights and safety of study participants, to improve consistency in data and to reduce the likelihood that trial results are affected by bias [1, 2]. QA seeks to ensure that studies comply with research standards, to detect problems early through routine monitoring and to correct issues through prompt and effective action [3]. Such processes should be considered a standard part of all research activities.

Most QA plans include processes related to study preparation and regulatory compliance, data collection, data analyses and publication of study results [1]. Additionally, multi-site studies generally include common variables, data collection methodologies and standardized protocols. However, although there is consensus on the importance of data quality for research, little detailed information is available on the specific procedures and best practices for QA processes [4]. One research study from India recently published a data quality assurance protocol, which focused on tools to ensure the accuracy, reliability, timeliness, completeness, precision, and integrity of the data [5]. The investigators found that the tools helped increase accuracy of data collection throughout the research project. With the increasing global emphasis on harmonization and data sharing in research, ensuring not only high quality of data but also comparability of data across diverse settings is critical to accurate interpretation of findings [6, 7].

The Maternal Newborn Health Registry (MNHR) is a prospective, population-based registry of pregnancies and deliveries conducted under the auspices of the Global Network for Women’s and Children’s Health Research (GN), a multi-country research network. The MNHR enrolls approximately 60,000 pregnant women each year and follows them from early pregnancy through the postnatal period [8, 9]. Briefly, its primary purpose is to quantify and analyze trends in pregnancy outcomes over time across the GN research sites. It also serves as a data collection tool for capturing pregnancies and perinatal and neonatal outcomes for individual studies [10, 11] and provides data to plan future GN studies. Additionally, MNHR data are frequently provided to local health officials who use them to inform and improve clinical care.

Because the MNHR operates in multiple sites in diverse low and middle-income countries (LMIC) and collects sensitive data on a large scale, QA has been particularly integral to ensuring quality data collection since the Registry’s inception in 2008. At the start of the MNHR, the investigators determined the critical data to collect, developed common definitions based on the WHO criteria, and also defined common methods to collect the data. In addition, the MNHR introduced a process of ongoing data quality monitoring, including metrics to assess missingness and accuracy, which was conducted both with local research teams and centrally, with a process for rapid feedback and resolution of data issues (Table 1) [12].

Table 1 Elements of the quality assurance plan for the Global Network’s Maternal and Newborn Health Registry

In 2017, in an effort to continuously improve the quality of data in the MNHR and to further ensure comparability across diverse sites, we identified a need to standardize the clinical data collected by field staff across all sites. To address this gap, we developed additional procedures for training, certification and supervision of all staff within the MNHR. In this paper, we describe the development of tools to support the standardization and enhanced QA of data collection, including a web-based standardized training and certification procedure and an evaluation tool for consistent supervision of field staff at all GN sites. Additionally, we describe our proposed approach to evaluating the feasibility and effectiveness of these newly implemented procedures.

Methods

Setting and background

The GN MNHR was established in research sites based in Argentina, Guatemala, India (two sites), Kenya, Pakistan and Zambia [5]. A site in the Democratic Republic of the Congo (DRC) replaced the site in Argentina in 2013, and a Bangladeshi site was added in 2019. Investigators from each participating site and its partner United States-based institution, the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) and the Data Coordinating Center (DCC) based at RTI International comprise the GN MNHR subcommittee that oversees all aspects of protocol design, study implementation, data analyses and publications. The GN MNHR is conducted under the auspices of the GN, a multi-disciplinary research network supported by research grants from the NICHD.

For each site, the Principal Investigator (PI; based at a United States university) and the Senior Foreign Investigator (SFI; based at an international research site) are collectively responsible for ensuring the overall quality of the site’s data. A country coordinator (CC) provides study oversight in the field; one or more study supervisors train and supervise registry administrators (RAs). RAs have a variety of backgrounds and various types and levels of professional training (e.g., midwives, community health workers, physicians, nurses, medical technologists, health assistants, and social workers); the minimum requirement is experience at the community health worker level. They work with a wide range of public and private providers and collect data in multiple settings, including participants’ homes and health institutions (Fig. 1). The RAs are paid study staff, and participation in ongoing training is considered part of their research responsibilities.

Fig. 1 Maternal Newborn Health Registry study organization at site

QA procedures in place since the inception of the MNHR

Study preparation and regulatory compliance

In preparation for implementation of the MNHR, each of the GN study sites obtained approval of the study protocol, consent and case report forms (CRFs) from their respective Institutional Review Boards, both in the US and in-country. All research staff were certified in the Protection of Human Subjects and Good Clinical Practices. In accordance with NICHD policy, the study protocol, manual of procedures (MOP) and CRFs are publicly available through the GN website (https://gn.rti.org/); de-identified study data are available for secondary analyses through the NICHD Data and Specimen Hub (N-DASH) (https://dash.nichd.nih.gov/).

Data collection

Standard operating procedures for data collection are detailed in the MNHR MOP which has been in place since study inception; it is reviewed annually and updated as needed. A question by question (QxQ) document defines each study question and is updated as needed; a policy document includes technical memos that describe new study procedures and outlines the addition or elimination of variables.

Data processing, analysis and publication

All MNHR data are entered locally into a computer-based data management system (DMS) that incorporates inter- and intra-form data quality checks. The DCC produces monthly monitoring reports which detail site- and cluster-specific metrics to identify issues related to completeness of data collection and to review changes in outcomes and measure quality improvements over time. Specifically, the reports focus on critical data, as defined by the central working group, to ensure the completeness and accuracy of those data elements. Each site reviews these monthly monitoring reports and participates in quarterly monitoring calls [8]. Study data are reviewed annually by a Data Monitoring Committee. A publication management system outlines specific procedures for publication of study results.
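
To make the kind of completeness check used in such monitoring reports concrete, the sketch below computes the percentage of missing values for a small set of critical variables by site and cluster. This is a minimal illustration only: the variable names, column layout and flagging threshold are assumptions for this example, not the actual MNHR DMS or DCC reporting code.

```python
# Minimal sketch (not the MNHR DMS or DCC code) of a site/cluster completeness report.
# Variable names, column layout and the 5% flag threshold are illustrative assumptions.
import pandas as pd

CRITICAL_VARS = ["birth_weight", "delivery_date", "delivery_mode", "birth_outcome"]  # hypothetical

def completeness_report(records: pd.DataFrame, threshold: float = 0.05) -> pd.DataFrame:
    """Return percent missing for each critical variable, by site and cluster."""
    missing = (
        records.groupby(["site", "cluster"])[CRITICAL_VARS]
        .apply(lambda g: g.isna().mean())  # fraction missing per variable within the group
        .mul(100)
        .round(1)
    )
    # Flag any site/cluster whose missingness exceeds the (assumed) threshold for any variable.
    missing["flagged"] = (missing[CRITICAL_VARS] > threshold * 100).any(axis=1)
    return missing

# Toy example:
records = pd.DataFrame({
    "site": ["A", "A", "B", "B"],
    "cluster": [1, 1, 2, 2],
    "birth_weight": [3100, None, 2900, 3300],
    "delivery_date": ["2019-01-02", "2019-01-05", None, "2019-02-01"],
    "delivery_mode": ["vaginal", "cesarean", "vaginal", None],
    "birth_outcome": ["live birth"] * 4,
})
print(completeness_report(records))
```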

Recently implemented QA procedures

In an effort to continuously improve the quality of our data collection, we developed a standardized training and certification process for RAs as well as standardized supervisory procedures.

Training and certification

To improve data collection, including the use of standardized clinical definitions, we developed a web-based, interactive training module for MNHR RAs.

Development and pilot evaluation

Prior to developing the module, we established standardized definitions for all clinical data collected in the MNHR through an iterative process. First, we reviewed the MNHR CRFs and identified all clinical data fields (Table 2). We then conducted extensive searches for definitions of these terms using a variety of sources, including the WHO and United Nations Population Fund websites as well as technical documents, clinical textbooks and peer-reviewed journals. Based on these sources, we compiled draft definitions for the MNHR that were reviewed for scientific accuracy by the GN MNHR subcommittee and study site investigators, including obstetric and pediatric specialists in the GN. The subcommittee further refined these definitions during an in-person discussion to ensure they reflected the diagnostic capability of current healthcare at the MNHR research sites. For example, the final MNHR definition for malaria does not require confirmatory laboratory testing, as GN sites with high malaria prevalence diagnose the disease clinically without routinely performing confirmatory laboratory testing. Similarly, the definition of neonatal sepsis does not require confirmation with blood culture. Lastly, a medical editor reviewed the final draft definitions, also ensuring the literacy level was appropriate for the level of medical training required of MNHR RAs. As the next step, the modules were pilot tested with a sample of learners from the sites, who provided feedback on the definitions as well as the use of pictorial images.

Table 2 Clinical processes of care and health outcomes collected on Maternal and Newborn Health Registry forms, 2019

Implementation of web-based QA

Using these standardized clinical definitions, the study protocol and the MOP, the Instituto de Nutrición de Centro América y Panamá (INCAP) developed a web-based training module with Storyline 360 (https://360.articulate.com) software, which supports the development of interactive courses for all types of computers and devices. The training module was designed using andragogic (adult learning) principles, facilitating knowledge acquisition by linking new concepts to previous experiences and prior knowledge [13]. The course contextualizes the learning process to the MNHR setting, so that learners can establish an immediate link between theory (such as clinical definitions) and its practical application. The module covers the study objectives, protocol and CRFs, including instructions for use and standardized definitions for all clinical data fields.

This web-based module is available to all sites through the GN webpage and can be accessed on study computers or tablets as well as in an off-line mode. The module engages learners through interactive information and communication technology. It was developed in English and translated into Spanish and French, with subtitles appearing throughout the course; if additional languages are required, local staff provide translations. Clinical definitions are communicated through multiple media, using text, audio and an image illustrating each concept. Brief quizzes, which require matching the definition to the concept after review of every three clinical terms, are interspersed throughout the module to help learners assess their comprehension. Finally, the course highlights the learner’s achievements as he/she completes each section. Technical assistance to complete the modules is provided centrally by RTI staff as needed.

Following the training module, RAs complete a multiple-choice certification exam (available to all sites through the GN webpage or via an electronic copy on a storage device). Initial certification and recertification require a minimum score of 80% on this exam. If an RA does not attain this score, he/she receives additional training from the country coordinator and repeats the training module before re-taking the exam. All RAs recertify by obtaining a passing grade on the certification exam every six months. Additionally, RAs have access to the module for additional training whenever necessary. Altogether, the training and certification process takes approximately 8–10 h for each staff member to complete.
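
As a simple illustration of the certification rules stated above (a minimum score of 80% and recertification every six months), the sketch below checks whether an RA’s most recent exam attempt keeps them certified. The record layout, helper names and the exact length of the six-month window are assumptions; this is not the GN or DCC tracking system.

```python
# Illustrative sketch of the certification rules described above: an RA is considered
# certified if the most recent exam score is at least 80% and the exam was taken within
# the last six months. Record layout and helper names are assumptions, not the actual
# GN/DCC certification log.
from dataclasses import dataclass
from datetime import date, timedelta

PASS_SCORE = 80.0                      # minimum percent score stated in the protocol
RECERT_WINDOW = timedelta(days=183)    # ~six months; the exact window is an assumption

@dataclass
class ExamAttempt:
    ra_id: str
    exam_date: date
    score_percent: float

def is_certified(attempts: list, today: date) -> bool:
    """True if the RA's latest attempt passed and falls within the recertification window."""
    if not attempts:
        return False
    latest = max(attempts, key=lambda a: a.exam_date)
    return latest.score_percent >= PASS_SCORE and (today - latest.exam_date) <= RECERT_WINDOW

# Example: a passing score in January keeps the RA certified in June but not in August.
attempts = [ExamAttempt("RA-01", date(2019, 1, 15), 85.0)]
print(is_certified(attempts, today=date(2019, 6, 1)))   # True
print(is_certified(attempts, today=date(2019, 8, 1)))   # False
```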

Supervisory visits

To improve data collection and study implementation, the subcommittee also standardized supervisory procedures of RAs across sites.

First, each site submitted a description of its site-specific supervisory evaluation process, including any corresponding forms. Based on these descriptions and the recommendations for implementation of QA processes, the subcommittee developed a standardized supervisory procedure with a corresponding CRF for use by all sites in the MNHR. This CRF assesses the general activities of the RA, communication with other health providers and a field visit, including a key variable check. Country coordinators piloted the supervisory process and forms in each site and informally found the proposed procedures and frequency of supervision to be feasible. They also gave feedback on the data selected for variable checks, such as using maternal height instead of weight, since the latter can fluctuate between when an RA records the data and when a supervisor confirms it. Final variables were selected based on their likelihood of being easily recalled by the mother and readily obtained during a supervisory home visit, and included items such as maternal height, delivery location and mode of delivery.

The final, standardized supervisory procedure occurs as follows:

Annually, all RAs undergo two supervisory visits, each consisting of three parts that are recorded on a study form:

1. The supervisor arrives in the community where the RA is scheduled to work and corroborates that he/she is prepared with consents, study forms and equipment.

2. The supervisor visits community-level or ministry of health staff to corroborate that the RA interfaces regularly and appropriately with them.

3. The supervisor visits two randomly selected MNHR participants to corroborate the data recorded in the DMS for specific key variables.

Supervisors review the findings of these supervisory visits with each RA. If necessary, RAs receive additional training, targeting non-compliance with study procedures or errors in data collection, through the training module or direct coaching by supervisors.
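
The participant-level corroboration step can be illustrated with a short sketch that randomly selects two participants for a given RA and compares participant-reported values with the values held in the DMS for a few key variables. The variable names and record structures here are illustrative assumptions rather than the actual MNHR CRF fields.

```python
# Sketch of the participant-level corroboration step: randomly select two enrolled
# participants and compare participant-reported values against the values held in
# the DMS for a set of key variables. Variable names and record structures are
# illustrative assumptions, not the MNHR CRF fields.
import random

KEY_VARIABLES = ["maternal_height_cm", "delivery_location", "delivery_mode"]  # assumed examples

def select_for_supervision(participant_ids, n=2, seed=None):
    """Randomly select up to n participants for a supervisory home visit."""
    rng = random.Random(seed)
    return rng.sample(participant_ids, k=min(n, len(participant_ids)))

def congruency(dms_record, reported):
    """Per-variable agreement between the DMS entry and the participant's report."""
    return {var: dms_record.get(var) == reported.get(var) for var in KEY_VARIABLES}

# Example:
ids = ["P001", "P002", "P003", "P004"]
print(select_for_supervision(ids, seed=42))
dms = {"maternal_height_cm": 152, "delivery_location": "facility", "delivery_mode": "vaginal"}
reported = {"maternal_height_cm": 152, "delivery_location": "home", "delivery_mode": "vaginal"}
print(congruency(dms, reported))  # delivery_location disagrees and would be queried
```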

Evaluation plan of recently implemented QA procedures

Country coordinators ensure that all RAs complete training and certification procedures and report on these processes to the DCC, which keeps a log of these data. Data collected through the supervisory process are entered into the DMS and analyzed by the DCC. Results of both are discussed on periodic site calls.

To evaluate the impact of these QA methods, the subcommittee will review the training and certification procedure for feasibility (such as the number of RAs trained and the average time for course completion) and effectiveness (such as certification exam scores and the first-time pass rate). Through data collected during the supervisory procedure, we will evaluate RA fidelity to the study protocol and the quality of the data collected (such as the percent of data elements in the DMS that are congruent with participant-reported data for each key variable). We will track the congruency of all key variables on the CRF over time to determine whether our QA training is successful at improving the quality of the data. Additionally, the central QA team reviews the content and updates it, as needed, on a bi-annual basis.
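
Two of the metrics named above, the first-time pass rate on the certification exam and the percent congruency per key variable over time, could be computed along the lines of the following sketch; the input column names are assumptions for illustration and do not reflect the DCC’s actual analysis code.

```python
# Sketch of two evaluation metrics mentioned above: the first-time pass rate on the
# certification exam and percent congruency per key variable over time. The input
# column names are assumptions for illustration only.
import pandas as pd

def first_time_pass_rate(exams: pd.DataFrame, pass_score: float = 80.0) -> float:
    """exams: one row per attempt, with columns ra_id, exam_date and score_percent."""
    first_attempts = exams.sort_values("exam_date").groupby("ra_id").first()
    return float((first_attempts["score_percent"] >= pass_score).mean())

def congruency_trend(checks: pd.DataFrame) -> pd.DataFrame:
    """checks: one row per key-variable check, with columns period, variable and congruent (bool).
    Returns the percent of checks that were congruent, per variable and reporting period."""
    return (
        checks.groupby(["period", "variable"])["congruent"]
        .mean()
        .mul(100)
        .round(1)
        .unstack("variable")
    )
```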

Discussion

QA is a necessary part of conducting research; best practices recommend that it include measures to prevent, detect and correct errors from the beginning of data collection through the publication of study results. QA also helps ensure that data are accurate and collected using common methods across sites. This is particularly challenging in multi-site, large-scale studies such as the MNHR. Multi-site studies need to ensure standardization of definitions across diverse settings to help ensure generalizability of findings. In addition to standardized definitions and procedures, other studies have emphasized the importance of leveraging local capacity with central technical support, similar to the model of the GN [14, 15]. Additionally, as with the GN, the use of ongoing data metrics in routine monitoring reports has been shown to improve data quality [5, 16].

The MNHR includes participants in eight diverse LMIC settings, enrolling approximately 60,000 participants annually. Since its inception ten years ago, we have implemented QA procedures for data collection, entry, editing and transmission. In 2012, we added metrics and standardized reports to facilitate regular identification of site-specific implementation issues. These QA procedures have supported the successful enrollment of 704,265 participants, with over 95% followed from pregnancy through delivery [9].

Building upon the existing platform of common definitions and ongoing data monitoring, and in an effort to continuously improve the quality of our data, in 2017 we implemented standardized training and certification as well as supervisory procedures with a number of notable strengths. Our training and certification procedure includes newly developed, standardized definitions for all clinical data collected in the MNHR. The web-based training module utilizes andragogic learning principles and capitalizes on information and communication technology to engage learners. Recertification of RAs ensures that initial proficiency in study procedures is maintained. The supervisory procedure facilitates the detection of non-compliance with study procedures as well as errors in data collection, reporting and entry. All GN sites have successfully implemented these training, certification and supervisory procedures and, given the relatively low additional burden in terms of time and other resources, have found them feasible to continue. We thus view these changes in procedures as strengths that will enhance the utility of this registry for its multiple intended purposes.

Despite these strengths, there are limitations to our newly-implemented QA procedures. The MNHR data are collected via medical record abstraction and interview of participants. While we developed standardized clinical definitions to support accurate data collection, these definitions do not ensure accurate ascertainment and documentation of clinical diagnoses by health care providers, nor do they ensure accurate maternal recall. Similarly, while we attempted to verify variables most likely to be recalled by the mother during supervisory visits, maternal recall is an imperfect ‘gold standard’.

In this manuscript, we have described a comprehensive QA procedure that may serve as a model for other multi-site large-scale studies. Through routine MNHR review processes, we identified an opportunity to improve training and oversight with the goal of assuring strong adherence to a protocol implemented in diverse low resource settings. On-going data collection regarding these processes will allow us to evaluate their feasibility and effectiveness, and to determine critical elements for maintaining high quality data for this and other similar registry-based protocols.

Conclusion

In conclusion, harmonizing data across diverse settings presents particular challenges for large, multi-site clinical research studies in LMICs and limits the ability to compare results across sites. Providing standardized training across the sites, together with reinforced supervision and oversight that is centrally monitored, has proven useful in improving the quality of the data. The approaches used to facilitate QA across the MNHR are applicable to other multi-site research studies, which face similar challenges in ensuring common methodologies and interpretation of data.

Availability of data and materials

Data from the study will be available at the NICHD data repository (N-DASH): https://dash.nichd.nih.gov/.

Abbreviations

QA:

Quality assurance

MNHR:

Maternal Newborn Health Registry

GN:

Global Network for Women’s and Children’s Health Research

NICHD:

Eunice Kennedy Shriver National Institute of Child Health and Human Development

LMIC:

Low and middle-income countries

DRC:

Democratic Republic of the Congo

DCC:

Data coordinating center

SFI:

Senior foreign investigator

CC:

Country coordinator

RAs:

Registry administrators

CRFs:

Case report forms

MOP:

Manual of procedures

N-DASH:

NICHD Data and Specimen Hub

QxQ:

Question by question

DMS:

Data management system

INCAP:

Instituto de Nutrición de Centro América y Panamá

References

  1. Béghin L, Castera M, Manios Y, et al. Quality assurance of ethical issues and regulatory aspects relating to good clinical practices in the HELENA Cross-Sectional Study. Int J Obes. 2008;32:S12–S18. https://0-doi-org.brum.beds.ac.uk/10.1038/ijo.2008.179.

  2. Morrison BW, Cochran CJ, White JG, et al. Monitoring the quality of conduct of clinical trials: a survey of current practices. Clin Trials. 2011;8:342–9. https://0-doi-org.brum.beds.ac.uk/10.1177/1740774511402703.

  3. Knatterud GL, Rockhold FW, George SL, et al. Guidelines for quality assurance in multicenter trials: a position paper. Control Clin Trials. 1998;19:477–93. https://0-doi-org.brum.beds.ac.uk/10.1016/S0197-2456(98)00033-6.

  4. Faizi N, Kumar AM, Kazmi S. Omission of quality assurance during data entry in public health research from India: Is there an elephant in the room? Indian J Public Health. 2018;62(2):150–2.

  5. Gass JD Jr, Misra A, Yadav MNS, Sana F, Singh C, Mankar A, et al. Implementation and results of an integrated data quality assurance protocol in a randomized controlled trial in Uttar Pradesh, India. Trials. 2017;18(1):418.

  6. Costeloe K, Turner MA, Padula MA, Shah PS, Modi N, Soll R, International Neonatal Consortium, et al. Sharing data to accelerate medicine development and improve neonatal care: data standards and harmonized definitions. J Pediatr. 2018;203:437–441.e1.

  7. Kumar S, Dave P, Srivastava A, Stekelenburg J, Baswal D, Singh D, et al. Harmonizing scientific rigor with political urgency: policy learnings for identifying accelerators for scale-up from the safe childbirth checklist programme in Rajasthan, India. BMC Health Serv Res. 2019;19(1):273.

  8. Goudar SS, Carlo WA, Mcclure EM, et al. The Maternal and Newborn Health Registry Study of the Global Network for Women’s and Children’s Health Research. Int J Gynaecol Obs. 2012;118:190–3. https://0-doi-org.brum.beds.ac.uk/10.1016/j.ijgo.2012.04.022.

  9. Bose CL, Bauserman M, Goldenberg RL, et al. The Global Network Maternal Newborn Health Registry: a multi-national, community-based registry of pregnancy outcomes. Reprod Health. 2015. https://0-doi-org.brum.beds.ac.uk/10.1186/1742-4755-12-S2-S1.

  10. Hoffman MK, Goudar SS, Kodkany BS, et al. A description of the methods of the aspirin supplementation for pregnancy indicated risk reduction in nulliparas (ASPIRIN) study. BMC Pregnancy Childbirth. 2017;17(1):135. https://0-doi-org.brum.beds.ac.uk/10.1186/s12884-017-1312-x.

  11. Pasha O, Goldenberg RL, McClure EM, et al. Communities, birth attendants and health facilities: a continuum of emergency maternal and newborn care (the global network’s EmONC trial). BMC Pregnancy Childbirth. 2010. https://0-doi-org.brum.beds.ac.uk/10.1186/1471-2393-10-82.

  12. Goudar SS, Stolka KB, Koso-Thomas M, et al. Data quality monitoring and performance metrics of a prospective, population-based observational study of maternal and newborn health in low resource settings. Reprod Health. 2015. https://0-doi-org.brum.beds.ac.uk/10.1186/1742-4755-12-S2-S2.

  13. Knowles MS, Holton EF, Swanson RA. The adult learner: the definitive classic in adult education and human resource development. Elsevier; 2005. https://books.google.com.gt/books?id=J6qGsHBj7nQC&redir_esc=y&hl=es. Accessed 6 Aug 2019.

  14. Yamanaka A, Fialkowski MK, Wilkens L, et al. Quality assurance of data collection in the multi-site community randomized trial and prevalence survey of the children’s healthy living program. BMC Res Notes. 2016;9:432.

  15. Rosa C, Campbell A, Kleppinger C, Sampson R, Tyson C, Mamay-Gentilin S. Quality assurance of research protocols conducted in the community: the National Institute on Drug Abuse Clinical Trials Network Experience. Clin Trials. 2009;6(2):151–61. https://0-doi-org.brum.beds.ac.uk/10.1177/1740774509102560.

  16. Giganti MJ, Shepherd BE, Caro-Vega Y, Luz PM, Rebeiro PF, Maia M, et al. The impact of data quality and source data verification on epidemiologic inference: a practical application using HIV observational data. BMC Public Health. 2019;19(1):1748.

About this supplement

This article has been published as part of Reproductive Health, Volume 17 Supplement 2, 2020: Global Network MNH. The full contents of the supplement are available at https://0-reproductive--healthjournal-biomedcentral-com.brum.beds.ac.uk/articles/supplements/volume-17-supplement-2

Funding

Publication of this supplement is funded by grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) to the participating sites and to RTI International.

Author information

Contributions

All authors reviewed and approved the final manuscript.

Corresponding author

Correspondence to Ana Garces.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the institutional review boards of the participating institutions, including the University of Colorado and the Faculty of Medicine, Universidad Francisco Marroquin, Guatemala. All women provided informed consent prior to enrollment in the study.

Consent for publication

The article was approved for publication by NICHD through its clearance mechanism.

Competing interests

The authors have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Garces, A., MacGuire, E., Franklin, H.L. et al. Looking beyond the numbers: quality assurance procedures in the Global Network for Women’s and Children’s Health Research Maternal Newborn Health Registry. Reprod Health 17 (Suppl 2), 159 (2020). https://0-doi-org.brum.beds.ac.uk/10.1186/s12978-020-01009-3
