- Study protocol
A cluster randomized controlled trial to evaluate the effectiveness of the clinically integrated RHL evidence-based medicine course
Reproductive Health volume 7, Article number: 8 (2010)
Background and objectives
Evidence-based health care requires clinicians to engage with the use of evidence in decision-making at the workplace. A learner-centred, problem-based course that integrates e-learning in the clinical setting has been developed for application in obstetrics and gynaecology units. The course content uses the WHO Reproductive Health Library (RHL) as the resource for systematic reviews. This project aims to evaluate a clinically integrated teaching programme for incorporation of evidence provided through the WHO RHL. The hypothesis is that the RHL-EBM (clinically integrated e-learning) course will improve participants' knowledge, skills and attitudes, as well as institutional practice and educational environment, as compared to the use of standard postgraduate educational resources for EBM teaching that are not clinically integrated.
The study will be a multicentre, cluster randomized controlled trial, carried out in seven countries (Argentina, Brazil, Democratic Republic of Congo, India, Philippines, South Africa, Thailand), involving 50-60 obstetrics and gynaecology teaching units. The trial will be carried out on postgraduate trainees in the first two years of their training. In the intervention group, trainees will receive the RHL-EBM course. The course consists of five modules, each comprising self-directed e-learning components and clinically related activities, assignments and assessments, coordinated between the facilitator and the postgraduate trainee. The course will take about 12 weeks, with assessments taking place pre-course and 4 weeks post-course. In the control group, trainees will receive electronic, self-directed EBM-teaching materials. All data collection will be online.
The primary outcome measures are gain in EBM knowledge, change in attitudes towards EBM and competencies in EBM, measured by multiple choice questions (MCQs) and a skills-assessing questionnaire administered electronically. These questions have been developed by using questions from validated questionnaires and adapting them to the current course. The secondary outcome measure will be the educational environment towards EBM, which will be assessed by a specifically developed questionnaire.
The trial will determine whether the RHL-EBM (clinically integrated e-learning) course will increase knowledge, skills and attitudes towards EBM and improve the educational environment as compared to standard teaching that is not clinically integrated. If effective, the RHL-EBM course can be implemented in teaching institutions worldwide, in low- and middle-income countries as well as industrialized settings. The results will have a broader impact than EBM training alone: if the approach is successful, the same educational strategy can be used to target other priority clinical and methodological areas.
Research is constantly increasing medical knowledge. For this increased knowledge to be useful in improving the health of populations, it should be added to the existing knowledge pool, critically evaluated for validity and implemented by health care professionals. In recent years it has been increasingly recognized that improvements in the knowledge and skills of health care personnel, and in health care outcomes, are best accomplished by employing evidence-based medicine (EBM) as part of everyday work. EBM promotes health care decisions based on the current best (valid and relevant) evidence. To achieve this, EBM curricula need to inculcate amongst learners the skills to gain, assess, apply, integrate and communicate new scientific knowledge in clinical decision-making. There is much debate about the effectiveness of various EBM teaching and learning methods and outcomes. Computer-based learning has been shown to be as effective as face-to-face, lecture-based sessions in improving knowledge [1, 2]. There is a lack of consensus, however, as to what methods constitute the best educational practice: a practice that results not only in improvement of basic educational outcomes, such as knowledge and appraisal skills, but also in attitudes and behaviour, which ultimately leads to improved practice [3, 4].
Empirical and theoretical evidence suggests that there is a hierarchy of teaching and learning activities in terms of their educational effectiveness: Level 1, interactive and clinically integrated activities; Level 2(a), interactive but classroom-based activities; Level 2(b), didactic but clinically integrated activities; and Level 3, didactic, classroom or standalone teaching. Workshop-based teaching, probably one of the most common forms of postgraduate teaching, can at best achieve Level 2. Workshops can be interactive, but it is the provision of clinically integrated activities that seems to have more sustained effects and advantages over workshop-based programmes. Such programmes are few and variable in their quality.
A clinically integrated EBM course for teaching postgraduates was developed by the European Union-EBM Unity Project, a consortium of eleven European partners within the framework of the Leonardo da Vinci vocational programme of the European Union http://ebm-unity.pc.unicatt.it/. This course is learner-centred and problem-based, and integrates e-learning with clinical activities in the workplace. When successfully implemented, the course is designed to provide just-in-time learning through on-the-job training, with the potential for teaching and learning to directly impact on practice and the workplace environment. The course was formally piloted for feasibility in five European countries, in different medical specialties and different languages, and tested in a cluster randomized controlled trial in two countries with promising results [6, 7]. However, the generic nature of this course makes its application in specific clinical specialties cumbersome. There is a need to develop specialty-specific courses and to further evaluate this approach rigorously in appropriate settings.
Since 1997, the World Health Organization has published the WHO Reproductive Health Library (RHL), an evidence-based specialist library in sexual and reproductive health. RHL is disseminated extensively in low- and middle-income countries to approximately 15,000 users every year. RHL contents are Cochrane systematic reviews on high-priority topics, mainly in maternal and perinatal health and fertility regulation, complemented by commentaries, practical guidance documents and educational videos on techniques to implement effective practices.
In addition, a course on Evidence-based Decision-making in Reproductive Health including RHL content has been developed in past years as part of capacity strengthening efforts. This course includes six PowerPoint presentations on evidence-based decision-making and a manual with the presentations. Several workshops have been conducted using these materials in the past seven years.
The EU-EBM Unity Project materials were revised and adapted to the sexual and reproductive health context, with RHL included as a resource. The revision has made this innovative approach to postgraduate teaching suitable for implementation at obstetrics and gynaecology departments in low- and middle-income countries that collaborate with HRP/RHR. As mentioned above, the promising nature of the approach and the relatively weak evidence base to recommend it widely is the principal reason behind the development of this protocol to conduct a RCT to evaluate its effects on various relevant outcomes.
Educational activities within a clinical environment primarily aim to improve the knowledge, attitudes and skills of clinical staff, which will lead to improved health care practices and health outcomes for patients. Given the multitude of educational activities that come in various formats (i.e. didactic, interactive, face-to-face, electronic [1]), it should be an implicit aim of these activities that they create an environment more conducive to in-service learning, in addition to their more immediate objectives. The context in which these educational activities and learning by staff and students take place is defined as the 'educational environment'. A single educational project may not cause major changes in the educational environment, and many other factors related to health system functioning and cultural characteristics may play a role in making the environment more educationally friendly or not. Nevertheless, measuring the educational environment is important both as an end-point and as an explanatory factor for evaluating educational interventions.
Rationale for this trial
Evidence-based reproductive health care requires clinicians to engage with the use of evidence in decision-making. To genuinely get involved, healthcare professionals need to effectively incorporate contemporaneous research-based information in the clinical setting. The proposed research project represents part of a continued programme of activities at HRP/RHR aimed at improving the quality of care and capacity-strengthening for evidence-based decision-making in low- and middle-income countries. Such research is timely and important because health care workers in many under-resourced settings face difficulties in providing good quality care, keeping their knowledge up-to-date and having the skills to interpret and implement this knowledge. If effective, the clinically integrated e-learning approach can be scaled up without major investment at teaching institutions worldwide. The studies available to date have been conducted exclusively in developed countries. Before engaging in large-scale implementation, the benefits of this integrated approach should be demonstrated through rigorous research in relevant settings. The hypothesis is that the proposed RHL-EBM integrated e-learning course (experimental group) will improve participants' knowledge, skills and attitudes, as well as institutional practice and the educational environment, as compared to the use of standard postgraduate educational resources for EBM teaching.
Previous similar studies
We identified a systematic review including 23 studies comparing standalone with clinically integrated postgraduate teaching in EBM. Most studies reported on knowledge, fewer on skills (i.e. critical appraisal skills), attitudes and behaviour towards EBM. None of the studies evaluated health outcomes. Results showed that both standalone and clinically integrated courses can improve knowledge. Clinically integrated courses also improved skills, attitudes and behaviour towards EBM. However, the included studies used different teaching methods and assessment tools, and no study directly compared the standalone with the clinically integrated approach. While the evidence of the review looks promising, more robust evaluation is needed comparing standalone with clinically integrated teaching.
A recent systematic review assessed the effectiveness of EBM teaching on knowledge, skills and attitude/behaviour regarding EBM in postgraduates. The review included 24 studies of different designs; all were conducted in developed countries, and most had small sample sizes and provided little detail about assessment tools and the actual intervention. These weaknesses point towards a need for appropriately sized randomized controlled trials and trans-culturally adapted measurement instruments.
The main objective is to evaluate whether the RHL-EBM course is effective in improving knowledge, skills and competencies as compared to passive dissemination of resource materials. A secondary objective is to validate an EBM educational environment tool.
A cluster randomized design is proposed. Such a design is more appropriate for this intervention since the intervention will be implemented at the teaching unit level and the outcome assessments will be measured similarly at the institutional level. In addition, individual randomization can invalidate such an intervention due to contamination, whereby control participants working in the same environment have access to, and use, the experimental intervention to enhance their learning.
Participants are training institutions in obstetrics and gynaecology in the participating countries. Hospitals belonging to the same training institution will form the cluster. Principal investigators from each country will determine the number of potentially eligible training units/clusters in their countries by contacting the head of the specialist training unit and the local authorities (if required). A baseline survey of staffing levels and current postgraduate teaching programmes has been conducted to determine the number of units that can meet eligibility criteria.
Training units that can provide at least four junior residents/registrars (postgraduate trainees) who have not yet been exposed to structured, formal EBM training and who will be available for the duration of the trial will be eligible. Brief descriptive characteristics of each participating junior postgraduate trainee will be collected.
Each institution needs to identify a 'facilitator' who will be able to work with the participants during the trial period. The facilitator will be someone working in the same department with the participants but will be a senior staff member preferably a specialist in obstetrics and gynaecology who is knowledgeable about basic EBM principles.
Recruitment and allocation
The training institutions (clusters) identified by the principal investigators and fulfilling the inclusion criteria will be asked to participate in the trial. The random allocation scheme will be generated at HRP/RHR using computer-generated random numbers. Each cluster will be allocated to one of two groups: RHL-EBM course (intervention 1) or passive dissemination of EBM teaching materials (intervention 2). Due to the nature of the intervention and design, it will not be possible to blind the research teams or the participants to their allocated group. However, the participants will only be informed about an educational programme evaluation and will not be told which groups are compared. The individual trainees will receive similar information on a written information sheet.
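As an illustration, a 1:1 allocation of clusters using computer-generated random numbers could be sketched as follows. This is a minimal sketch only: the actual scheme is generated centrally at HRP/RHR, and the function name and seed here are hypothetical.

```python
import random

def allocate_clusters(cluster_ids, seed=2010):
    """Shuffle the teaching units (clusters) and split them 1:1 between
    the two arms. Illustrative only: the real allocation scheme is
    generated centrally at HRP/RHR; name and seed are hypothetical."""
    rng = random.Random(seed)  # fixed seed makes the allocation reproducible
    ids = list(cluster_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {
        "intervention_1_rhl_ebm": ids[:half],        # clinically integrated course
        "intervention_2_self_directed": ids[half:],  # passive dissemination
    }
```

Because allocation is by cluster rather than by individual, every trainee in a teaching unit receives the same intervention, which is what protects against the contamination described above.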
The project aims to evaluate the effects of the experimental intervention (RHL-EBM clinically integrated e-learning course) over passive dissemination of resources that have very similar content and the same learning objectives. It is acknowledged that other EBM teaching activities can also take place independently at the participating institutions. Information on those activities will be recorded systematically. The contents and presentation of both intervention courses were refined and finalized at two meetings of the steering committee that includes experts in research methodology, medical education and obstetrics and gynaecology.
Intervention 1: RHL-EBM Course
The RHL-EBM clinically integrated e-learning course will be the experimental intervention. The course content has been developed following well established guidelines for planning a curriculum. The Course is described in detail in Tables 1 and 2. Briefly, the teaching takes place in the clinical environment and the postgraduate trainee obtains the theoretical knowledge from the interactive e-learning materials, completes assignments and interacts with her/his facilitator throughout the process.
Descriptive information about facilitators' position in the unit, professional qualification and possible EBM teaching activities will be collected. There should be one facilitator per group of 5 to 10 postgraduate trainees. At each training institution one facilitator will be responsible for trial related activities. Facilitators will interact with the participants providing feedback on their assignments, and facilitate discussions during ward rounds/discussions and journal clubs. Facilitators will receive a handbook that will help them to guide the participant.
Intervention 2: Self-directed learning (Passive dissemination of EBM teaching materials)
The RHL-EBM Course will be compared to passively disseminated EBM resources from the WHO RHL workshop-based course that has the same learning objectives. The materials include 6 PowerPoint presentations. The content of the presentations is similar to the content of the RHL-EBM Course and is described in detail in Table 3. The PowerPoint presentations focus on:
formulating answerable clinical questions;
searching effectively for evidence using tools such as the Reproductive Health Library and the Cochrane Library;
critically appraising clinical evidence for its validity and applicability;
understanding basic effect measures such as Relative Risk and Numbers Needed to Treat.
The basic differences between experimental (intervention 1) and control (intervention 2) groups are outlined in Table 4.
The primary outcome measures are:
Gain in EBM knowledge measured by multiple choice questions (MCQs) that have been shown to have face/content validity in relation to course objectives and discrimination capacity in previous studies. Multiple choice questions have been developed by using questions from validated questionnaires and adapting them to the current course [4, 12, 13].
Change in attitudes towards EBM and competencies in EBM. Attitudes towards EBM will be measured by a validated questionnaire, using previously validated questions.
Skills competence in applying EBM principles will be measured by Objective Structured Clinical Examination (OSCE). The OSCE is a well established way to assess clinical competency. In this case, EBM-skills related questions following the OSCE structure will be developed for completion electronically.
As a secondary outcome measure, we will assess the educational environment towards EBM by using a questionnaire (EBMEEM). The EBMEEM development is based on previously validated environment measurement tools in different clinical settings [14, 15]. The trial will also serve to validate the tool in a larger setting. Trial participants will complete the questionnaire at the beginning and at the end of the course. Only data from the first completion (baseline) will be used for validation purposes. Data from the second administration at the end of the course will be analysed after determining the items to be included in the validated instrument. Items remaining in the validated instrument will be used to generate baseline scores for the educational environment and to monitor and compare any possible changes at the end of the trial.
Development of the EBMEEM tool
Using the format of previously developed tools, the content of the RHL-EBM Course was adapted and included in a draft instrument of 63 Likert-scale items. The items were organized around certain themes, namely: learning opportunities; own learning (EBM competence); availability of learning resources; teachers and teaching (EBM specific); supervision and support; EBM practice (EBM atmosphere); and general atmosphere. After several iterations of pilot-testing involving postgraduate trainees in a number of countries and the Steering Committee, the draft instrument to be validated and used in outcome assessment was agreed upon.
Process outcomes such as the experience of the facilitators and the postgraduate trainees will be evaluated through interviews.
The intervention period will be 8-12 weeks. Currently, follow-up beyond the trial period is not planned mainly due to financial constraints. If funds are available a one-year assessment of the EBM educational environment will be considered.
Data will be collected in the centres using an online data management system developed by the Geneva Foundation for Medical Education and Research (GFMER). Data management will be based at the GFMER, Geneva, Switzerland supervised and monitored by HRP/RHR and Birmingham University.
Data collection will be at baseline (before the intervention), and after completion of the intervention. Data will be recorded by online completion of data forms (MCQs) and questionnaires (attitude, educational environment).
Data entry will be monitored on a day-to-day basis and any problems with logins, missing data and queries will be resolved promptly. The system allows data export to common statistical packages for analysis. It has been agreed that the data files will be exported to an Excel file to be analyzed by the trial statistician in the Statistics and Informatics Services Team at the HRP/RHR.
Since random allocation is by teaching unit and inferences will be made at this level, the teaching unit will be the unit of analysis.
At each participating unit the postgraduate trainees will use the online system to enter their personal data and access the assessment questions (all) and the e-learning modules of the course (either intervention or control materials). The postgraduate trainees will complete the relevant sections at baseline and at the end of the trial period.
Analysis plan for primary outcomes
A score for each individual will be computed by dividing the number of points obtained in a test by the maximum number of points that can be obtained. A summary measure will be obtained for each teaching unit as the mean score of all the individuals taking the test in that unit at the baseline (pre-) and post-evaluation phases.
We will compare the knowledge (main outcome) between intervention and control units using analysis of covariance, with pre-test score as a baseline covariate and terms for the stratum and for the group in the model. A 95% confidence interval will be calculated for the difference in knowledge between intervention and control, based on the error term from the analysis of covariance.
We will also compute the change in knowledge as the difference between the post- and the pre-test scores for each individual. A summary measure of the gain in knowledge for each teaching unit will be computed as the mean gain in knowledge. We will compare the gain in knowledge between intervention and control units using analysis of variance, with terms for the stratum and for the group in the model. A 95% confidence interval will be calculated for the difference in gain in knowledge between intervention and control, based on the error term from the analysis of variance.
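The scoring and the unadjusted between-group comparison described above can be sketched as follows. This is a minimal illustration with hypothetical function names; the protocol's actual analysis additionally adjusts for stratum via analysis of covariance/variance, which this sketch omits.

```python
from statistics import mean, stdev

def unit_score(individual_points, max_points):
    """Score each trainee as points obtained / maximum points, then
    summarize the teaching unit by its mean score (the unit of analysis)."""
    return mean(p / max_points for p in individual_points)

def gain_difference(intervention_unit_gains, control_unit_gains):
    """Difference in mean knowledge gain between the arms, with a
    simplified 95% CI from the pooled SD of unit-level gains (normal
    approximation; stratum terms omitted)."""
    d = mean(intervention_unit_gains) - mean(control_unit_gains)
    n1, n2 = len(intervention_unit_gains), len(control_unit_gains)
    s1, s2 = stdev(intervention_unit_gains), stdev(control_unit_gains)
    # pooled SD across the two arms
    sp = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    se = sp * (1 / n1 + 1 / n2) ** 0.5
    return d, (d - 1.96 * se, d + 1.96 * se)
```

Analysing unit means rather than individual trainees keeps the inference at the level of randomization, avoiding the artificially narrow confidence intervals that would result from treating trainees within a cluster as independent.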
It is anticipated that in some countries there will be existing formal or informal postgraduate training activities on EBM. These will be reported in the survey of EBM activities that will be conducted before the trial. If data permits a stratified analysis based on the presence or absence of additional basic EBM courses will be conducted. However, it is acknowledged that the study may be underpowered to provide a definitive answer according to the two strata.
Number of subjects and statistical power
There is limited information regarding baseline EBM knowledge and possible knowledge gains and practice changes following implementation of such courses. The evaluation of the pilot course developed within the Leonardo da Vinci pilot project provided some data that were used in the sample size calculation [6, 7]. In the pilot evaluation only electronic modules were evaluated, in a before-and-after analysis. For sample size calculations, the standard deviation (SD) of the average gain in knowledge (%), estimated from teaching unit/module means in different countries, was taken as SD = 7.9% (likely to be an overestimate). To allow for random variation of this estimate, a range of SD between 5% and 15% was considered. The power achieved with a total of 60 teaching units to detect a difference in gain of 10% (relative to the maximum score) between two groups in a two-sided test at the 5% level, assuming SD = 10%, is 97%. If SD = 12%, the power is 90%. If SD = 14%, the power is 80%. Seven countries will participate in the trial: Argentina (4 institutions), Brazil (20), Democratic Republic of Congo (4), India (10), Philippines (10), South Africa (6) and Thailand (10). The trial duration is estimated at one year.
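The quoted power figures can be reproduced with a normal-approximation calculation for a two-sided, two-sample comparison of unit-level means (30 units per arm). This is a sketch under those assumptions, not necessarily the exact method used by the trial statistician, and the helper names are hypothetical.

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def power_two_sample(delta, sd, n_per_arm, z_alpha=1.96):
    """Approximate power of a two-sided test (5% level by default)
    comparing mean gains between two equal arms of teaching units,
    using the normal approximation (no small-sample t correction)."""
    se = sd * sqrt(2 / n_per_arm)  # standard error of the difference in means
    return normal_cdf(delta / se - z_alpha)
```

With delta = 10% and 30 units per arm this yields roughly 97%, 90% and 79% power for SD = 10%, 12% and 14%, matching the figures above.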
Main problems anticipated
Identification of facilitators in the experimental intervention units: The principal investigator will use her/his judgement on the selection of the facilitators and provide training if appropriate. However, the whole philosophy of the Course is for the facilitators to guide the students in the clinical setting and monitor their progress in the process. The facilitators do not need to be EBM experts but rather respected clinicians that could be seen as role models or opinion leaders within their settings.
Although all settings will have internet access for data entry and course content (RHL-EBM), our initial contacts suggest that there may be connectivity problems in some settings, such as the Democratic Republic of Congo and South Africa. In these settings close monitoring by principal investigators will be required to ensure that emerging problems are solved rapidly.
Existing EBM courses
It is possible that in some of the teaching units there may be other postgraduate teaching activities such as workshops or scientific conferences. It is not possible to control these activities occurring in each setting. The survey conducted before the trial will give information on the scope of the activities likely to exist in each setting. There are two possible protective mechanisms against these activities becoming a threat to the validity of trial results. First, random allocation should ensure that such activities are equally distributed in both groups. Second, stratified analysis based on the presence and absence of existing courses is planned as a secondary analysis.
Expected outcomes of the study
If effective, the RHL-EBM Course will provide an educational intervention that can be scaled up in low and middle-income countries in both teaching and non-teaching institutions. As such it will form a core component to improve quality of care. Discussions with national and international professional organizations to implement the course with formal certification can be performed.
Improvements in knowledge gain, attitudes and skills are important for providing good quality care. Any potential improvement in the educational environment will have a major impact on the quality of in-service training, not only in resource-poor settings. An original EBM Educational Environment Measurement Tool will be available for all academic or non-academic settings.
Trial results will be published in a peer-reviewed journal. The results will be disseminated to the participating centres through meetings of the research team with participants at the end of the study. The results will also be presented and disseminated through the WHO website and the WHO Reproductive Health Library (RHL).
Ethics and informed consent
The trial interventions are implemented at the institutional level. Therefore, the primary point at which permission is sought is at the institutional level. Receiving permission from the institution does not constitute consent on behalf of the participants; it constitutes an administrative agreement between the PI and the institution. A permission sheet will be presented to the person responsible for the institution to be signed. We anticipate that in most cases this person will be the academic head of the obstetrics and gynaecology department.
The facilitators/focal persons and the postgraduate trainees will be informed that one or more educational programme(s) are being evaluated but with no further details. After random allocation each unit will be informed about the activities they should follow including completion of MCQs and other online assessments. We plan to seek consent from individuals within the participating institutions. Individuals will not be identified in the questionnaire. If intervention 1 is shown to be beneficial in the primary endpoints then it will be provided to all intervention 2 institutions.
The study protocol was ethically approved by WHO and will also be approved in each participating country.
Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A: Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial. BMC Med Educ. 2007, 7: 23-10.1186/1472-6920-7-23.
Ruiz JG, Mintzer MJ, Leipzig RM: The impact of E-learning in medical education. Acad Med. 2006, 81 (3): 207-12. 10.1097/00001888-200603000-00002.
Khan KS, Coomarasamy A: A hierarchy of effective teaching and learning to acquire competence in evidenced-based medicine. BMC Med Educ. 2006, 6: 59-10.1186/1472-6920-6-59.
Taylor R, Reeves B, Mears R, Keast J, Binns S, Ewings P: Development and validation of a questionnaire to evaluate the effectiveness of evidence-based practice teaching. Med Educ. 2001, 35 (6): 544-7.
Coppus SF, Emparanza JI, Hadley J, Kulier R, Weinbrenner S, Arvanitis TN: A clinically integrated Curriculum in Evidence-based Medicine for just-in-time learning through on-the-job training: The EU-EBM project. BMC Med Educ. 2007, 7 (1): 46-10.1186/1472-6920-7-46.
Kulier R, Hadley J, Weinbrenner S, Meyerrose B, Decsi T, Horvath AR: Harmonising Evidence-based medicine teaching: a study of the outcomes of e-learning in five European countries. BMC Med Educ. 2008, 8: 27-10.1186/1472-6920-8-27.
Kulier R, Coppus SF, Zamora J, Hadley J, Malick S, Das K, Weinbrenner S: The effectiveness of a clinically integrated e-learning course in evidence-based medicine: A cluster randomized controlled trial. BMC Med Educ. 2009, 9: 21-10.1186/1472-6920-9-21.
Coomarasamy A, Khan KS: What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004, 329 (7473): 1017-10.1136/bmj.329.7473.1017.
Flores-Mateo G, Argimon JM: Evidence based practice in postgraduate healthcare education: a systematic review. BMC Health Serv Res. 2007, 7: 119-10.1186/1472-6963-7-119.
Donner A, Klar N: Cluster randomization trials in epidemiology: Theory and application. J Stat Plann Inference. 1994, 42: 37-56. 10.1016/0378-3758(94)90188-0.
Harden RM, Sowden S, Dunn WR: Educational strategies in curriculum development: the SPICES model. Med Educ. 1984, 18 (4): 284-97. 10.1111/j.1365-2923.1984.tb01024.x.
Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer HH, Kunz R: Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002, 325 (7376): 1338-41. 10.1136/bmj.325.7376.1338.
Ramos KD, Schafer S, Tracz SM: Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003, 326 (7384): 319-21. 10.1136/bmj.326.7384.319.
Holt MC: Development and validation of the Anaesthetic Theatre Educational Environment Measure (ATEEM). Medical Teacher. 2004, 26 (6): 553-8. 10.1080/01421590410001711599.
Roff S: The Dundee Ready Educational Environment Measure (DREEM) - a generic instrument for measuring students' perceptions of undergraduate health professions curricula. Medical Teacher. 2005, 27 (4): 322-5. 10.1080/01421590500151054.
Gulmezoglu AM, Langer A, Piaggio G, Lumbiganon P, Villar J, Grimshaw J: Cluster randomised trial of an active, multifaceted educational intervention based on the WHO Reproductive Health Library to improve obstetric practices. BJOG. 2007, 114 (1): 16-23. 10.1111/j.1471-0528.2006.01091.x.
The authors declare that they have no competing interests.
RK, KSK and AMG had the original idea for the study and drafted the first version of the research protocol. It was then discussed with all other authors in two meetings in Geneva and the corresponding inputs were added to the protocol. All have read and agreed with the final version of the current study protocol.
Kulier, R., Khan, K.S., Gulmezoglu, A.M. et al. A cluster randomized controlled trial to evaluate the effectiveness of the clinically integrated RHL evidence-based medicine course. Reprod Health 7, 8 (2010) doi:10.1186/1742-4755-7-8
- Objective Structured Clinical Examination
- Educational Environment
- Teaching Unit
- Postgraduate Teaching
- Good Quality Care