
A novel collaborative e-learning platform for medical students - ALERT STUDENT

Abstract

Background

The increasing complexity of medical curricula would benefit from adaptive computer supported collaborative learning systems that support study management using instructional design and learning object principles. However, to our knowledge, there are few reports of applications developed to meet this goal that encompass the complete medical curriculum. The aim of this study was to develop and assess the usability of an adaptive computer supported collaborative learning system for medical students to manage study sessions.

Results

A study platform named ALERT STUDENT was built as a free web application. Content chunks are represented as Flashcards that hold knowledge and open ended questions, and can be created collaboratively. Multiple Flashcards can be combined into custom stacks called Notebooks that can be accessed in study Groups belonging to the user's institution. The system provides a Study Mode that features text markers, text notes, timers and color-coded content prioritization based on self-assessment of open ended questions presented in a Quiz Mode. Time spent studying and Perception of knowledge are displayed for each student and their peers using charts. Computer supported collaborative learning is achieved by allowing simultaneous creation of Notebooks and self-assessment questions by many users in a pre-defined Group. Past personal performance data are retrieved when studying new Notebooks containing previously studied Flashcards. Self-report surveys showed that students highly agreed that the system was useful and were willing to use it as a reference tool.

Conclusions

The platform employs various instructional design and learning object principles in a computer supported collaborative learning platform for medical students that allows for study management. The application broadens student insight into learning results and supports informed decisions based on past learning performance. It serves as a potential educational model for the medical education setting and has gathered strong positive feedback from students at our school.

This platform provides a case study on how instructional design and learning object principles can be blended to manage study, and takes an important step towards bringing information management tools to support study decisions and improve learning outcomes.


Background

Medical education is an area of increasing complexity, considering the education goals of health professionals for the 21st century [1, 2]. Successful medical learning requires a considerable time investment not only in the development of core and specific competencies, but also in the ability to transfer basic cognitive competencies to the clinical setting through the integration of personal experience and vast information sources [1, 3].

Information management refers to the ability to search, identify and integrate relevant information that can be further used for critical reasoning in clinical practice [4], and is currently one of the most compelling challenges facing medical students.

Approaches to enhance learning

In many settings, information is not effectively managed during learning. The demanding learning process frequently drives students to retain knowledge to meet course goals instead of strengthening competence development [5]. According to the Adaptive Character of Thought (ACT-R) theory, “time on task” is the most important factor for developing lifetime competence [6]. As the amount of knowledge to learn increases, how well time is managed in the learning process becomes key [6]. Cognitive load theory postulates three types of cognitive load: (a) intrinsic load is the net result of task complexity and learner expertise; (b) extraneous load is caused by superfluous processes that do not directly contribute to learning; (c) germane load is accounted for by learning processes that handle intrinsic cognitive load [7]. Studies have been carried out to identify design guidelines and benefits of this theory in health sciences education [6, 8-13].

Spaced repetition, a learning approach that focuses on reviewing content multiple times over optimized time intervals, is one of the most effective ways to improve long-term retention [14-18]. While evidence-based principles for instructional design are abundant, they are infrequently incorporated into the educational setting in a consistent and deliberate manner [19].

Learning objects

The way in which content can be organized in order to optimize learning has also been extensively studied [4, 13, 20-23]. Learning objects, groupings of instructional materials structured to meet specific educational objectives [23], define a set of guidelines to make content portable, interactive and reusable [23-27], therefore enhancing and tailoring learning [26]. They may facilitate adaptive learning by offering the chunks of content that the learner needs in order to achieve an accepted level of competence.

Other authors have identified the need to simplify the learning object authoring process to gain wider acceptance and use [28]. Additionally, the design of appropriate and effective technologies must take into account individual differences in learning, through systems that adapt based on individual progress and performance or through explicit choices made by the learner [29].

Students need tools that help retain knowledge for longer periods and easily identify materials with lower retention rates [18]. This goal may be achieved by providing learners with personal insight into their learning effectiveness, using personal and peer progress data based on self-assessment results [26].

Computer supported collaborative learning

Currently, web applications can be a valuable tool for reaching information management goals. The application of new learning technologies, which has emerged as a mainstream trend in medical education [30], is known to simplify document management, communication, student evaluation and grading [31]. However, these tools focus mainly on maximizing the efficiency of administrative teaching tasks and give little consideration to the learning tasks directed at students.

Additionally, over recent years there has been a shift in medical education where traditional instructor-centered teaching is yielding to a learner-centered model [28, 32]. With the advent of social media tools that allow for collaboration and community building it is becoming more common for students to create and share materials on-line [25, 33]. However, these materials are often not validated or reviewed by teachers [34, 35] and may decrease learning effectiveness as the student will need to browse, filter and validate relevant information from numerous and often conflicting information sources [36].

Computer supported collaborative learning (CSCL) can add an instructor role to the learner-centered model. It places learners in control of their own learning and transforms the role of the teacher from sole provider of information to facilitator of knowledge acquisition [28, 35], promoting greater learning satisfaction [17, 37]. This type of approach usually takes place in asynchronous collaboration settings where students and teachers can collaborate at different times [37-39]. Despite this potential, little evidence of the effectiveness of such tools in the health professions has been gathered [17, 40].

Effective information management during the learning process may be achieved through the adoption of CSCL systems that provide validated content in the form of learning objects, allow student self-assessment and display tailored feedback that can be used to support study management. These data should guide further exploratory or more focused learning approaches, so that knowledge acquisition benefits while information management competences are developed.

The present study aims to develop and assess the usability of an adaptive CSCL system that helps students make decisions regarding their personal learning process. So far, existing studies of such systems have addressed specific medical knowledge fields [8, 41-44]. To our knowledge, no system has been built that applies to the medical curriculum in general [45].

Implementation

Technologies

The present application was built in accordance with current web standards. The user interface was built using HyperText Markup Language (HTML), Scalable Vector Graphics (SVG) and JavaScript. The application layer of the system was built using Java technology over the Play! Framework version 1.2. The database layer was built using Oracle systems. The data model is described using a simplified UML diagram in Figure 1. A simpler version of the application was developed for the iPhone but is not discussed in this paper.

Figure 1

Simplified entity-relationship UML diagram. A simple UML diagram that specifies relationships between the main application objects. Multiple Notebooks belong to a Group, and multiple Groups belong to an institution. An institution has multiple topics and Flashcards. A Notebook may hold multiple topics that are associated with multiple Flashcards. Multiple topics can also belong to a broader topic. A Flashcard can be composed of one or two facts, up to two description items, up to four images and one to eight questions. Multiple questions can be associated with a Fact, Description or Image.
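
To make the relationships in Figure 1 concrete, the sketch below expresses them as plain Java classes. The class and field names are illustrative only; they mirror the diagram rather than the platform's actual schema or source code.

```java
// Illustrative sketch of the Figure 1 entities (not the ALERT STUDENT schema).
// Cardinalities follow the caption: a Flashcard holds 1-2 Facts, up to 2 Descriptions,
// up to 4 Images and 1-8 Questions; Notebooks group Topics, Topics group Flashcards.
import java.util.List;

class Institution {
    String name;
    List<Group> groups;             // multiple Groups belong to an institution
    List<Topic> topics;             // an institution has multiple topics and Flashcards
    List<Flashcard> flashcards;
}

class Group {
    String name;
    List<Notebook> notebooks;       // multiple Notebooks belong to a Group
}

class Notebook {
    String title;
    List<Topic> topics;             // a Notebook may hold multiple topics
}

class Topic {
    String meshHeading;             // topics are categorized using MeSH terms
    Topic parent;                   // multiple topics can belong to a broader topic
    List<Flashcard> flashcards;
}

class Flashcard {
    List<Fact> facts;               // one or two facts
    List<Description> descriptions; // up to two description items
    List<Image> images;             // up to four images
    List<Question> questions;       // one to eight questions, each tied to a piece
}

class Fact { String text; }
class Description { String text; }
class Image { String url; }
class Question { String prompt; Object answerPiece; } // the Fact, Description or Image it assesses
```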

Content structure

Content was required to be stored in reusable blocks that would allow building of higher order learning blocks as well as assessing knowledge. Knowledge assessment was carried out using open ended questions. The smallest learning block was named Flashcard, and was composed of information on one side and open ended questions on the other. Each Flashcard contained up to 8 knowledge pieces named Fact, Description and Image. Questions can be associated with each of these pieces individually, so each piece serves as the answer to one or more questions. Since content re-usability was paramount, a Flashcard categorization system was implemented using Medical Subject Headings (MeSH) from the United States National Library of Medicine.
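
As an illustration of these composition rules (the cardinalities are taken from the Figure 1 caption), the hypothetical helper below checks a Flashcard before it is stored. It reuses the illustrative classes from the previous sketch and is not part of the platform's API.

```java
// Hypothetical validation of the Flashcard composition rules: 1-2 Facts,
// up to 2 Descriptions, up to 4 Images, 1-8 Questions, each Question tied
// to the piece that answers it. Names are illustrative only.
final class FlashcardValidator {

    static void validate(Flashcard card) {
        require(card.facts.size() >= 1 && card.facts.size() <= 2,
                "a Flashcard holds one or two Facts");
        require(card.descriptions.size() <= 2, "at most two Description items");
        require(card.images.size() <= 4, "at most four Images");
        require(card.questions.size() >= 1 && card.questions.size() <= 8,
                "one to eight open ended Questions");
        for (Question q : card.questions) {
            require(q.answerPiece != null,
                    "each Question must point to the Fact, Description or Image that answers it");
        }
    }

    private static void require(boolean condition, String rule) {
        if (!condition) throw new IllegalArgumentException("Invalid Flashcard: " + rule);
    }
}
```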

Aggregation of Flashcards into higher order structures was required to achieve meaningful learning goals. This required creating custom aggregations of Flashcards from different MeSH topics, with topic and Flashcard order arranged according to the learning goal. We named these custom aggregations Notebooks.

In order for students and teachers to create and share content, Groups were created. Groups reside within institutions. Therefore, users from a given institution could access its Groups. A universal institution was created in order to allow all users to create and share content globally.

Learning tools

User information regarding study metrics needed to be collected for study management. Time spent studying and Perception of knowledge were the two metrics identified as required to meet this goal (Table 1). Perception of knowledge refers to the student's self-perception of how well knowledge could be recalled when an open-ended question is presented. These data allowed computation of Flashcard study priority levels. These metrics were collected and presented in different sections: one devoted to study - Study Mode; another devoted to self-assessment - Quiz Mode; and a section devoted to analysis of performance metrics per Notebook - Notebook Dashboard.

Table 1 Variables measured by the system
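
The system derives a color-coded study priority for each Flashcard from the metrics in Table 1, but the exact rule is not described here. The sketch below is therefore purely hypothetical: it simply pushes Flashcards with low recall ratings or no study time towards higher priority.

```java
// Purely hypothetical mapping from the two metrics in Table 1 to a study priority;
// the text states that such a priority is computed but does not give the formula.
enum Priority { HIGH, MEDIUM, LOW }

final class StudyPriority {
    /**
     * @param perceptionOfKnowledge 1 (poor recall) to 4 (full recall), from Quiz Mode ratings
     * @param minutesStudied        cumulative Time spent studying for the Flashcard
     */
    static Priority of(double perceptionOfKnowledge, double minutesStudied) {
        if (minutesStudied == 0 || perceptionOfKnowledge < 2.0) return Priority.HIGH; // never studied or poorly recalled
        if (perceptionOfKnowledge < 3.0) return Priority.MEDIUM;                      // partial recall
        return Priority.LOW;                                                          // well recalled
    }
}
```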

System usability and adoption surveys

System usability and feature usefulness of the Study Mode, Quiz Mode and Notebook Dashboard were assessed using a group of 48 students from the Faculty of Medicine of the University of Porto (FMUP) and two on-line self-report questionnaires. Students from the 4th and 5th years of the medical course were randomly selected and contacted by email to participate in the study.

The study consisted of 2 classroom sessions (S1, S2) in consecutive weeks, each lasting 1 hour. Each student was provided with a computer. The students were instructed to use the Study Mode, Quiz Mode and Notebook Dashboard to study and assess their knowledge on a Notebook about the Golgi Complex. The Notebook was created using pedagogical materials provided by the Department of Cellular and Molecular Biology of FMUP.

During S1, students had 10 minutes to register on the platform. A 2 minute explanation of how the Study Mode, Quiz Mode and Notebook Dashboard worked was given to students before they used the application, and all questions were clarified. The students then spent 20 minutes in Study Mode, 15 minutes in Quiz Mode and 5 minutes on the Notebook Dashboard. After that time the students completed an on-line survey regarding system usability and tool usefulness. Students left the room only after everyone had completed all tasks.

During S2, students spent equal amounts of time on the Study Mode, Quiz Mode and Notebook Dashboard. At the end of the session, the system usability and tool usefulness survey was completed again, along with an additional survey regarding willingness to adopt the system as a reference tool.

The 3 surveys consisted of a set of objective statements regarding personal experience. Student agreement with each of the items was assessed using a 4-point Likert scale: 1 - full disagreement; 2 - partial disagreement; 3 - partial agreement; 4 - full agreement.

A paired-sample t-test was used to compare differences in the system usability and tool usefulness survey answers between the two sessions. The significance level was set at 0.05.
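
The statistical software used for this analysis is not specified here. Purely as an illustration, a paired-sample t-test on per-student scores from the two sessions could be computed as follows with Apache Commons Math; the sample values are invented.

```java
// Illustration only: paired-sample t-test comparing S1 and S2 survey scores.
import org.apache.commons.math3.stat.inference.TTest;

public class SurveyComparison {
    public static void main(String[] args) {
        // Hypothetical per-student mean scores for one survey item (S1 vs S2).
        double[] session1 = {3.5, 3.0, 4.0, 3.5, 3.0};
        double[] session2 = {3.5, 3.5, 4.0, 3.0, 3.5};

        double pValue = new TTest().pairedTTest(session1, session2); // two-tailed p-value
        boolean significant = pValue < 0.05;                         // significance level fixed at 0.05
        System.out.printf("p = %.3f, significant difference: %b%n", pValue, significant);
    }
}
```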

This study was approved by the Faculty of Medicine University of Porto/São João Hospital Ethics Committee in compliance with the Helsinki Declaration.

Results and discussion

The platform was implemented as a free web application named ALERT STUDENT. Table 2 provides an outline of how learning object principles were implemented in the system and Table 3 provides detail on how several instructional design features were implemented.

Table 2 Implementation of learning object principles
Table 3 Implementation of instructional design principles

Groups

The application has a section devoted to Groups (Figure 2). This section consists of a page listing all Groups and specific Group pages. The list page allows browsing Groups by searching by name and tags and filtering by the institution they belong to. The Group page is divided into 4 sections: (a) a Group wall for posting and commenting; (b) a members' page where Group administrators can manage members; (c) a Notebook page that holds Notebooks and allows their creation or editing; (d) a Group profile section where non-members can see the Group summary.

Figure 2

User Groups screen. A list of Groups for a given user is displayed.

Groups allow a closed environment approach where students can interact with a defined set of users and content for a given learning goal. This is similar to the wiki or blog scenario where administrators limit registration and editing privileges to selected users [25]. Allowing Flashcards within a Group to be available to other Groups of the same institution facilitates content sharing within the institution. This helps to reduce content redundancy, allows faster content creation and allows new Notebooks to be created using previously studied Flashcards. This may lessen intrinsic cognitive load by reducing the exploratory component involved in learning new redundant materials, hence increasing learning performance [31].

Notebooks

Notebooks can be accessed through Group pages or through a global Notebook page. Both pages provide search and filter features (Figure 3). The Notebook Dashboard shows overall information and study statistics regarding personal study performance. Users can analyze Flashcard size and Time spent studying using a sunburst chart (Figure 4). A toggle button resizes each Flashcard representation to match either its character count or the time taken. A bar chart plots Perception of knowledge per topic in two series: one plots the user's Perception of knowledge while the other plots the mean peer Perception of knowledge. A line chart plots Perception of knowledge per quiz session in the same two series (Figure 4). The Notebook editor allows simultaneous creation of Notebooks by searching and selecting topics and Flashcards available to be part of a Notebook. New topics and Flashcards can be created as well. A graph of MeSH topic relationships is also displayed and can be used to browse topics (Figure 5).

Figure 3

User Notebooks. A list of the Notebooks for a given user is displayed.

Figure 4

Notebook Dashboard. The sunburst chart represents the topic and Flashcard distribution. The toggle button switches the configuration between Flashcard size (given by the number of characters) and Time spent studying on a Notebook. The bar chart on the left depicts Perception of knowledge per topic, for the user and their peers. The line chart on the right represents Perception of knowledge per quiz session for the user and their peers.

Figure 5

Notebook editor. Topics can be browsed in the left column on the search tab. Checked topics become part of the Notebook and become available on the notebook tab. The center column displays Flashcards for the selected topic. Checked Flashcards become part of the Notebook. New Flashcards can be created on any topic. On the right, MeSH relationships between topics are represented using a graph that can be used to navigate topics.

Flashcards allow content to be created in ways that match specific learning goals and can be reused with little effort to match other learning requirements. Though they are in accordance with the learning object principles of stand-alone use, reusability, interactivity and aggregation [23] (Table 2), the amount of context built into this type of learning object must be balanced so that it allows isolated usage in different settings as well as chaining with additional Flashcards in meaningful ways [26]. Enclosing little context in each Flashcard may lead to less articulated Notebooks.

Flashcards are supported by cognitive load theory. Small chunks of self-enclosed knowledge decrease intrinsic cognitive load. Additionally, since Notebooks are combinations of Flashcards, they can orient learning in a simple-to-complex strategy that further decreases intrinsic cognitive load [6, 9, 47]. Furthermore, this process can be extended by refactoring multiple Notebooks into smaller summary Notebooks containing the most relevant Flashcards, leveraging the same cognitive load principles further [47]. Performance data for overlapping Flashcards can be used to optimize study sessions in a new Notebook setting, which also applies the principles of learning object re-usability, interactivity and aggregation [47] (Table 2).

The charts allow students to take action on their study sessions based on Time spent studying and on personal and peer Perception of knowledge. Previous work has shown that feedback plays a key role in determining learning success [26]; hence, insight into performance metrics may help build motivation to learn further.

Study Mode

The Study Mode allows Notebook study in an adequate digital environment, which minimizes sources of distraction (Figure 6). The dark colors used in the interface contrast with the white Flashcards, creating focus on the area of interest. The center displays the Flashcards stacked as a continuous piece of text. On the side, the index of topics is displayed. This panel also provides study progress metrics such as percentage of Flashcards studied, number of study sessions, time taken per session, total Time spent studying and Time spent studying on the previous session. Flashcards can be flipped one at a time or altogether to reveal the questions. Flashcards have a button to increment Time spent studying and can be removed from the Quiz Mode assessment by folding the top left corner with a simple click. Additionally, Flashcards have a colored bar on the side that expresses Perception of knowledge. All tool menus are collapsible to prevent distractions. Available tools include filters for Flashcard priority and category, a timer, a stopwatch, notes and text highlighters. Other tools present the keyboard shortcut guide and allow exporting the Notebook in .pdf format.

Figure 6

Study Mode. The left column with circles represents the Notebook topic index; the blue circle represents the topic currently displayed. The top bar houses the content filters and progress status. Timers are also available but not shown. The bar on the right side is the actions bar, which houses buttons for Flashcard flipping, text marking, filter and timer toggles, pause mode, the keyboard shortcut list, print view and a shortcut to statistics. The third Flashcard displayed is flipped, showing questions and an answer.

In order to increase reading speed and comprehension and to reduce fatigue from screen reading, spaced lines with a mean length of 70 characters and a large window height were used, as recommended in previous studies [48, 49]. The ability to hide tools and the keyboard shortcuts further improve focus. Flashcard category and priority filters allow learning sessions to be tailored to personal goals effectively. These features may help reduce extraneous cognitive load related to content navigation tasks and interface visual noise [11, 47]. Flipping the Flashcard column provides a tailored “content-and-question” oriented study environment. The ability to resume study sessions from the point where they were last left off further reduces extraneous cognitive load by decreasing distance to the required point of focus [11, 47].

Quiz Mode

The Quiz Mode is the section devoted to self-assessment (Figure 7). It takes the Flashcards of a Notebook and selects a set of Flashcard questions that are presented one at a time. For each question the user should recall the required knowledge. Afterwards the user reveals the Flashcard section that answers the question and grades Perception of knowledge, the quality of the user's recall, using a 4-point Likert scale. After grading Perception of knowledge, the system shows another question. The student also has the option of reporting the Flashcard to the Group administrators when inaccuracies are found. The system displays student progress and the number of questions rated per grade. When the user finishes the Quiz, statistics about the Time spent studying in each session are presented. The student can also review the Flashcards for the questions with the lowest Perception of knowledge. Questions are chosen so that all Flashcard elements are assessed. If more than one question is available for a given content piece, the system will choose the hardest question if there are previous ratings, or will pick a question at random otherwise. Global Perception of knowledge for each Flashcard is computed as a weighted average of the Flashcard's Perception of knowledge over the last three sessions. The session Perception of knowledge for a Flashcard is calculated by averaging the results for every question answered for that Flashcard in that session.

Figure 7

Quiz Mode. A question card is shown along with its answer. Perception of knowledge is graded using the set of four buttons shown. The rightmost button allows reporting of errors to the Notebook owner. The column on the right tracks student progress.
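
A minimal sketch of the Quiz Mode scoring described above: question selection prefers the historically hardest question for a content piece when past ratings exist, session Perception of knowledge averages the ratings given to a Flashcard's questions, and the global value is a weighted average of the last three sessions. The recency weights (0.5/0.3/0.2) and all names are assumptions, since the exact parameters are not specified; Question is the illustrative type from the earlier data-model sketch.

```java
// Illustrative Quiz Mode scoring; parameters marked below are assumptions.
import java.util.*;

final class QuizScoring {

    // Pick one question per content piece: the historically hardest one
    // (lowest mean past rating) when ratings exist, otherwise a random one.
    static Question selectQuestion(List<Question> questionsForPiece,
                                   Map<Question, Double> meanPastRating,
                                   Random random) {
        return questionsForPiece.stream()
                .filter(meanPastRating::containsKey)
                .min(Comparator.comparingDouble(meanPastRating::get))
                .orElse(questionsForPiece.get(random.nextInt(questionsForPiece.size())));
    }

    // Session Perception of knowledge = mean of the 1-4 ratings given to the
    // Flashcard's questions in that session.
    static double sessionPerception(List<Integer> ratingsInSession) {
        return ratingsInSession.stream().mapToInt(Integer::intValue).average().orElse(0);
    }

    // Global Perception of knowledge = weighted average of the last three
    // session values, most recent first (weights are an assumption).
    static double globalPerception(List<Double> lastThreeSessions) {
        double[] weights = {0.5, 0.3, 0.2};
        double sum = 0, weightSum = 0;
        for (int i = 0; i < lastThreeSessions.size() && i < 3; i++) {
            sum += weights[i] * lastThreeSessions.get(i);
            weightSum += weights[i];
        }
        return weightSum == 0 ? 0 : sum / weightSum;
    }
}
```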

The Quiz Mode is essential for the system to compute Perception of knowledge. Because each Flashcard may have multiple questions regarding the same content piece, the Quiz Mode is able to use the questions with the lowest Perception of knowledge. This provides a means to assess knowledge using the questions that are most difficult, thereby tailoring assessment to memory retention needs. This is also in accordance with the intrinsic cognitive load strategy of low-to-high fidelity tasks, because as the student progresses, questions representing harder tasks will be preferentially selected [47]. Spaced repetition promotes the strengthening of long-term memory schemata acquired during previous contacts with the Flashcards. This reduces the number of elements that must be handled in working memory, thus reducing cognitive load and allowing additional focus on the recall process [47]. The way the user grades Perception of knowledge is, however, subject to affective factors. Users may feel inclined to overrate their Perception of knowledge, thus decreasing the beneficial effect of the system [50]. Although self-assessment questions have been shown to positively affect learning outcomes [16, 19, 50-53], it remains unknown whether self-reported evaluations correlate with exam grades. The primary goal of this question system is to allow self-assessment of simple recall questions. Integrated reasoning questions that require integration of multiple pieces of knowledge are a second and more important step that the authors intend to develop in the future.

This system implements other features, such as a content repository for FMUP students, the ability to present Notebooks as full-screen Flashcards, and a picture gallery; however, these are not presented here as their purposes are distinct from the goals of this work.

System usability and adoption surveys

The student participation rate was 100%, as all of the 48 students randomized to take part in this work agreed to participate. All students completed the two sessions. The scores for all items on the survey regarding system usability and tool usefulness (Tables 4 and 5) approached 3.5 (partial to full agreement) in both sessions, and overall there were no significant differences between sessions. Both surveys showed that students generally agreed that the tools provided were useful and simple and were willing to use them as a privileged element of their medical education.

Table 4 System usability and tool usefulness survey
Table 5 Willingness to adopt the system as a reference tool

Conclusions

Overall the application brings a new set of tools that may be helpful to organize knowledge in meaningful ways as well as to manage study sessions based on personal performance metrics. The system takes into consideration learning object design, instructional design guidelines and principles from cognitive learning theories. Specifically, the system allows students to: (1) create personal and reusable learning materials in a collaborative on-line environment; (2) self-assess their knowledge through spaced repetition of open ended questions; (3) view detailed feedback on their performance and progress; and (4) easily use this feedback for deliberate practice and to tailor future learning experiences.

Assessment of student performance on content presented through this system and direct comparison of learning outcomes against other learning tools and methods are the aims of future work. The development of these features is an important step towards bringing information management tools to support study decisions and improving learning outcomes.

Availability and requirements

Project name: ALERT STUDENT
Project home page: http://www.alert-student.com
Operating systems: Platform independent
Programming languages: HTML, CSS, JavaScript, Java, Oracle SQL
Other requirements: Internet Explorer 8+, Firefox, Google Chrome, Safari
License: Not open source
Any restrictions to use by non-academics: No restrictions

Abbreviations

ACT-R:

Adaptive character of thought

CSCL:

Computer supported collaborative learning

FMUP:

Faculty of Medicine of the University of Porto

HTML:

HyperText Markup Language

MeSH:

Medical Subject Headings

SVG:

Scalable Vector Graphics

S1:

Study session 1

S2:

Study session 2.

References

  1. Frenk J, Chen L, Bhutta ZA, Cohen J, Crisp N, Evans T, Fineberg H, Garcia P, Ke Y, Kelley P, Kistnasamy B, Meleis A, Naylor D, Pablos-Mendez A, Reddy S, Scrimshaw S, Sepulveda J, Serwadda D, Zurayk H: Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010, 376 (9756): 1923-1958. [http://www.ncbi.nlm.nih.gov/pubmed/21112623]


  2. Horton R: A new epoch for health professionals’ education. Lancet. 2010, 376 (9756): 1875-1877. [http://www.ncbi.nlm.nih.gov/pubmed/21112621]


  3. Patel VL, Cytryn KN, Shortliffe EH, Safran C: The collaborative health care team: the role of individual and group expertise. Teach Learn Med. 2000, 12 (3): 117-132. [http://www.ncbi.nlm.nih.gov/pubmed/11228898]


  4. Schwarz MR, Wojtczak A: Global minimum essential requirements: a road towards competence-oriented medical education. Med Teach. 2002, 24 (2): 125-129. [http://www.ncbi.nlm.nih.gov/pubmed/12098430]


  5. Kerfoot BP, Baker H, Jackson TL, Hulbert WC, Federman DD, Oates RD, DeWolf WC: A multi-institutional randomized controlled trial of adjuvant Web-based teaching to medical students. Acad Med. 2006, 81 (3): 224-230. [http://www.ncbi.nlm.nih.gov/pubmed/16501262]


  6. Patel VL, Yoskowitz NA, Arocha JF, Shortliffe EH: Cognitive and learning sciences in biomedical and health instructional design: a review with lessons for biomedical informatics education. J Biomed Inform. 2009, 42: 176-197. [http://dx.doi.org/10.1016/j.jbi.2008.12.002], [http://www.ncbi.nlm.nih.gov/pubmed/19135173]


  7. Sweller J, van Merrienboer JJ, Paas FG: Cognitive architecture and instructional design. Educ Psychol Rev. 1998, 10 (3): 251-296. [http://doc.utwente.nl/58655/]


  8. Morgulis Y, Kumar RK, Lindeman R, Velan GM: Impact on learning of an e-learning module on leukaemia: a randomised controlled trial. BMC Med Educ. 2012, 12: 36-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=3419126&tool=pmcentrez&rendertype=abstract]


  9. Dror I, Schmidt P, O’connor L: A cognitive perspective on technology enhanced learning in medical training: great opportunities, pitfalls and challenges. Med Teach. 2011, 33 (4): 291-296. [http://www.ncbi.nlm.nih.gov/pubmed/21456986]


  10. van Merriënboer JJG, Sweller J: Cognitive load theory in health professional education: design principles and strategies. Med Educ. 2010, 44: 85-93. [http://www.ncbi.nlm.nih.gov/pubmed/20078759]


  11. Mayer RE: Applying the science of learning to medical education. Med Educ. 2010, 44 (6): 543-549. [http://www.ncbi.nlm.nih.gov/pubmed/20604850]


  12. Choules AP: The use of elearning in medical education: a review of the current situation. Postgrad Med J. 2007, 83 (978): 212-216. [http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2600032&tool=pmcentrez&rendertype=abstract]


  13. Khalil MK, Paas F, Johnson TE, Payer AF: Interactive and dynamic visualizations in teaching and learning of anatomy: a cognitive load perspective. Anat Rec B New Anat. 2005, 286: 8-14. [http://www.ncbi.nlm.nih.gov/pubmed/16177993]


  14. Kerfoot BP, Baker H: An online spaced-education game to teach and assess residents: a multi-institutional prospective trial. J Am Coll Surg. 2012, 214 (3): 367-373. [http://www.ncbi.nlm.nih.gov/pubmed/22225647]


  15. Shaw T, Long A, Chopra S, Kerfoot BP: Impact on clinical behavior of face-to-face continuing medical education blended with online spaced education: a randomized controlled trial. J Contin Educ Health Prof. 2011, 31 (2): 103-108. [http://www.ncbi.nlm.nih.gov/pubmed/21671276]


  16. Kerfoot BP, Fu Y, Baker H, Connelly D, Ritchey ML, Genega EM: Online spaced education generates transfer and improves long-term retention of diagnostic skills: a randomized controlled trial. J Am Coll Surg. 2010, 211 (3): 331-337.e1. [http://www.ncbi.nlm.nih.gov/pubmed/20800189]


  17. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM: Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med. 2010, 85 (5): 909-922. [http://www.ncbi.nlm.nih.gov/pubmed/20520049]


  18. Kerfoot BP, DeWolf WC, Masser BA, Church PA, Federman DD: Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial. Med Educ. 2007, 41: 23-31. [http://www.ncbi.nlm.nih.gov/pubmed/17209889]


  19. Cook DA, Thompson WG, Thomas KG, Thomas MR, Pankratz VS: Impact of self-assessment questions and learning styles in Web-based learning: a randomized, controlled, crossover trial. Acad Med. 2006, 81 (3): 231-238. [http://www.ncbi.nlm.nih.gov/pubmed/16501263]


  20. Harden RM, Gessner IH, Gunn M, Issenberg SB, Pringle SD, Stewart A: Creating an e-learning module from learning objects using a commentary or ‘personal learning assistant’. Med Teach. 2011, 33 (4): 286-290. [http://www.ncbi.nlm.nih.gov/pubmed/21456985]


  21. Clark R, Mayer R: Applying the segmenting and pretraining principles: managing complexity by breaking a lesson into parts. E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia. 2010, San Francisco: Jossey-Bass, 207-218.


  22. Masters K, Ellaway R: e-Learning in medical education Guide 32 Part 2: technology, management and design. Med Teach. 2008, 30 (5): 474-489. [http://www.ncbi.nlm.nih.gov/pubmed/18576186]


  23. Ruiz JG, Mintzer MJ, Issenberg SB: Learning objects in medical education. Med Teach. 2006, 28 (7): 599-605. [http://www.ncbi.nlm.nih.gov/pubmed/17594550]


  24. Kim S, Song SM, Yoon YI: Smart learning services based on smart cloud computing. Sensors. 2011, 11 (8): 7835-7850. [http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=3231729&tool=pmcentrez&rendertype=abstract]


  25. Boulos MNK, Maramba I, Wheeler S: Wikis, blogs and podcasts: a new generation of Web-based tools for virtual collaborative clinical practice and education. BMC Med Educ. 2006, 6: 41-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1564136&tool=pmcentrez&rendertype=abstract]


  26. Martinez M: Designing learning objects to personalize learning. PhD thesis, Bloomington IL 2002

  27. Beux PL, Fieschi M: Virtual biomedical universities and e-learning. Int J Med Inform. 2007, 76 (5–6): 331-335. [http://www.ncbi.nlm.nih.gov/pubmed/17407747]


  28. Ruiz JG, Mintzer MJ, Leipzig RM: The impact of E-learning in medical education. Acad Med. 2006, 81 (3): 207-212. [http://www.ncbi.nlm.nih.gov/pubmed/16501260]


  29. Patel VL, Yoskowitz NA, Arocha JF: Towards effective evaluation and reform in medical education: a cognitive and learning sciences perspective. Adv Health Sci Educ Theory Pract. 2009, 14 (5): 791-812.


  30. Harden R: The virtual learning environment in medical education - past, present and future. Medical Education: The State of the Art. Edited by: Salerno-Kennedy R, O’Flynn S. 2010, Hauppauge, NY: Nova Science Publishers Inc., 1-10.


  31. McKendree J: Understanding Medical Education. 2010, Oxford: Wiley-Blackwell, [http://doi.wiley.com/10.1002/9781444320282]


  32. Bahner DP, Adkins E, Patel N, Donley C, Nagel R, Kman NE: How we use social media to supplement a novel curriculum in medical education. Med Teach. 2012, [http://informahealthcare.com/doi/abs/10.3109/0142159X.2012.668245]


  33. Eysenbach G: Medicine 2.0: social networking, collaboration, participation, apomediation, and openness. J Med Internet Res. 2008, 10 (3): e22-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2626430&tool=pmcentrez&rendertype=abstract]


  34. Kind T, Genrich G, Sodhi A, Chretien KC: Social media policies at US medical schools. Med Educ Online. 2010, 15: [http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2941429&tool=pmcentrez&rendertype=abstract]


  35. Chretien KC, Greysen SR, Chretien JP, Kind T: Online posting of unprofessional content by medical students. JAMA. 2009, 302 (12): 1309-1315. [http://www.ncbi.nlm.nih.gov/pubmed/19773566]


  36. McGrath RG: Exploratory learning, innovative capacity and managerial oversight. Acad Manag J. 2001, 44: 118-131. [http://www.jstor.org/stable/3069340]


  37. Koops W, Van der Vleuten C, De Leng B, Oei SG, Snoeckx L: Computer-supported collaborative learning in the medical workplace: students’ experiences on formative peer feedback of a critical appraisal of a topic paper. Med Teach. 2011, 33 (6): e318-e323. [http://www.ncbi.nlm.nih.gov/pubmed/21609168]


  38. Chan CH, Robbins LI: E-Learning systems: promises and pitfalls. Acad Psychiatry. 2006, 30 (6): 491-497. [http://www.ncbi.nlm.nih.gov/pubmed/17139020]


  39. Curran VR, Fleet L: A review of evaluation outcomes of web-based continuing medical education. Med Educ. 2005, 39 (6): 561-567. [http://www.ncbi.nlm.nih.gov/pubmed/15910431]


  40. Paton C, Bamidis PD, Eysenbach G, Hansen M, Cabrer M: Experience in the use of social media in medical and health education. Contribution of the IMIA social media working group. Yearb Med Inform. 2011, 6: 21-29. [http://www.ncbi.nlm.nih.gov/pubmed/21938320], [http://repository.usfca.edu/nursing_fac/6]


  41. Hannig A, Kuth N, Özman M, Jonas S, Spreckelsen C: eMedOffice: a web-based collaborative serious game for teaching optimal design of a medical practice. BMC Med Educ. 2012, 12: 104-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=3506465&tool=pmcentrez&rendertype=abstract]


  42. Triola MM, Holloway WJ: Enhanced virtual microscopy for collaborative education. BMC Med Educ. 2011, 11: 4-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=3037351&tool=pmcentrez&rendertype=abstract]


  43. Al-Jasmi F, Moldovan L, Clarke JTR: Hunter disease eClinic: interactive, computer-assisted, problem-based approach to independent learning about a rare genetic disease. BMC Med Educ. 2010, 10: 72-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2987933&tool=pmcentrez&rendertype=abstract]


  44. Nilsson M, Bolinder G, Held C, Johansson Bl, Fors U, Ostergren J: Evaluation of a web-based ECG-interpretation programme for undergraduate medical students. BMC Med Educ. 2008, 8: 25-[http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2394519&tool=pmcentrez&rendertype=abstract]


  45. John NW: The impact of Web3D technologies on medical education and training. Comput Educ. 2007, 49: 19-31. [http://linkinghub.elsevier.com/retrieve/pii/S0360131505000825]


  46. Mayer RE: Applying the science of learning to medical education. Med Educ. 2010, 44 (6): 543-549. [http://www.ncbi.nlm.nih.gov/pubmed/20604850]


  47. van Merriënboer JJG, Sweller J: Cognitive load theory in health professional education: design principles and strategies. Med Educ. 2010, 44: 85-93. [http://www.ncbi.nlm.nih.gov/pubmed/20078759]


  48. Dyson M: Exploring the effect of layout on reading from screen. Electronic Publishing, Artistic Imaging, and Digital. 1998, [http://www.springerlink.com/index/K44N3J9L22777K1X.pdf]


  49. Dillon A, McKnight C: Reading from paper versus reading from screen. Comput J. 1988, [http://comjnl.oxfordjournals.org/content/31/5/457.short]


  50. Sitzmann T, Ely K, Brown KG, Bauer KN: Self-assessment of knowledge: a cognitive learning or affective measure?. Acad Manag Learn Educ. 2010, 9 (2): 169-191.


  51. Kerfoot BP, Shaffer K, McMahon GT, Baker H, Kirdar J, Kanter S, Corbett EC, Berkow R, Krupat E, Armstrong EG: Online “spaced education progress-testing” of students to confront two upcoming challenges to medical schools. Acad Med. 2011, 86 (3): 300-306. [http://www.ncbi.nlm.nih.gov/pubmed/21248600]


  52. Kerfoot BP, Brotschi E: Online spaced education to teach urology to medical students: a multi-institutional randomized trial. Am J Surg. 2009, 197: 89-95. [http://www.ncbi.nlm.nih.gov/pubmed/18614145]


  53. Kerfoot BP: Interactive spaced education versus web based modules for teaching urology to medical students: a randomized controlled trial. J Urol. 2008, 179 (6): 2351-2356. discussion 2356–2357 [http://www.ncbi.nlm.nih.gov/pubmed/18423715]




Acknowledgments

The project development was funded by the Programa Sistema de Incentivos à Investigação e Desenvolvimento Tecnológico (SI I&DT), project no. 6576. The funding source had no intervention in any phase of the development of the system or in the writing of this manuscript. We would like to thank the students who took part in the study.

Author information


Corresponding author

Correspondence to Tiago Taveira-Gomes.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

TTG conceived, designed and implemented the system, designed the study and wrote the manuscript. AS conceived and designed the system and wrote the manuscript. MJG oversaw and approved the overall operation for the system development. MS designed the study, performed the statistical analysis and revised the manuscript. MAF oversaw and approved the study design, and revised the manuscript. All authors read and approved the final manuscript.


Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Taveira-Gomes, T., Saffarzadeh, A., Severo, M. et al. A novel collaborative e-learning platform for medical students - ALERT STUDENT. BMC Med Educ 14, 143 (2014). https://doi.org/10.1186/1472-6920-14-143


