Abstract

Aim. To explore students’ perception of theoretical, preclinical, and clinical assessment methods and to analyze their level of satisfaction, with the final goal of producing recommendations to address the weaknesses identified. Material and Methods. A descriptive, cross-sectional survey on students’ perception, at the end of their initial training, of learning assessment methods was carried out by a doctoral student at the Faculty of Dentistry of Casablanca. Results. 51.8% of surveyed students said they had not been informed of the criteria required to pass the exams, 35.7% of students felt that adding continuous assessment to the final test would be beneficial for them, and 45.1% of them proposed a frequency of one assessment per month. According to them, this system would allow them to stay up to date, to manage their time and knowledge better, and to receive feedback enabling them to check and improve their skills. The assessment system for practical activities is considered adequate by 92% of the surveyed students. For the majority of students, clinical internship assessment focuses on the number of procedures. Conclusion. Assessment methods influence students’ learning. They allow teachers to monitor students’ productivity, attitude, and work quality, so that gaps can be identified in a timely manner at every level of the learning process. According to the students’ perception, theoretical evaluation should be better adapted to the learning objectives, practical work assessment is quite satisfactory, and clinical evaluation is based mainly on quantitative criteria.

1. Introduction

Learning assessment is a fundamental step in the educational process of any university because it makes it possible to verify a student’s acquired knowledge before validating his or her final certification.

It is also the concern of any teacher wishing to ensure that the expected skills have been acquired.

For students, it is the best way to identify their misconceptions, correct them, and guide their learning strategies [1].

The challenges of learning assessment are numerous, as are its tools and procedures [2].

Effective learning assessment tools should be able to judge students’ progress fairly and objectively, whatever the field [3, 4].

In medical sciences in general, and in dentistry in particular, this evaluation mainly covers three aspects: theoretical, practical (or preclinical), and clinical. Several parameters come into play, among them the diversity of trainers and training environments, the nature and complexity of the skills targeted at the end of the program, and the high societal expectations of the profession. These factors impose specific requirements on evaluators [5].

The dental medicine course in Casablanca consists of five years of study organized in two cycles. The first, two-year cycle is designed to provide students with training in basic medical sciences and preclinical odontological sciences, in addition to an introduction to odontotechnics. The second, three-year cycle is dedicated to odontological training proper and leads to the award of the Doctorate of Dentistry.

The curriculum is delivered in the form of lectures and guided and practical sessions. Clinical internships start in the 4th year and take place in the dental consultation and treatment centers.

The evaluation covers theoretical, practical, and clinical education [6].

Theoretical instruction is assessed through a written exam each semester. Different assessment tools are used at the teacher’s discretion, such as short-answer questions, clinical cases, multiple-choice questions, and synthesis questions.

The evaluation of practical teaching is based on assessing the student’s mastery of the requested procedures; it is both formative and summative.

As for clinical teaching, the evaluation focuses on quantitative criteria (the number of procedures performed) and qualitative criteria (such as hygiene and asepsis, organization of the work field, theoretical knowledge, and gestural skills). The tools used can include clinical procedure quotas, clinical case presentations, clinical interviews, and objective structured clinical examinations.

Unlike the literature on teaching quality assessment, the literature on the perception of learning assessment methods, particularly in dentistry, remains scarce.

It is in this context that a descriptive study was carried out within the Faculty of Dentistry of Casablanca. Its aim was to explore students’ perception of theoretical, practical, and clinical assessment methods and to analyze their level of satisfaction, with the final goal of producing recommendations to address the weaknesses identified.

2. Methods and Tools

Between December 2016 and January 2017, a descriptive, cross-sectional survey on students’ perception, at the end of their initial training, of learning assessment methods was carried out at the Faculty of Dentistry of Casablanca.

A questionnaire was designed, comprising three major parts:

The first part deals with the overall perception of theoretical assessment: evaluation methods in general, the most appreciated method, and the need to supplement the final exam with continuous assessment.

The second part focuses on the perception of practical work assessment: the values measured by the evaluation system, the perception of continuous assessments, and the usefulness of practical work for the preclinical internship.

The last part concerns the perception of clinical assessment; it focuses on the values measured by the evaluation system and on students’ perception of how reliably the evaluation methods measure their performance.

The questionnaires were distributed to all 5th-year students and doctoral students of the 2016/2017 academic year; their list was provided by the Department of Student Affairs.

To reach all students, contact was established with 5th-year students at their clinical internship sites and with doctoral students via an online questionnaire.

A focus group of around 15 students was organized. Its purpose was to refine the survey results by gathering students’ opinions on concepts likely to be misunderstood, collecting subjective data, and eliciting proposals from the students. The questions were open-ended, and no predefined list of answer choices was provided to the students.

Data were entered and analyzed using SPSS software.

3. Results

Of the 248 questionnaires distributed, 80.2% were completed and returned: 55% of the respondents were 5th-year students, and the others were doctoral students.

3.1. Overall Perception of Theoretical Knowledge Assessment

Communication of Success Criteria. 51.8% of surveyed students said they had not been informed of the criteria required to pass the exams, while 48.2% responded that they had been. The results of the students' overall perception of theoretical knowledge assessment are presented in Table 1.

3.2. Perception of Practical Work Assessment

The assessment system for practical activities is considered adequate by 92% of the surveyed students. Other parameters of the students’ perception of practical work are shown in Table 2.

3.3. Perception of Clinical Internship Assessment

For the majority of students, clinical internship assessment focused on the number of procedures (64.3% in Conservative Odontology, 71.4% in Periodontology, and 71.4% in the Emergency Department), as well as on knowledge, the quality of procedures, or professionalism. According to our students, professionalism is mainly evaluated during the Dentofacial Orthodontics internship (40.2%).

The focus group participants believe that professionalism is measured by attendance, behavior towards supervisors, and dress that reflects the physician’s image.

The criteria measured through clinical assessment by service are presented in Table 3.

The percentage of students who consider that the evaluation methods adopted in clinical placements provide reliable information on their performance is 47.2% in Surgical Odontology, 46.7% in Periodontology, and 37.7% in Pedodontics (Table 4).

4. Discussion

Evaluation practices have evolved considerably throughout the world in order to best meet the quality and professionalism requirements of the future dentist. At the Faculty of Dentistry of Casablanca, the methods and criteria for passing examinations are defined by the institution and have been codified by law (Articles 20, 21, and 23 of the Decree of February 15, 1993).

4.1. Assessment of Theoretical Knowledge

Although this information is distributed by administrative staff and student representatives, directly or via the student guide and website, the communication of success criteria is perceived as lacking by 51.8% of the surveyed students.

Students who say that the nature of the tests is not communicated at the beginning of the teaching sequence represent 54.3%.

These percentages can be partly explained by a high level of absenteeism, noted from the beginning of the year.

According to WHO standards [7], it remains the faculty’s duty to define, describe, and communicate the methods used to evaluate its students.

Clinical case resolution is the assessment method preferred by 69.3% of the surveyed students.

This contrasts with the result found during the faculty’s 2008 accreditation (according to the standards of the World Federation for Medical Education (WFME)), in which the multiple-choice question system was the most appreciated. The difference can be explained by the fact that the 2008 study concerned all students, at all levels.

Students at the end of the cycle mainly choose a method that evaluates their clinical reasoning. This type of test is a prime tool for exploring the reasoning process [8].

Although multiple-choice questions are one of the most frequently used assessment methods in dental schools, many of these assessments are still based on questions of low cognitive level (i.e., testing students’ ability to understand and remember) [9].

35.7% of students felt that integrating continuous assessment in addition to the final test would be beneficial for them, and 45.1% of them proposed a frequency of one assessment per month. According to them, this system would allow them to stay up to date, to manage their time and knowledge better, and to receive feedback enabling them to check and improve their skills.

This was confirmed by a survey conducted at a faculty of dentistry in India [10], where students who voluntarily enrolled in online formative assessments during the first and second semesters obtained higher scores, with statistically significant differences.

Another study, conducted at the Michigan School of Dentistry, showed that one of the most common suggestions made by students was that professors design more in-depth evaluations when they do so together with their students than when they conceive them alone. Co-designed questions proved to be an effective tool for building critical thinking and improving students’ results compared with questions designed by teachers alone, and this approach seems extremely useful [11].

In addition, a working group of 14 dental schools [12] addressed the issue of the ideal academic environment as seen by their students, and a number of improvements were suggested, including:

(i) Establishing clear and carefully planned assessment objectives
(ii) Conducting regular, on-demand assessments that stimulate active learning and reduce final exam anxiety
(iii) Providing immediate feedback to students

4.2. Practical Work Assessment

The main shortcoming identified is that continuous assessment marks are not justified to the students.

In some practical sessions, teachers do not have evaluation sheets that explain the marks given to students. This is why the creation of better-developed grids is essential. They could include the evaluation criteria, each rated as perfect, good, acceptable, or nil.

Diemer in 2005 [13] proposed the addition of photographs and/or videos to these validation criteria.

An evaluation grid for endodontic practical work, together with visual support, also seemed to help students in Toulouse develop their skills more quickly and achieve a professionally satisfactory result, as shown by a study carried out between 1999 and 2002 [14].

4.3. Clinical Internship Assessment

According to Diemer et al., clinical internships represent the ideal setting for skills acquisition, as they allow the transfer and, above all, the mobilization of knowledge, ultimately encouraging clinical and practical reasoning [13].

According to our students’ perception, almost all departments evaluate first the number of procedures and then knowledge, professionalism, or the quality of procedures. This may further increase students’ stress, as already shown by the 2015 study on the value of implementing mentoring for students at the Faculty of Dentistry of Casablanca [15]. Its results demonstrated that the transition to the 4th and 5th years is a major source of stress, and quota validation was among the factors mentioned [15, 16].

A qualitative study based on interviews with final-year dental surgery students at the University of Toulouse in 2015-2016 confirmed the negative effects of quantitative clinical evaluation, such as patient abandonment or modification of the treatment plan. Assessment based on the complete treatment of a patient is considered more humane, less stressful, and more effective. New student- and patient-centred clinical assessment methods must be developed to support student competencies [17].

In the focus group, students felt that professionalism is limited to attendance, behavior, and dress that reflect the physician’s image.

According to Pelaccia [4], five attributes characterize caregivers who demonstrate professionalism:

(i) Development and maintenance of skills throughout the career
(ii) Relational abilities
(iii) Collaborative practice skills
(iv) Professional integrity and ethics
(v) Partnership with the patient

On the other hand, according to our students, the assessment methods used in the Surgical Odontology and Pedodontics departments provide information on their actual performance for only 47.2% and 37.7% of them, respectively.

Assessing the quality of clinical procedures must be an integral part of the final mark given to the internship, alongside the quota imposed on students.

To reach this goal, the Faculty of Dentistry of Casablanca has, since 2010, implemented an online evaluation system [18] inspired by the one used at the University of Montreal [19]. It is an easy-to-use tool that addresses clinical evaluation issues and provides a strategy for the qualitative evaluation of competencies considered essential, in addition to the number of procedures to be performed.

Students can also benefit from rapid and more effective feedback based on known and well-defined uniform criteria, and they can have better control over their clinical progress.

Without such feedback, students do not know how much they are progressing during the clinical internship, which can lead to a mismatch between students’ and teachers’ perceptions of competency levels. As a result, students may tend to overestimate their skill levels and be confident in performance that does not always meet their supervisors’ expectations.

However, this system can still be improved: it does not record, for example, the time taken to perform a given procedure. It also requires genuine involvement and availability from supervisors so that they can fill in the form in real time.

5. Conclusion

The evaluation methods for theoretical, preclinical, and clinical teaching adopted by the Faculty of Dentistry of Casablanca are numerous. No method or tool is good or bad in itself; what matters is that it be fundamentally consistent with pre-established pedagogical choices and objectives.

Our study shows that the majority of end-of-course students believe that the methods used to evaluate theoretical knowledge should be modified, that the system for evaluating practical work is perceived as satisfactory by almost all of them, and that the evaluation of clinical placements focuses mainly on the number of procedures.

We hope to have provided teachers with useful data for improving evaluation methods, in particular by integrating continuous assessment into theoretical evaluation, by justifying practical work marks, and by placing greater emphasis on know-how and interpersonal skills in the clinical internship.

Improving assessment practices requires the commitment of both teachers and students. When the latter see the results, they will be more likely to commit.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.