
Predictive validity of the National Benchmark Test and National Senior Certificate for the academic success of first-year medical students at one South African university

Abstract

Background

South African medical schools use the results of the National Senior Certificate (NSC) examination for selecting students. Five of the nine medical schools also use the National Benchmark Test (NBT). The University of the Witwatersrand weights the NSC and NBT results equally in the selection process. This study addresses the predictive validity of the NBT and NSC for academic success. The association between the NBT proficiency levels and students’ progression outcomes was also investigated.

Methods

Data obtained from the University’s Business Intelligence Services for 1652 first-year medical students from 2011 to 2017 were analysed using hierarchical regression models and chi-square tests. The three NBT domains and four of the NSC subjects were the independent variables in the regression models, with the first-year grade point average for students who passed the first year as the dependent variable. The NBT performance levels and first-year progression outcome (passed, failed, or cancelled) were used in the chi-square analysis. Frequency tables were used to describe the cohort’s demographic details and NBT results. Crosstabs were used to analyse student performance according to the school quintile system.

Results

The three NBT domains explained a statistically significant 26% of the variance, R² = 0.263, F(3, 1232) = 146.78, p < 0.001. When the NSC subjects (Life Sciences, English, Mathematics, and Physical Science) were added to the regression equation, they accounted for an additional 19% of the variance, R² change = 0.188, F(3, 1229) = 137.14, p < 0.001. Together, all independent variables explained 45% of the variance, R² = 0.451, F(6, 1229) = 166.29, p < 0.001. A strong association between the NBT proficiency levels and first-year students’ progression outcomes was observed.

Conclusion

The NBT results, when weighted equally with the NSC results, explained more variance than the NSC alone in predicting academic success in the first year of the medical degree. The NBT should not only be used for selecting medical students but also to place students with lower entry-level skills in appropriate foundation programmes and to identify students admitted to regular programmes who may need additional support.


Background

The first-year performance of medical students is considered to have a profound influence on their future academic progress [1]. A successful first year promotes the development of a positive attitude, self-confidence, and a commitment to their studies [2]. Admission tests are used to identify students with cognitive abilities to cope with the intellectual demands of the medical programme and non-cognitive attributes to assimilate the ethical, inter-relational, and motivational challenges [3, 4]. All nine South African medical schools use the results for selected subjects from the National Senior Certificate (NSC) in their selection process. The NSC is written in the final year (Grade 12 or the matric year) of the Further Education and Training level.Footnote 1 Five medical schools use both the NSC results and the National Benchmark Test (NBT) to select students [3].

The NSC results represent the extent to which a student has met the requirements for grades R–12 [5], based on the final examinations. For each subject, students are allocated rating codes from 1 (not achieved) to 7 (outstanding achievement), which correspond to a range of scores expressed as percentages (see Table 1). Based on their NSC results, students obtain either a Higher Certificate, National Diploma, or Bachelor Degree pass (see Table 2), which determines the higher education programmes they are eligible to pursue.

Table 1 NSC rating codes and descriptions [5]
Table 2 NSC outcome descriptors [6]

Higher Education South AfricaFootnote 2 commissioned the NBT in 2005 to complement the NSC results and to provide universities with information about first-year university students’ entry-level skills [7, 8]. The NBTs are written in the final year (grade 12) of the Further Education Training level by prospective university entrants, depending on the admission requirements of the degree they intend to study. Universities use the NBT results to select and place students in programmes for which they qualify. The NBT assesses students in three domains: NBT Mathematics (NBT MAT), NBT Academic Literacy (NBT AL) and NBT Quantitative Literacy (NBT QL). The NBT results provide information about students’ abilities and skills that could predict their future academic performance [9]. The NBT AL tests academic reading and reasoning abilities, aiming to examine students’ abilities to engage successfully with the language demands of higher education [10, 11]. The NBT QL examines students’ abilities to solve quantitative problems (mathematical and statistical) in areas relevant to higher education programmes [11]. The NBT MAT is designed to assess students’ understanding of mathematical concepts from the school Mathematics curriculum that are needed for disciplines such as mathematics, physics, and chemistry [11, 12]. The NBT MAT is written by those students who intend to apply for programmes that require Mathematics [13]. Students’ performance in each domain of the NBT is categorised into four levels (proficient, intermediate upper, intermediate lower, and basic), for universities to make selection choices (Table 3).

Table 3 NBT benchmarks set in 2015 for degree purposes [14]

The NSC is norm-referenced while the NBT is criterion-referenced [15]. Norm-referenced tests differ from criterion-referenced tests in their purpose, content selection, and nature of scoring, which influences how the results are interpreted [16]. The NSC is designed to sort and rank students, and the NBT is intended to show students’ performance in clearly defined domains that require mastery [16, 17].

The South African higher education system is afflicted by low throughput rates and increasing failure and dropout rates [11, 18, 19]. The poor university throughput rate could be partially explained by the inequities that persist in the Basic Education system. Several authors, e.g. Prince [15], Walton, Bowman and Osman [20], and Maringa and Osman [21], have described South African Basic Education as effectively consisting of two schooling systems. The two disparate systems are believed to contribute to racialised representation and throughput at university [20,21,22].

South African schools are classified using a quintile system based on socioeconomic indicators, including the average income and unemployment rates in a school’s geographical location [23]. Quintile one is the poorest quintile, quintile two is the second-poorest quintile, and so on [24]. Each quintile encompasses one-fifth of the learners enrolled in public ordinaryFootnote 3 schools [24]. Teachers with the most substantial content knowledge tend to teach in quintile 5 schools [22]. Initially intended to provide funding for schools in lower socio-economic areas, the quintile system introduced by the Department of Education in 2006 is regarded as having reinforced socioeconomic inequalities [23]. The less affluent schools in quintiles 1, 2, and 3, which include many of the former BlackFootnote 4 and Coloured schools, tend to produce learners with limited abilities in reading, writing, and numeracy [20, 25], while the schools in quintiles 4 and 5, largely made up of the former White and Indian schools, produce more university entrants and graduates [20,21,22].

The Faculty of Health Sciences at the University of the Witwatersrand (Wits University) ranks applicants according to a composite index (CI) that weights the NBT and NSC results equally [9]. The four other South African medical schools that use the NBT for selection purposes use NBT weightings of 30% and 40% [3]. The 50% NSC contribution to the CI used at Wits University is derived from applicants’ marks in English, Mathematics, Physical Science or Life Sciences, and the subject with the highest mark of the remaining subjects, excluding Life Orientation. All three of the NBT domains are used to calculate the CI.

In response to government imperatives to produce more healthcare professionals and to reflect the country’s demographics, South African medical schools have adjusted their admission requirements [3]. Currently, 40% of the places available in the Bachelor of Medicine and Bachelor of Surgery (MBBCh) degree at Wits University are reserved for top-performing students. The remaining 60% is divided equally into three categories: top-performing rural students, top-performing students from quintile 1 and 2 schools, and top-performing Black and Coloured students. Wits University introduced rurality as a selection criterion in 2015.

The teaching programme for the six-year MBBCh degree at Wits University is divided into clinical and pre-clinical years [4]. The medical curriculum is shown in Table 4. The first 2 years focus on the basic sciences [4]. The third and fourth years are the beginning of the clinical years, structured within integrated system-organ blocks [9]. The fifth and sixth years are the clinical years during which students are allocated to clinical clerkships in four academic hospitals [9].

Table 4 Medical curriculum at Wits University

Tutoring programmes are available to students across all years of study who need help with individual subjects. In addition, the Faculty’s Office of Student Success (OSS) identifies students in need of support through early needs identification and whole-class identification. The OSS offers individualised learning skills sessions, peer-tutoring for high-risk subjects, interventions for students from low-resourced school backgrounds, and mental health and wellness support.

Given the need to improve the retention and throughput of South African medical students, it is imperative to understand the predictive validity of the NBT and NSC results as selection tools for entry into medical programmes. While many studies have investigated the predictive validity of the NSC [8, 26, 27] and the NBT [7, 9], and the combined predictive validity of the NBT and the NSC [15, 28] for student performance, such research has not been conducted for first-year South African medical students at Wits University. Understanding the link between students’ performance in the NBT and the first year of the MBBCh degree will provide a measure of whether the combination of the NSC and NBT can discriminate between students with the potential to succeed academically and those without, to identify students who will require a foundation programme, and to identify academically at-risk students who may need additional support in the regular MBBCh programme. Focusing on first-year medical students enrolled at Wits University between 2011 and 2017, this study investigated the proportion of the variance in the academic success (passing the first year at the first attempt) explained by the three NBT domains and the four NSC subjects used in the admissions process. The study also sought to identify which of the NBT domains and NSC subjects used for selecting students are significant predictors for academic success. The study further explored the association between the NBT performance levels and students’ first-year progression outcome (passed, failed, or cancelled). Lastly, the link between students’ academic performance and school quintiles was investigated. The school quintiles were used as a proxy for socio-economic status, to understand how the disparities existing in the South African education systems influence students’ academic performance.

Methods

This retrospective study analysed data for 1652 students registered for the MBBCh degree between 2011 and 2017, obtained from the Business Intelligence Services unit at Wits University. The data were analysed using IBM SPSS V25.0.

Frequency tables were used to describe the cohort’s demographic details and NBT performance. Crosstabs were used to understand the differences in first-year students’ progression outcomes (passed, failed, or cancelled) by school quintile.

A hierarchical regression was used to explore the amount of the variance in academic success explained by the three NBT domains while controlling for the four NSC subjects used for selection purposes. The independent variables were the results of the three NBT domains and the four NSC subjects. The dependent variable was the first-year grade point average of students who passed. The grade point average is derived from the marks allocated for the first-year subjects in the MBBCh degree (see Table 4).

The hierarchical regression model included 1236 of the 1652 students in the study cohort who met the criteria for inclusion; namely, they had NBT results, NSC results for the required subjects, and had passed the first year. Of the 416 students who were removed from the regression analysis, 10 did not have NBT results, 157 did not have NSC results [Life Sciences (140), Physical Sciences (15) and English (2)], and 234 had either discontinued the first year of study (89) or failed the year (145). Before interpreting the results, the data were examined against the assumptions of hierarchical regression (normality, linearity, intercorrelations, homoscedasticity and Mahalanobis distance) [29,30,31], and 15 outliers (students who fell above or below the interquartile range) were removed from the data used in the regression. ANOVA F statistics were used to confirm the predictive utility of the entire model. The R² coefficients, which represent the proportion of the variance in the dependent variable explained by the model, were interpreted and reported for the two models. The unstandardised (B) and standardised (β) regression coefficients and significance levels for the unique contribution of each predicting variable were examined to identify which variables were significant predictors of academic success in the first year of study. Lastly, the effect size of the model was calculated [32].

Chi-square tests were used to explore the association between students’ NBT performance levels and their first-year progression outcome (passed, failed, or cancelled). Of the 1652 students in the study cohort, ten did not have NBT results and were excluded from the analysis, leaving 1642 cases. Where cells in the contingency table had expected counts below the required minimum of 5, the NBT levels were aggregated to increase the expected counts for the cells. Low expected cell counts violate an assumption of the chi-square test and increase the risk of erroneously interpreting a result as significant (type I error) or as not significant (type II error) [33]. The expected counts were raised above the critical minimum by merging the following levels: NBT MAT intermediate lower and basic; NBT AL intermediate upper and intermediate lower; and NBT QL intermediate lower and basic. The Pearson chi-square value and a significance level of p ≤ 0.05 were used to assess the statistical significance of the association between each of the NBT domains and the first-year progression outcome. The effect size of the association was calculated and reported using Cohen’s conventions [32].
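The level-merging strategy for sparse cells can be sketched as follows. The counts in the table below are invented purely for illustration and are not the study’s data; the merging mirrors the procedure described above (e.g. combining intermediate lower and basic for the NBT MAT domain).

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical contingency table: NBT performance level x first-year outcome.
table = pd.DataFrame(
    {"passed": [500, 300, 70, 10],
     "failed": [40, 60, 25, 5],
     "cancelled": [20, 30, 10, 2]},
    index=["proficient", "intermediate upper", "intermediate lower", "basic"],
)

# Merge the sparse levels (intermediate lower + basic) so that expected
# cell counts are adequate before running the chi-square test.
merged = table.copy()
merged.loc["intermediate lower"] += merged.loc["basic"]
merged = merged.drop("basic").rename(
    index={"intermediate lower": "intermediate lower/basic"})

chi2, p, dof, expected = chi2_contingency(merged)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4g}")
```

A 3 × 3 merged table yields 4 degrees of freedom, matching the df reported for the NBT MAT and NBT QL tests in the Results.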

Ethics approval for the study was obtained from the Human Research Ethics Committee (Medical) of Wits University (Clearance Certificate Number M170490).

Results

Table 5 shows that the students in the study cohort were predominantly female (56%; n = 931), Black (44%; n = 732), and of urban origin (78%; n = 1295). More than 80% had attended the better-resourced schools: 74% (1223) were from quintile 5 schools, and 7% (115) had attended quintile 4 schools.

Table 5 Student demographics (N = 1652)

First-year progression outcome for school quintile

Figure 1 shows the first-year performance for the school quintiles. The pass rates for all quintiles were above 60%. Schools in quintile 1 were the worst-performing, with the lowest pass rate of 68.1% (32), the highest failure rate of 19.1% (9) and a cancellation rate of 12.8% (6). More than 80% of the students from schools in quintiles 2 to 5 passed the first year. Quintile 5 schools had the lowest failure rate of 8.6% (105), with a higher cancellation rate, 6.2% (75), than quintile 3 schools, 1.9% (2), and quintile 4 schools, 2.6% (3). The higher cancellation rate for students from quintile 5 schools could be explained by this quintile constituting the majority (74%, 1223) of the 1652 students in the study cohort.

Fig. 1 First-year progression outcome for school quintile

Table 6 shows the NBT results for the study cohort. Only 54.3% (891) of the students with NBT MAT results were proficient. The NBT MAT domain had the highest proportion of students who fell outside the proficient level, 45.7% (751), followed by the NBT QL domain, 36.7% (606), and the NBT AL domain, 23.8% (380).

Table 6 MBBCh students’ performance level for each NBT domain (2011–2017)

Predictive validity of the NBT and NSC

Table 7 shows the mean and standard deviation for the variables used in the regression. The correlation table in Additional file 1 shows that the correlations amongst the independent variables were positive and below 0.7, and that each independent variable correlated with the dependent variable with a coefficient of at least 0.3, as required for regression analyses.

Table 7 Descriptive Statistics (N = 1236)

The NBT domains explained a statistically significant 26% of the variance in the first-year grade point average of the students who passed, R² = 0.263, F(3, 1232) = 146.78, p < 0.001. After controlling for the NBT domains, the four NSC subjects (Life Sciences, English, Mathematics, and Physical Science) explained a further 19% of the variance, R² change = 0.188, F(3, 1229) = 137.14, p < 0.001. In combination, all predicting variables contributed 45% of the variance in the final marks of the students who passed the first year of study at the first attempt, R² = 0.451, F(6, 1229) = 166.29, p < 0.001. The unstandardised (B) and standardised (β) regression coefficients and the squared semi-partial correlations (sr²), which denote the unique contribution of each predicting variable, are reported in Table 8. The effect size of this model was large (f² = 0.82).
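The reported effect size follows directly from the overall R² via Cohen’s formula f² = R² / (1 − R²):

```python
# Cohen's f^2 for the full model, computed from the reported R^2 = 0.451.
r2 = 0.451
f2 = r2 / (1 - r2)
print(round(f2, 2))  # 0.82, matching the reported effect size
```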

Table 8 Significance of each predictor to first-year academic success

The lack of significance of the NBT MAT domain could be attributed to the overlap between the content assessed in this domain and the mathematics content taught in Grade 12, as required by the Curriculum and Assessment Policy Statement for Mathematics [34].

Association between the NBT levels and student performance

A Pearson’s chi-square test between the NBT MAT proficiency levels and the first-year progression outcome was statistically significant, χ²(4, N = 1642) = 184.513, p < 0.001. The association between the NBT MAT and the first-year progression outcome was small, φ = 0.23. Figure 2 shows that students who were proficient in the NBT MAT were more likely to pass the first year compared to students admitted with results at the intermediate upper and intermediate lower levels.

Fig. 2 Associations between NBT MAT levels and first-year progression outcomes

The intermediate upper and intermediate lower levels were aggregated to increase the counts for the NBT AL levels. A Pearson’s chi-square test between the NBT AL proficiency levels and the first-year progression outcome was statistically significant, χ²(2, N = 1642) = 11.994, p = 0.002. The association between the NBT AL and the first-year progression outcome was small, φ = 0.08. As shown in Fig. 3, students who were proficient in the NBT AL domain performed better than students who obtained results in both intermediate levels.

Fig. 3 Associations between NBT AL performance levels and first-year progression outcomes

In the NBT QL domain, the intermediate lower and basic levels were combined to provide adequate cell frequencies. A Pearson’s chi-square test between the NBT QL proficiency levels and first-year results was statistically significant, χ²(4, N = 1642) = 83.433, p < 0.001. The association between the NBT QL and the first-year progression outcome was small, φ = 0.159. Despite the small effect size, students who were proficient performed better academically than students at either the intermediate upper or the intermediate lower level, as shown in Fig. 4.

Fig. 4 Associations between NBT QL and first-year progression outcomes
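The reported effect sizes can be cross-checked against the Cramér’s V formula, V = √(χ² / (N(k − 1))), where k is the smaller of the table’s dimensions. Reading k off the degrees of freedom (df = 4 implies a 3 × 3 table, df = 2 a 2 × 3 table) is our own inference; the paper labels the values φ.

```python
from math import sqrt

def cramers_v(chi2: float, n: int, k: int) -> float:
    """Cramér's V for a contingency table; k is the smaller table dimension."""
    return sqrt(chi2 / (n * (k - 1)))

# Reported chi-square statistics with N = 1642.
v_mat = cramers_v(184.513, 1642, 3)  # ~0.237, in line with the reported 0.23
v_al = cramers_v(11.994, 1642, 2)    # ~0.085, in line with the reported 0.08
v_ql = cramers_v(83.433, 1642, 3)    # ~0.159, matching the reported 0.159
print(round(v_mat, 3), round(v_al, 3), round(v_ql, 3))
```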

Discussion

Of the seven variables tested in the regression, six were statistically significant predictors of academic success in the first year of study: the four NSC subjects and two of the three NBT domains, the NBT QL and NBT AL. The NBT MAT domain was not statistically significant in this study, raising the question of whether Wits University, and other universities using the NBT for selection purposes, need to reconsider the weighting of the NBT MAT domain. The NBT accounted for a greater amount of the variance in predicting academic success in the first year of study than the NSC. This finding, together with the strong association observed between the NBT proficiency levels and the first-year progression outcome, suggests that the NBT should be used together with the NSC to strengthen the admissions process. The student performance by school quintile showed that students who attended lower quintile schools were more likely to experience academic difficulties than those who attended schools in more affluent communities. This raises questions about whether adequate support was provided to the large number of students admitted with low entry-level skills.

Other authors have advocated that an additional selection test be used to complement the NSC. For example, Wadee and Cliff [4] advocated the use of more than one selection tool after they found that the NSC subjects correlate weakly with the first-year academic performance of medical students at Wits University. The NSC subjects were also found to be poor predictors of academic success in the first year of study in a Bachelor of Optometry programme [26]. After tracking the results of a cohort of students from three faculties over 6 years (2009–2014) at another South African higher-education institution, Prince [11] called for the combined use of the NBT with the NSC to improve student placement and retention. Our findings provide empirical evidence supporting the combined use of the NBT and NSC results for the selection of medical students. In our study, the NSC only accounted for 19% of the variance in predicting academic success in the first year of the medical programme. The greater variance (26%) explained by the NBT supports its use as an additional tool to complement the NSC as a predictor of academic success in the first year of study. However, only five of the nine medical schools in South Africa use a combination of the two tools, and the other four of those schools weight the NBT lower than 50%.

In addition to its capacity to assess students’ abilities to engage with tertiary-level education, the NBT allows the identification of students who may need to be placed into suitable foundation programmes, based on the academic skills in which they lack proficiency, and of those admitted to the regular MBBCh programme who may need additional support. The norm-referenced basis of the NSC results, by contrast, offers limited potential for identifying students in medical programmes who need academic support [4, 11]. We observed a strong association between the proficiency levels of the three NBT domains and students’ first-year progression outcomes. Despite the low effect sizes for the associations, students who were proficient in the NBT domains were more likely to perform well academically, raising the question of what support is available for students who are admitted with low entry-level skills. The recent adjustments in university admission policies to include more students from lower quintile schools and rural areas [3] have increased the proportion of students with low entry-level skills [9]. The disparities between students’ entry-level skills and university admission requirements could be attributed to the stark differences between lower-quintile and higher-quintile schools [20,21,22], which reflect the socio-economic differences in the country. While it is in the interests of social justice to admit students who reflect the demographics of the country [3], our results suggest that students with low entry-level skills are likely to need adequate support to cope with the academic demands of medical education.

Foundation programmes play a crucial role in addressing the learning disparities existing in secondary education and in promoting students’ chances of continued academic success [35]. Some authors [11, 20, 36] have advocated for the use of foundation programmes to address historical disadvantages in education and socio-economic background in countries like Australia, the United Kingdom, and South Africa. Students placed in foundation programmes are more likely to succeed in the first year of study [11, 36, 37]. Most South African higher-education students exceed the minimum time allowed to complete their studies [11, 19]. Foundation programmes could improve the first-year success rate, which, ultimately, will improve retention and throughput rates. The academic demands of the medical programme may be a huge adjustment for students with low entry-level capabilities [11]. Based on their NBT results, a large proportion of students in this study who failed the first year at their first attempt could have benefitted from additional support, possibly in the form of a foundation programme. Nearly 31% of the students from quintile 1 (19.1%; 9) and quintile 2 schools (11.5%; 10) failed, and close to 20% from quintile 1 (12.8%; 6) and quintile 2 (6.9%; 6) cancelled their studies. The NBT recommends additional support for students accepted with results at the intermediate upper level, placement in foundation programmes for students with results at the intermediate lower level, and placement in other programmes for students with results at the basic level. Some 45.7% of the study cohort were accepted with NBT results that fell outside the proficient level for the NBT MAT domain alone, while 0.9% of the students were accepted with NBT MAT results at the basic level.

As access to medical education in South Africa is widened to accommodate students from different backgrounds and with different entry-level skills, foundation programmes could be useful to improve retention and throughput [36]. In 2016, van der Merwe et al. [3] reported that five of the nine South African medical schools offered foundation programmes; the remaining medical schools may need to consider offering them as well.

The NBT and NSC investigated in this study only accounted for 45% of the variance in the academic success of the first-year students. This leaves 55% of the variance that could be attributed to other factors that have an impact on students’ academic performance. For example, Ahmady et al. [38] found that students’ learning styles, level of motivation, and being first-generation students influenced their performance. Arulampalam [39] reported on the influence of the location of students’ accommodation on their academic achievement, while Yorke [40] explored the impact of their career choice, financial difficulties encountered at home and while at university, and the strategies used in medical teaching. Another factor influencing student academic success is the types of support available. We have reported on the types of support available to medical students at Wits University, but we have not reported on what support students made use of and to what extent.

The myriad factors impacting on student performance in medical degrees, especially the widening of access to include more students from remote areas and students from disadvantaged backgrounds, provide opportunities for further study, both globally and locally, to promote equitable access that fosters fair chances for success. At a local level, areas for future research include investigating the relationship between students’ scores for different NBT domains and their academic performance, the impact of different types of support on student achievement, and the impact of foundation programmes on student retention and throughput.

Conclusion

These results suggest that the NBT, when weighted equally with the NSC results, explains more variance than the NSC results alone in predicting students’ academic success in the first year of medical study. Based on the stronger predictive validity of the NBT, South African medical schools should implement the NBT as an additional selection tool to the NSC, with an equivalent weighting. We further recommend that the NBT, which was commissioned to assess students’ readiness for university, should not be used only as a selection tool for admissions, but also to identify students likely to require additional support, possibly through foundation programmes. Our findings may be applicable to other South African medical schools and other health science professional programmes.

Availability of data and materials

The datasets used and analysed during the current study are available from the corresponding author on reasonable request.

Notes

  1. The South African Education system is divided into the Basic Education and Higher Education systems. The Basic Education system consists of the General Education Training level (grades R–9) and the Further Education Training level (grades 10–12). Students write the NSC examination at the end of the grade 12 year of the Further Education Training level.

  2. Higher Education South Africa, an organisation representing South Africa’s universities, became ‘Universities South Africa’ in 2015.

  3. The term ‘public ordinary’ distinguishes government schools classified according to the quintile system from Schools of Specialisation that offer specialised curricula and are excluded from the quintile system.

  4. The racial groupings of Black, Coloured (mixed race), and Indian were introduced in South Africa during the apartheid era (1948–1994) according to the Population Registration Act (No. 30 of 1950). The terms are still in use to address the racial inequities in the country. Chinese people were classified as either Coloured or Indian. Where relevant, Chinese people are reported as a separate group in this paper.

Abbreviations

NSC:

National Senior Certificate

NBT:

National Benchmark Test

NBT MAT:

National Benchmark Test in Mathematics

NBT AL:

National Benchmark Test in Academic Literacy

NBT QL:

National Benchmark Test in Quantitative Literacy

Wits University:

University of the Witwatersrand

OSS:

Office of Student Success

MBBCh:

Bachelor of Medicine and Bachelor of Surgery


Acknowledgements

Not applicable.

Funding

Not applicable.

Author information

Contributions

SM designed and conducted the study. SM and AG analysed the data and wrote the manuscript. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Ann Zeta George.

Ethics declarations

Ethics approval and consent to participate

Ethics approval for the study was granted by the Human Research Ethics Committee (Medical) of the University of the Witwatersrand (Clearance Certificate: M170490). Permission to access the student data was obtained from the Registrar of the University of the Witwatersrand.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Mabizela, S., George, A. Predictive validity of the National Benchmark Test and National Senior Certificate for the academic success of first-year medical students at one South African university. BMC Med Educ 20, 152 (2020). https://doi.org/10.1186/s12909-020-02059-8

Keywords

  • First-year academic success
  • Medical students
  • Selection tests
  • Hierarchical multiple regression
  • National Benchmark Test
  • National Senior Certificate
  • South Africa