Background
Globally, multiple-choice questions (MCQs) remain the dominant method for assessing medical students’ factual knowledge.1 Writing high-quality MCQs is time-consuming and onerous for academic staff, meaning many institutions do not maintain large, regularly updated formative question banks.2 As a result, students increasingly turn to commercial-off-the-shelf educational platforms (known as MedED-COTS). This term encompasses question banks, user-generated flashcards, and interactive multimedia content.3 Examples commonly used in the UK include PassMedicine, PasTest and QuesMed, with studies estimating that close to 90% of students use MedED-COTS during their training.4,5
A mixed-methods study from the US suggests that the perceived utility of MedED-COTS increases with proximity to assessments and as student stress rises, indicating that timing and assessment pressure shape resource selection.6 However, much remains unknown about students’ motives for preferring commercial platforms to free institutional resources, and about how educators view the pedagogical value and risks of these tools.
A focused BEME review reported consistently positive correlations between use of MedED-COTS and exam performance.7 Other reported benefits of MedED-COTS include accessibility, adaptive practice, immediate feedback, and features that promote retrieval practice (eg, spaced repetition and performance analytics).
Yet several concerns are consistently raised in the literature:3,8
- A tendency to promote surface or rote learning rather than deeper conceptual or clinical reasoning.
- Variable content validity and limited quality control compared with curated institutional resources.
- Financial cost to students, with potential implications for equity and differential attainment.
These tensions sit within broader debates in medical education about curriculum overload, assessment-driven learning, self-directed study, and the equitable provision of learning resources. The BEME review therefore identified specific, actionable gaps:
- Insufficient qualitative evidence on how and why students choose and use MedED-COTS.
- Limited exploration of why students might use these platforms instead of institutional materials.
- A paucity of teaching-staff perspectives on the pedagogic impact of these resources.
- A lack of empirical examination of cost, accessibility and curriculum alignment in UK contexts.
To date, only a single study, by Finn et al (2022), has interrogated student motivations in depth.9 They found that medical students’ use of MedED-COTS shifts across phases of training, with early reliance on course-specific tools giving way to solitary study using commercial platforms when preparing for licensing exams. Their study also highlighted the dominant role of peer influence in resource selection, and the prioritisation of efficiency and exam preparation over faculty guidance or deeper learning. However, this single study was conducted in a United States (US) context, which limits the generalisability of its findings to United Kingdom (UK) curricula and assessment cultures.
As such, this qualitative, exploratory study aims to address the gaps highlighted in the BEME review by eliciting the perspectives of both students and early career clinical instructors (clinical teaching fellows) at a single UK medical school. Specifically, we investigate how and why students engage with MedED-COTS, how these platforms compare with institutional resources in practice, and what benefits and limitations are perceived by learners and teachers. The findings are intended to inform evidence-based approaches for integrating commercial platforms into curricula in ways that preserve content validity, promote deep clinical reasoning, and mitigate inequities of access.
Aims and Research Questions
Aims
The aims of this study are to:
- Explore student and teacher perceptions of their relationships with MedED-COTS.
- Strengthen the evidence base on why students use MedED-COTS as a supplement to institutionally available resources.
Primary Research Question
- Why and how do students use MedED-COTS?
Secondary Research Questions
- Why do students use MedED-COTS in lieu of institutional learning materials?
- What limitations do students and teachers report in the use of MedED-COTS?
Methods
Design, Ethics and Study Setting
This was a cross-sectional, exploratory qualitative study undertaken using semi-structured focus groups, chosen to explore in depth the relationships between students and MedED-COTS. Ethical approval was provided by the University of Birmingham (Appendix 1).
The study was conducted at a large teaching hospital site linked to the University of Birmingham Medical School, which provided clinical placements to students from this institution during the academic year 2022/2023. As the focus groups were conducted online, participants and investigators did not need to visit this location in person.
The MBChB at the University of Birmingham is delivered as an integrated five-year programme combining foundational biomedical science learning in the first two “pre-clinical” years with progressive clinical exposure. The subsequent three years (years 3, 4 and 5) emphasise clinical placements, workplace-based learning and application of clinical skills across a range of specialties. Assessment across the programme employs a mix of applied knowledge tests (written examinations), workplace-based assessments and objective structured clinical examinations (OSCEs). These features reflect a programme that balances summative licensing-style assessments (knowledge and skills) with continuous assessment of clinical competence. A graduate entry course (GEC) also existed at the time of the study, enabling postgraduate learners to join the MBChB via an accelerated four-year route with a truncated pre-clinical phase.
Participants and Inclusion Criteria
This study aimed to triangulate views from students who use MedED-COTS and clinical teaching fellows (CTFs). As such, study inclusion criteria accounted for both groups and are summarised in Table 1. Eligible students had to have completed at least one full year of clinical training and were therefore in the clinical phase of the MBChB programme (clinical years).
Table 1 Eligibility Criteria for the Study Participants
Clinical teaching fellows (CTFs) affiliated with University of Birmingham Medical School are qualified doctors appointed to posts that are primarily educational. Typically, CTF positions allocate most of the working time to teaching clinical students. They are commonly fixed-term roles of around one year in duration. CTFs support undergraduate teaching in hospital and community settings by delivering ward-based and bedside teaching, facilitating small-group seminars, contributing to structured bedside and skills sessions, assisting with formative assessments, and helping to develop and quality-assure teaching materials and assessments for the MBChB. They are supported to pursue postgraduate teaching qualifications such as a Certificate in Medical Education. Given these responsibilities, CTFs provide both front-line teaching and an important perspective on how external learning resources are used and perceived within the clinical curriculum.
Sampling and Participant Characteristics
A total of 291 eligible participants were emailed participant information leaflets detailing the study (Appendix 2). Eight interested participants signed up directly and arranged suitable times for the focus groups. During sampling, a purposive snowball methodology was used to ensure that the focus group sizes were adequate, which increased the sample size to 11.
Participants were divided into two evenly sized groups, ensuring that each focus group had at least four participants, often cited as the recommended lower limit to avoid undue pressure on individuals to speak.10 Similarly, an upper limit of eight participants was chosen to ensure varying perspectives without diluting individual contributions.11 Participants were separated into students and teachers to avoid power imbalances that may have prevented individuals from engaging.
All participants gave written informed consent prior to participating in the focus groups (Appendix 3), which included publication of their anonymised direct quotes. The first focus group consisted of 5 medical students. The second focus group consisted of 6 CTFs. Participant characteristics are summarised in Tables 2 and 3.
Table 2 Participant Characteristics from Focus Group 1. GEC, Graduate Entry Course
Table 3 Participant Characteristics from Focus Group 2. CTF, Clinical Teaching Fellow
Methods of Data Collection
A focus group question guide was constructed through a multi-step process. First, questions that could address the study aims were identified and then grouped into a sequential series using a model from Krueger and Casey.12 A subsequent collaboration with postgraduate diploma students was undertaken to ensure the questions followed a logical and developmental sequence. The questions were used to explore the opinions and experiences of focus group attendees but served only as a semi-structured framework, with opportunities to seek clarification from participants. Appendices 4 and 5 show these guides.
Two focus groups were conducted using Zoom software and transcribed verbatim using the Twine plugin. The sessions lasted no longer than 60 minutes. Subsequent proof-reading and anonymisation of the transcripts were performed manually.
Methods of Data Analysis
Anonymised transcripts were imported into NVivo 12 for thematic analysis. The protocol for data collection and analysis was adhered to rigorously to ensure credible and trustworthy output. Thematic analysis followed a widely accepted six-step framework.13
First, transcripts were read several times by the lead author to achieve data familiarisation. The data were then organised using open coding, so that codes were generated inductively rather than a priori (excerpt available in Appendix 6). The whole dataset was given equal attention to allow repeated patterns to emerge. Selection of pre-defined themes was undertaken using supporting findings from the work of Finn et al.9 This deductive approach allowed these results to build upon previously established work and included the following themes:
- “Finding the best one for the challenge ahead”.
- “Linked to the learning strategies that individuals adopt”.
- “Costs of e-Learning”.
Themes for the secondary research questions were selected using an inductive approach due to the absence of supporting literature. All themes were then reviewed during analysis to ensure relevance, coherence, and distinction. This step resulted in some theme modifications due to differences between these data and those reported by Finn et al.9 Final themes were concretely defined and linked to their subthemes, before being organised into a narrative report. Quotes are shown verbatim to provide insights into the participants’ thoughts. Efforts have also been made to incorporate group interactions to demonstrate how discussions evolved.
While coding was undertaken by a single researcher, several strategies were used to enhance rigour and ensure credibility. The first author revisited the raw data multiple times to check for consistency and refine the coding framework. Interim coding strategies and developing themes were also shared with peers during regular meetings of a postgraduate medical education programme, providing opportunities for external feedback and challenge. In addition, the coding framework and thematic interpretations were discussed with the second author, whose feedback informed refinement of the final themes. This process ensured consistency across the dataset while also incorporating external perspectives to enhance the trustworthiness of the analysis.
Reflexivity and Philosophy Paradigm
As an early career education researcher, the lead author engaged in reflexive practice to consider how their background, experiences, and assumptions might influence the research process. Reflexive consideration enabled them to recognise potential biases and maintain awareness of how their interpretations could shape the analysis of participants’ experiences.
During this project the lead author was a clinical teaching fellow with recent experience of using MedED-COTS for their own learning. The relationship of the lead author to the participants was that of a near-peer (for students) and a colleague (for the other CTFs). This may have influenced the openness of the responses; however, some degree of familiarity helped when convening the focus groups, as existing cohesion encouraged shared reflection through dialogue. To minimise the bias of their own influence, the lead author ensured transparency in theme development and discussed each step of the project with a cohort of postgraduate diploma in medical education students in a structured tutorial setting.
To support identification of the research paradigm, the lead author used the HARP (Heightening Awareness of your Research Philosophy) tool.14 This tool involved reflecting on 30 statements representing different research paradigms to identify alignment with specific philosophical positions. Completing this exercise helped the lead author to clarify their current stance as interpretivist. Interpretivism emphasises understanding participants’ subjective experiences and the meanings they ascribe to their context. It recognises that knowledge is co-constructed between researcher and participants and that there is no single “objective truth” across settings. This philosophy naturally aligns with inductive coding, as it allows themes to emerge directly from participants’ perspectives.15 However, interpretivism does not preclude the use of deductive coding. In this study, deductive coding was applied for the primary research questions, using the three main themes identified in a comparable US study as a framework. While these categories provided structure, they were interpreted through the lens of participants’ experiences, consistent with an interpretivist approach. For the secondary research questions, which explored areas not previously investigated, inductive coding was used to allow novel themes to emerge from the data. This hybrid coding strategy (deductive for theory-informed primary questions and inductive for exploratory secondary questions) ensured that the study both examined established concepts and captured new, context-specific insights, while remaining philosophically coherent within an interpretivist paradigm.
Findings
Primary Research Question
The focus of this study was to ask why and how students use MedED-COTS.
Theme 1: MedED-COTS Align to the Learner’s Goals
Students consistently described their use of MedED-COTS as purposeful and goal-driven, with particular platforms selected to match the demands of different stages in the academic year and the nature of upcoming assessments. While Finn et al9 previously characterised this as “finding the best one for the challenge ahead,” the current findings extend this theme by highlighting how students’ resource use evolved over time and by emphasising the perception of commercial tools as essential for success.
Subtheme: Different Platforms for Different Purposes
Rather than using a single resource indiscriminately, participants reported matching platforms to specific learning needs. For example, written exam preparation often centred on question banks, whereas OSCE practice was supported by resources offering clinical cases and videos. As one student put it, “I used PassMed for questions, but Geeky Medics for OSCE revision” (P5). Another explained that, while PassMed supported question-based practice, Geeky Medics “provided articles you could actually learn from” (P6). These accounts illustrate a strategic division of labour between platforms, with students aligning tools to the demands of each assessment type.
Subtheme: Changing Use
Participants also described shifting their use of MedED-COTS depending on proximity to examinations. Early in the year, students often turned to resources with less challenging material, while closer to assessments they gravitated towards high-volume question banks.
For instance, one student explained:
I used BiteMedicine and QuesMed earlier in the year … whereas I used PassMed to hammer in loads of questions closer to the exam. (P4)
Another reflected that their strategy became more focused and intense nearer to exams:
This revision season I looked at PassMed, BiteMed and Zero to Finals … it changed as the exams got nearer. (P3)
These shifts highlight a pragmatic approach, with students calibrating resource use in response to perceived exam pressures.
Subtheme: Necessity for Using MedED-COTS
Perhaps most strikingly, students framed MedED-COTS as indispensable for exam success, with several expressing the belief that relying solely on institutional resources would be insufficient. As one participant remarked,
You’re not going to get through clinical years without a question bank … you’re kidding yourself if you do lectures. (P1)
Another went further:
If I had only done the lectures, I would have absolutely failed. (P3)
This sense of necessity positions MedED-COTS not merely as supplementary aids but as central tools in students’ academic survival strategies.
Theme 2: Acceptable Learning Strategies That MedED-COTS Offer
A second major theme centred on the ways in which MedED-COTS align with students’ preferred learning strategies, making them both appealing and effective. Rather than viewing these platforms as radically new, participants described them as familiar, engaging, and structured in ways that resonated with their established study habits. This theme reflects the earlier description from Finn et al9 of “learning strategies that individuals adopt,” but the present study highlights specific features, such as gamification, accessibility, and structured design, all of which enhance their acceptability.
Subtheme: Gamification
Students and CTFs alike commented on the game-like features of many platforms, which appeared to foster engagement and sustain motivation. Question banks were frequently described as “playful”, with the competitive mechanics of scoring and repetition mirroring the dynamics of gaming. One participant captured this sentiment:
I really like how these platforms almost gamify education because medicine is such a big topic. (P4)
Another explained that feedback loops created a cycle of improvement:
It’s easier to absorb information when you’re doing it through a game, getting things wrong and trying to improve. (P6)
These reflections illustrate how gamification transforms repetitive practice into a more interactive and motivating activity.
Subtheme: Emulate Prior Learning Strategies
A strong consensus across groups was that MedED-COTS echoed familiar study practices from school, particularly practising with previous exam papers, termed “past papers”. For many students, this continuity made the platforms intuitively acceptable. As one participant reflected,
Even at school, they encouraged you to use past papers. MedED-COTS allow you to translate that into medical school. (P2)
Another noted that medical students, as “good test takers,” found question-based practice to be a natural extension of their established habits (P9). This perceived familiarity positioned MedED-COTS as a comfortable and effective bridge between secondary and higher education.
Subtheme: Accessibility
Ease of access was also central to participants’ positive appraisal of MedED-COTS. Students valued the ability to integrate revision seamlessly into daily life, particularly through mobile devices. For example, one explained:
I would just get it on my phone and do it on the train into Uni … just flicking through questions on my phone. (P1)
Another highlighted the flexibility of on-the-go study:
It offers you that opportunity to dip in and dip out, but still active learning. (P7)
Such accounts demonstrate how accessibility allowed students to transform otherwise unproductive moments into revision opportunities.
Subtheme: Organised Structure
Participants frequently praised the structured presentation of content, noting that commercial platforms simplified complex curricula into digestible formats. For some, this reduced stress associated with navigating institutional resources. One student summarised:
There’s a good format; you can follow it. Everything is there, and I don’t need to stress about looking for things. (P4)
Others described the benefit of centralisation, where resources felt more “digestible” and user-friendly (P3). This emphasis on order and structure underscores how MedED-COTS offer not just content, but a framework that makes learning more manageable.
Subtheme: Identifying Weaknesses
Finally, students valued the diagnostic function of question banks in highlighting knowledge gaps. By revealing weak areas, MedED-COTS guided learners towards more targeted revision. As one participant explained,
It allows you to identify what it is that you don’t know. (P10)
Another reinforced this:
Question banks are a good way of identifying your areas of weakness, and then you can focus using other resources. (P10)
This ability to personalise learning further reinforced the platforms’ acceptability and perceived effectiveness.
Theme 3: The Social Influences and Social Benefits of MedED-COTS
A further theme highlighted the social dimensions of MedED-COTS, particularly the ways in which peer culture influenced adoption and use. While Finn et al9 had embedded peer influence within the broader theme of “finding the best one for the challenge ahead,” the frequency and emphasis in this study suggested it warranted recognition as a distinct theme. Students described peers as both trusted sources of guidance and powerful drivers of conformity, with MedED-COTS sometimes framed as a near-mandatory component of medical student life.
Subtheme: Peer Influences
Recommendations from senior students played a major role in shaping the adoption of specific platforms. Participants consistently valued advice from those ahead of them in training, often regarding such endorsements as reliable indicators of what would be most useful. One student explained,
One of my friends from another year recommended to me … ‘You’re not going to get through clinical years without a question bank’ (P1), while another added, “I value a resource more if people from the years above have said, ‘This is what I use’”. (P5)
At the same time, this influence sometimes tipped into perceived pressure. Several participants acknowledged feeling compelled to subscribe to platforms simply because peers endorsed them, describing a “herd mentality” (P9). As one reflected,
When everyone else is using resources … there is a part of you that feels like you have to use them so you’re not behind. (P2)
Others echoed this sense of coercion:
My main reason was literally peer pressure. My friend was like, ‘You have to get this, you’re an idiot if you don’t’. (P1)
These accounts suggest that while peer guidance can be supportive, it also fosters normative expectations that may drive uptake beyond individual preference.
Subtheme: Social Benefits
Beyond peer pressure, students also identified positive social dimensions embedded within the design of some platforms. Interactive features, such as discussion or comment sections, allowed users to question or correct content collaboratively. One participant observed:
There’s a comment section … people come in to say, ‘Actually, I’m not sure this is right’. It self-audits itself. (P4)
Such features were seen as enhancing accuracy and creating a sense of shared responsibility for learning, reflecting how MedED-COTS can function not only as study tools but also as small communities of practice.
Secondary Research Questions
Theme: Limitations of University Resources
In exploring why students turned to MedED-COTS rather than institutionally provided materials, a recurring theme was the perceived shortcomings of university resources. Students described these as outdated, insufficiently detailed, or pitched at an inappropriate level, which undermined their usefulness and drove reliance on commercial alternatives.
Subtheme: Inaccurate Information or Lacking Detail
Several participants expressed concerns about the accuracy and depth of university teaching materials. In some cases, students found the content out of date, leaving them feeling frustrated and uncertain about its reliability. One student recalled,
I realized that they weren’t actually cutting it at all. They were very outdated … I memorized one treatment pathway and then realized that that had changed already and it was outdated. (P3)
Others noted that materials often provided only a superficial overview:
I do think that they barely scraped the surface of what we were supposed to know. (P1)
Such concerns extended to lectures, with some questioning whether faculty remained attuned to the current realities of medical student assessments. As one participant suggested,
A lot of their lecturers … haven’t been in Med School for quite a while, so they don’t really know what’s going on as much as older years. (P5)
Collectively, these accounts suggest that gaps in accuracy and relevance contributed to the perception that university resources could not be relied upon as primary preparation tools.
Subtheme: Knowledge Prerequisites
Students also reported that institutional teaching frequently assumed a level of prior knowledge that was not always present, creating additional barriers to engagement. Sessions were described as pitched too high, with little effort to review foundational concepts. One student reflected,
When consultants and Med school were delivering lectures, they assume a baseline level of knowledge. (P2)
while another added,
Sometimes they miss the baseline understanding you’re expected to have, because in preclinical you learn so much you can’t remember it all. (P3)
This perceived disconnect between lecturer expectations and student preparedness meant that institutional resources were sometimes experienced as inaccessible or overwhelming. In turn, students sought out MedED-COTS, which they felt offered clearer explanations, better scaffolding, and a closer alignment to their immediate learning needs.
Theme: Limitations of MedED-COTS
Alongside their many perceived benefits, both students and teachers identified important drawbacks of MedED-COTS. These concerns clustered around issues of financial burden, the necessity of combining multiple platforms, and questions about the validity and real-world applicability of the content. While students were most vocal about cost and practicality, CTFs emphasised potential limitations for deeper learning and clinical preparedness.
Subtheme: Financial Implications
For many students, the most immediate challenge associated with MedED-COTS was financial. Subscriptions to multiple platforms were described as a significant additional expense, particularly given the already high cost of medical tuition. One student reflected,
I pay a lot for tuition … and then to have to pay often like a subscription for PasTest, subscription for XYZ, I prefer not to have to do that … I was not happy about it because we already paid tuition, and so I was a bit sad. (P3)
Others echoed this frustration, highlighting the trade-offs they made to limit expenses. As P1 put it,
I had the same issue where I was like, ‘Oh, God, I’m repeating these questions a lot now’, but I didn’t want to get another question bank, because I was already paying for one.
These accounts illustrate how cost barriers can restrict students’ access to the breadth of resources they perceive as necessary for exam preparation. Notably, the commentary on this topic came mainly from students and was only mentioned briefly by the CTFs, who may have become disconnected from the financial hardships of being a student.
Subtheme: Reliance on Multiple Platforms
Despite the financial burden, many participants reported that no single platform was sufficient to meet their needs. Instead, they described navigating between multiple MedED-COTS, each offering different strengths but also presenting discrepancies. For example, one student noted,
I still don’t feel like there’s like a ‘True North’ to go to in terms of content because they all use slightly different approaches. (P3)
This inconsistency between resources was a source of frustration, as highlighted by P6:
I find discrepancies or differences in the materials online so that students tend to get a bit … ‘Geeky Medics says X, but BiteMedicine says Y’.
For many, the perceived solution was to combine two or three platforms, creating a patchwork of resources that provided coverage across topics but also added to cost and cognitive load.
Subtheme: Validity and Real-World Applicability of MedED-COTS
While students focused primarily on issues of cost and comprehensiveness, teaching fellows raised concerns about the broader educational validity of MedED-COTS. They questioned whether the content reflected authentic clinical practice, with one observing,
…the questions often reflect those who have made the question banks, as opposed to what you might see in clinical practice. (P7)
Some worried that over-reliance on question banks might displace experiential learning:
I don’t think it helps students… I think they spend more time on devices answering questions than actually meeting people. (P8)
Others highlighted how the format risks oversimplification, creating rigid “illness scripts” without the context required for nuanced understanding. As one fellow described,
It [creates] … a very specific illness script that puts patients and presenting complaints into neat little boxes, but doesn’t give you the context to explore why. (P9)
They explained this in terms of assessment performance versus clinical reasoning:
I call it PassMed-itis … they know some very specific thing about hereditary spherocytosis … but they can’t tell me how a patient would present with anaemia … they lack the ability to draw connections between body systems and presenting complaints. (P9)
Such concerns underline a tension between the exam-focused strengths of MedED-COTS and their perceived limitations in fostering holistic clinical competence.
Discussion
Primary Research Question
This qualitative study investigated why and how students use MedED-COTS through exploration of student and teacher insights. The results reflect the complex interactions captured in the focus group data.
Consistent with prior work,9,16 students reported using MedED-COTS primarily to optimise exam preparation. Different platforms were selected depending on the type of assessment, proximity to exams, and the specific learning goal, reflecting the strategic and self-directed nature of resource use. Our study extends these findings by highlighting financial burden as a significant barrier, and the need to combine multiple platforms to cover the curriculum comprehensively, observations less frequently reported in US settings.9
Interestingly, a newly published quantitative UK study identified that 64% of medical students (from 915 respondents across 30 of the 43 established medical schools) cite financial concerns around the use of commercial MedED-COTS.17 As such, it may be that these findings are specific to the UK context.
MedED-COTS appeal to students due to alignment with recognised learning strategies. Gamification and accessibility features were highly valued, supporting active engagement, spaced repetition, and self-testing, consistent with cognitive psychology literature on effective learning techniques.18,19 Additionally, participants reported that these platforms mirrored learning strategies used in high school, emphasising the strategy of learning through practice with previous examination papers (past papers). This could suggest that students leverage familiar scaffolds when navigating new learning environments, a core principle of constructivist theory.20
Another frequently reported advantage in this study was the accessibility of MedED-COTS, enabling “on-the-go” learning in piecemeal amounts. It is unsurprising that, with an exponentially growing curriculum, coined “curriculum hypertrophy”, students praised features that enabled faster, smarter and more convenient learning.21
Social dynamics were also pivotal. Peer influence shaped platform choice, while the platforms themselves incorporated social features such as comment sections, enabling a form of collaborative quality control. These findings can be interpreted through social constructivist and communities of practice frameworks, whereby knowledge is co-constructed through interaction and social guidance.22,23 Notably, some students described feelings of peer pressure or herd mentality, highlighting the dual nature of social influences.
These results contribute towards a topic area with minimal supporting theory. Prior research by Finn et al9 on a similar question provided themes from their analysis, although it did not contribute a formal theoretical framework. Collating their work with the present study strengthens the topic area by providing further insights into the mechanics of why and how MedED-COTS are used (Figure 1).
|
Figure 1 A diagram summarising the themes from studies exploring how and why students use MedED-COTS.
|
Secondary Research Question(s)
This study explored perceived limitations in institutionally provided education resources. A key criticism was the inaccuracy and outdated nature of university materials, which affected students’ learning of contemporary treatment algorithms. The pedagogical efficacy of lecturers was also questioned, as some felt that senior staff have become disconnected from the reality of medical school and its current exam formats. A corollary was that students often felt institutional resources assumed prior knowledge, leading to gaps in understanding. These were directly contrasted with MedED-COTS, which often began with the foundational basics and enabled a deeper, higher-quality learning experience. This phenomenon could again be explained with constructivist theory, whereby students who are not provided with a foundational scaffold to build upon cannot acquire new information as effectively.20
Limitations of MedED-COTS, as perceived by students and teachers, were also explored. Students emphasised financial implications as a critical barrier to their use of MedED-COTS, expressing frustration at the added burden on top of existing tuition fees. Teachers, by contrast, were far more concerned with the validity of these platforms. Criticisms were made of the oversimplification of complex topics, which they felt prevented deeper learning, and of a failure to equip students with the skills to pass practical exams. These latter points are supported by other articles, which suggest that a focus on memorisation of facts does not promote active learning.8 Other limitations outlined in the literature, but not found in this study, are a lack of rigorous quality control by commercial companies and a failure to tailor content to specific curricula.7,16,17
Implications for Medical Education
These findings offer an exploratory perspective from a small cohort and highlight the need for further work on the integration of MedED-COTS across more institutions to corroborate these results. Clear advantages include up-to-date information and learning functionalities that help students pass their exams. Integration of these platforms by medical schools would remove the disadvantage of individual cost. However, ensuring the validity and accuracy of these platforms would remain a challenge and would require thorough assessment prior to purchasing.
To translate these findings into practice, several strategies for integrating MedED-COTS into institutional teaching can be considered. These are summarised in Table 4.
|
Table 4 Practical Recommendations and Rationale for Institutions to Engage with MedED-COTS and Consider Integration Within Their Healthcare Professional Courses
|
Strengths and Limitations
This study has offered insights into the use of MedED-COTS from key stakeholders. To the best of our knowledge, it is one of the first studies to have explored this question. The SRQR and COREQ guidelines were used to ensure rigorous reporting of this qualitative study and to help improve its quality (Appendices 7 and 8).24,25
The use of focus groups enabled unique perspectives to generate “thick, rich” data on the topic, and also allowed exploration of inter-relational dynamics between participants.26,27
Furthermore, the use of thematic analysis enabled a flexible approach to analysis,28 with a hybrid of inductive coding and deductive theme selection. This allowed operationalisation of the previous work by Finn et al9 to help build the current insights.
The study is not without limitations. Conducting only two focus groups with a small number of participants (n=11) from a single institution limits the generalisability of the results. However, within this cohort there was representation from local and international students and medical graduates, as well as undergraduate and graduate-entry students, providing heterogeneity of opinion. Given the design of this study, with only one focus group of students and one of CTFs, we were unable to achieve data saturation. While prior work has suggested that two to three homogeneous groups can yield a majority of themes, our inclusion of two distinct participant types limits the applicability of that principle.29 As such, these findings should be considered exploratory and hypothesis-generating, rather than definitive. A further way to improve this study would have been to triangulate the results with quantitative data. Notable studies that would help develop these focus group findings include surveys to capture the prevalence and frequency of student use of various platforms, as well as their correlation with outcomes such as exam results and professional progression.
Conclusion
In summary, this qualitative study has identified student and teacher perspectives on why and how medical students use MedED-COTS. While these platforms are perceived as indispensable, issues of cost, equity, and validity remain, highlighting the need for careful consideration in their role within medical education. Further research, particularly larger or mixed-methods studies across multiple institutions, will be essential to triangulate the findings from this small exploratory study and develop a robust theoretical framework.
For educators and policymakers, the findings emphasise that universities could play an active role in shaping student use of MedED-COTS. Practical steps include negotiating institutional licenses to reduce inequity, embedding vetted resources within curricula, and ensuring quality assurance to address concerns about accuracy and clinical applicability. Such measures would help balance the efficiency of exam-focused preparation with the deeper conceptual and clinical reasoning skills required for medical practice.
Data Sharing Statement
Data are stored in a Research Data Storage database within the University of Birmingham data repository.
Ethics Approval and Informed Consent
Ethical approval was granted by the research ethics committee at the University of Birmingham prior to commencing this study.
Consent for Publication
All participating individuals provided informed consent with signed forms, acknowledging the potential dissemination of their contributions to other researchers in conferences or publications.
Author Contributions
All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.
Funding
There were no sources of funding for this project.
Disclosure
There are no financial or non-financial competing interests for either author.
References
1. Buttery CM. Assessment in medical education. N Engl J Med. 2007;356:2109.
2. Ware J, Vik T. Quality assurance of item writing: during the introduction of multiple choice questions in medicine for high stakes examinations. Med Teach. 2009;31:238–243. doi:10.1080/01421590802155597
3. Burk-Rafel J, Santen SA, Purkiss J. Study behaviors and usmle step 1 performance: implications of a student self-directed parallel curriculum. Acad Med. 2017;92(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 56th Annual Research in Medical Education Sessions):67–74. doi:10.1097/ACM.0000000000001916
4. Vora A, Maltezos N, Alfonzo L, Hernandez N, Calix E, Fernandez MI. Predictors of scoring at least 600 on COMLEX-USA level 1: successful preparation strategies. J Am Osteopath Assoc. 2013;113(2):164–173. doi:10.7556/jaoa.2013.113.2.164
5. Seal ZA, Koek W, Sharma R. Correlation of medical college admission test scores and self-assessment materials with the United States medical licensing examination step 1 performance. Cureus. 2020;12(4):e7519. doi:10.7759/cureus.7519
6. Yavner SD. Stress, Fatigue, and Medical Students’ Study Resource Selection: Implications for the Design of Educational Multimedia. New York (NY): New York University; 2016.
7. Hirumi A, Horger L, Harris DM, et al. Exploring students’ [pre-pandemic] use and the impact of commercial-off-the-shelf learning platforms on students’ national licensing exam performance: a focused review – BEME Guide No. 72. Med Teach. 2022;44(7):707–719. doi:10.1080/0142159X.2022.2039380
8. Harris BHL, Walsh JL, Tayyaba S, Harris DA, Wilson DJ, Smith PE. A novel student-led approach to multiple-choice question generation and online database creation, with targeted clinician input. Teach Learn Med. 2015;27(2):182–188. doi:10.1080/10401334.2015.1011651
9. Finn E, Ayres F, Goldberg S, Hortsch M. Brave new e-world: medical students’ preferences for and usage of electronic learning resources during two different phases of their education. FASEB Bioadv. 2022;4(5):298–308. doi:10.1096/fba.2021-00124
10. Stalmeijer RE, McNaughton N, Van Mook W. Using focus groups in medical education research: AMEE Guide No 91. Med Teach. 2014;36(11):923–939.
11. Barbour RS. Making sense of focus groups. Med Educ. 2005;39(7):742–750. doi:10.1111/j.1365-2929.2005.02200.x
12. Krueger R, Casey M. Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA: Sage Publications; 2009.
13. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. doi:10.1191/1478088706qp063oa
14. Bristow A, Saunders M. Heightening awareness of research philosophy: the development of a reflexive tool for use with students. Br Acad Manag. 2014.
15. Saunders M, Lewis P, Thornhill A. Research Methods for Business Students. 7th ed. Pearson Education; 2014.
16. Fisher J, Leahy D, Lim JJ, et al. Question banks: credit? Or debit? A qualitative exploration of their use among medical students. BMC Med Educ. 2024;24:569. doi:10.1186/s12909-024-05517-9
17. Vernon M, Hawwash N, Haque E, et al. Pay to win? Exploring medical students’ use of, and access to, paid commercial educational resources. BMC Med Educ. 2025;25:738. doi:10.1186/s12909-025-07233-4
18. Roediger HL 3rd, Butler AC. The critical role of retrieval practice in long-term retention. Trends Cognit Sci. 2011;15(1):20–27. doi:10.1016/j.tics.2010.09.003
19. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–1196. doi:10.1001/jama.300.10.1181
20. Bruner J. The Process of Education. Cambridge, MA: Harvard University Press; 1960.
21. Abrahamson S. Diseases of the curriculum. J Med Educ. 1978;53(12):951–957. doi:10.1097/00001888-197812000-00001
22. Vygotsky L. Interaction between learning and development. In: Mind in Society. Cambridge, MA: Harvard University Press; 1978:79–91.
23. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press; 1991.
24. O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–1251. doi:10.1097/ACM.0000000000000388
25. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–357. doi:10.1093/intqhc/mzm042
26. Geertz C. The Interpretation of Cultures. London: Hutchinson; 1973.
27. Morgan DL. Focus Groups as Qualitative Research. Newbury Park, California: Sage; 1988.
28. Clarke V, Braun V. Thematic analysis. J Posit Psychol. 2017;12(3):297–298. doi:10.1080/17439760.2016.1262613
29. Guest G, Namey E, McKenna K. How many focus groups are enough? Building an evidence base for nonprobability sample sizes. Field Methods. 2017;29(1):3–22. doi:10.1177/1525822X16639015