Article

The Impact of AI-Based Course-Recommender System on Students’ Course-Selection Decision-Making Process

Seungeon Cha 1, Martin Loeser 2 and Kyoungwon Seo 1,*
1 Department of Applied Artificial Intelligence, Seoul National University of Science and Technology, Seoul 01811, Republic of Korea
2 Department of Computer Science, Electrical Engineering and Mechatronics, ZHAW Zurich University of Applied Sciences, 8401 Winterthur, Switzerland
* Author to whom correspondence should be addressed.
Submission received: 8 March 2024 / Revised: 17 April 2024 / Accepted: 23 April 2024 / Published: 25 April 2024

Abstract

The course-recommender system (CRS), designed to aid students’ course-selection decision-making process by suggesting courses aligned with their interests and grades, plays a crucial role in fulfilling curricular requirements, enhancing career opportunities, and fostering intellectual growth. Recent advancements in artificial intelligence (AI) have empowered CRSs to deliver personalized recommendations by considering individual contexts. However, the impact of AI-based CRSs on students’ course-selection decision-making process (in particular, the search and evaluation phases) remains an open question. Understanding student perceptions and expectations of AI-based CRSs is key to optimizing their decision-making process in course selection. For this purpose, we employed speed dating with storyboards to gather insights from 24 students on five different types of AI-based CRS. The results revealed that students expected AI-based CRSs to play an assistive role in the search phase, helping them complete time-consuming search tasks more efficiently. Conversely, during the evaluation phase, students expected AI-based CRSs to play a leading role as a benchmark to address their uncertainty about course suitability, learning value, and serendipity. These findings underscore the adaptive nature of AI-based CRSs, which adjust according to the intricacies of students’ course-selection decision-making process, fostering fruitful collaboration between students and AI.

1. Introduction

In higher education, university students’ course-selection decision-making is an important process that has a profound impact on their curricular requirements, career opportunities, and intellectual growth [1,2]. However, students often grapple with the challenge of navigating course listings, overwhelmed by the abundance of information related to each course, which makes it difficult to quantitatively assess which course aligns best with their needs and aspirations [3]. Although academic advisors can help students select appropriate courses, extending this support to every student is difficult because it demands substantial time and effort from human advisors [4]. In this context, the implementation of artificial intelligence (AI)-based course-recommender systems (CRSs) emerges as a solution to these issues [5]. These systems can automatically provide personalized course recommendations to individual students in real time, leveraging their historical data [6,7].
Historically, CRSs have predominantly relied on conventional data-driven recommendation algorithms, including content-based, collaborative filtering-based, and hybrid approaches [8]. While effective in maintaining students’ existing course-selection patterns by offering data-driven recommendations, these approaches are not without their shortcomings [9]. They can inadvertently introduce problems like filter bubbles and algorithmic bias, potentially preventing students from discovering valuable yet unpopular courses that could enrich their intellectual growth [10,11]. To prevent these issues and ensure a more balanced course-selection decision-making process, there is a growing need for a new breed of AI-based CRS that departs from the traditional data-driven model.
In recent times, efforts have intensified to address these limitations by integrating Large Language Models (LLMs), like ChatGPT, into recommender systems and exploring novel recommendation methodologies [12,13]. LLM-powered recommender systems have the potential to decipher intricate patterns within students’ textual interactions, discern nuanced preferences, and align with their learning objectives [14,15]. Leveraging the capabilities of LLMs in recommender systems is anticipated to deliver adaptive and context-aware recommendations, thereby tackling the challenges associated with evolving student needs and mitigating issues related to cold starts and data sparsity [16,17]. However, it is equally important to consider how these systems can foster meaningful collaboration between students and AI, with a focus on user-centered design [18,19]. Establishing a harmonious partnership between students and AI in the realm of education is critical for ensuring a seamless and effective personalized learning experience that aligns with the diverse needs of students [20]. From this perspective, it is important to understand the impact of AI-based CRSs on students’ course-selection decision-making process [21].
To address this need, we employed the speed dating research method [22] to expose students to different types of AI-based CRSs and subsequently assess their perceptions and expectations of AI-based CRSs during the course-selection decision-making process [23]. The results of the speed dating activity were analyzed through a thematic analysis approach [24]. We found that students wanted AI-based CRSs to retrieve, filter, and visualize high-quality information related to their courses during the course-search phase and quantify difficult-to-evaluate qualitative information during the course-evaluation phase. These findings showed that students expect AI-based CRSs to play different roles at each phase of the course-selection decision-making process.
This paper is organized as follows: Section 2 delves into the course-selection decision-making process and reviews literature related to AI-based CRSs. Section 3 outlines the experimental procedure employing the speed dating research method. Section 4 sheds light on what students expect from AI-based CRSs at different stages of the course-selection decision-making process. Finally, Section 5 presents insights into the future of AI-based CRSs, aiming to assist students in their course-selection decisions based on our research findings.

2. Related Work

This paper explores the impact of AI-based CRSs on students’ course-selection decision-making process. Building upon existing research, we established a theoretical framework to analyze students’ course-selection decision-making process. We then reviewed AI-based CRSs currently used in higher education.

2.1. Course-Selection Decision-Making Process

In higher education, a student’s course-selection decision-making involves gathering course-related information and deciding whether or not to enroll in a course [25]. This process comprises several stages: recognizing educational needs, searching for relevant information, evaluating available alternatives, and ultimately deciding on a course [26]. With the advent of the digital age, the widespread use of social media has added complexity to how students acquire, compare, and opt for courses [27]. Othman and colleagues [1] have structured university students’ course-selection decision-making process into a three-phase framework comprising the search, evaluation, and choice/decision phases, each outlined in Table 1.
During the search phase, students collect course-related information such as course content, instructor reputation, scheduling details, and more from various sources. These sources include syllabi, course descriptions, peer reviews from fellow students [29], and word-of-mouth recommendations. Moving on to the evaluation phase, students compare courses based on criteria such as expected workload, educational value, level of difficulty, and personal interest [26]. Social media also emerges as a significant resource during this phase, aiding students in assessing and comparing courses [27]. Finally, in the choice/decision phase, students make conclusive decisions about which courses to enroll in based on their assessments and evaluations.
The successful decisions students make when selecting courses are of paramount importance as they guarantee the fulfillment of academic requirements, enhance career prospects, and foster intellectual growth [1,2]. Recently, the research community has proposed CRSs as a remedy to support students in making effective and efficient course selections, therefore empowering them [30]. In particular, there has been considerable focus on utilizing automated AI applications for course recommendations within the realm of education [31].

2.2. AI-Based Course-Recommender Systems

AI-based CRSs have long been studied for their potential to aid students in making informed course-selection decisions tailored to their academic paths. Many universities now deploy AI-based CRSs; for instance, the University of California, Berkeley, offers a system named AskOski [32]. This system assists students in exploring their interests, linking course concepts across departments, and planning future semesters in accordance with their program requirements. Recent literature [7] highlights a multitude of recommendation techniques—ranging from collaborative filtering to knowledge representation, fuzzy logic, and various machine learning algorithms—along with input parameters such as learning style, objectives, preferences, and academic information, all tailored to cater to the diverse educational needs of students. For example, collaborative filtering, widely employed in AI-based CRSs, suggests courses based on community records of peers with similar characteristics [33,34]. Similarly, content-based filtering techniques aid in recommending courses aligning with students’ academic interests [35,36]. Moreover, AI-based CRSs extend their utility to suggest courses facilitating the acquisition of competencies relevant to students’ future careers [37,38]. Conversely, some AI-based CRSs focus on optimizing academic outcomes by identifying courses with the highest probability of yielding favorable results and recommending them [39]. Additionally, AI-based CRSs explore students’ latent interests by suggesting courses that are unexpected yet intriguing—a concept often termed serendipity [40].
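To make the collaborative-filtering idea described above concrete, the following Python sketch scores unseen courses for a target student from the ratings of peers with similar enrollment histories. This is a minimal illustration rather than the implementation of any system cited here; the rating matrix, cosine similarity measure, and neighborhood size are assumptions chosen for brevity.

```python
import numpy as np

def recommend_courses(ratings, student, k=3, top_n=2):
    """User-based collaborative-filtering sketch.

    ratings: 2D array (students x courses); 0 means 'not taken'.
    student: row index of the target student.
    Returns indices of the top_n unseen courses, scored by the k peers
    whose rating vectors are most similar (cosine similarity).
    """
    target = ratings[student]
    sims = []
    for peer, row in enumerate(ratings):
        if peer == student:
            continue
        denom = np.linalg.norm(target) * np.linalg.norm(row)
        sims.append((row @ target / denom if denom else 0.0, peer))
    neighbours = [p for _, p in sorted(sims, reverse=True)[:k]]

    scores = ratings[neighbours].mean(axis=0)   # average peer rating per course
    scores[target > 0] = -np.inf                # never re-recommend taken courses
    return np.argsort(scores)[::-1][:top_n]

# Toy example: 4 students x 5 courses, ratings 1-5 (0 = not taken).
ratings = np.array([
    [5, 4, 0, 0, 1],
    [4, 5, 3, 0, 0],
    [0, 4, 5, 4, 0],
    [1, 0, 4, 5, 5],
])
print(recommend_courses(ratings, student=0))  # courses rated highly by similar peers
```

A content-based variant would replace the peer-rating matrix with similarity between course descriptions and the student’s academic-interest profile, as in the systems cited above [35,36].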
Although various forms of AI-based CRSs have been developed, their impact in adequately supporting students’ course-selection decision-making process remains uncertain. The crux lies in comprehending students’ perceptions of these AI-based CRSs and pinpointing areas where their expectations are not met during the course-selection decision-making journey. In this context, Zawacki-Richter and colleagues [31] emphasized the necessity of investigating how real users (i.e., students) perceive AI applications within higher education. Similarly, Lee and colleagues [41] discovered that students emphasize academic interests in choosing elective courses and require clear keyword information about courses due to often vague titles. Nevertheless, gaps persist in our comprehension of how AI-based CRSs influence students’ decision-making processes. To address these concerns, we employed speed dating with storyboards, an exploratory research method [22], to expose students to the aforementioned AI-based CRSs [42]. This approach facilitated a deeper exploration of how students envision AI-based CRSs augmenting their course-selection decision-making process at each phase.

3. Materials and Methods

In this study, we employed the speed dating research method, where students are presented with potential future scenarios through a four-cut storyboard [22]. This approach is valuable as it allows students to explore various AI-based CRS scenarios without the need for actual implementation [42]. By exposing students to diverse AI-based CRSs, we aimed to uncover their perceptions and expectations at different phases of the course-selection decision-making process, including the search and evaluation phases [23]. Through this investigation, we sought to discern the impact of AI-based CRSs on students’ course-selection decision-making process. Overall, we address the following research question: “What are the perceptions and expectations of students regarding AI-based CRSs during their course-selection decision-making process, specifically in the search and evaluation phases?”

3.1. Creating Storyboards

To develop technically viable and effective storyboards for AI-based CRSs in higher education, we organized an online brainwriting session [43]. This session involved four researchers, all faculty members, with an average of 13.3 years (SD = 5.4 years) of experience in learning science. They collaboratively developed scenarios for potential AI-based CRSs in a Google Slides file, building upon the previous studies outlined in Section 2. The scenarios were iteratively exchanged and refined among the researchers seven times, ensuring a consensus on their technical feasibility and potential effectiveness for course recommendations.
Furthermore, to validate these initial scenarios, we conducted semi-structured interviews with six AI experts who had an average of 18.6 years (SD = 6.3 years) of research experience. These interviews were facilitated via Zoom by the corresponding author. Each scenario was presented to the AI experts, who were then prompted with the questions: “Could you enhance this scenario to ensure its technical feasibility?” and “Could you refine this scenario to enhance course recommendations for students based on your AI research experience?” Following the presentation of all scenarios, the AI experts were asked: “Do you have any suggestions for new scenarios?” Adjustments were made to the original scenarios based on the feedback received, aligning them with expert insights. On average, these interviews lasted 32 min (SD = 5.6 min), and each expert received a compensation of 100 USD for their participation.
As shown in Table 2, five scenarios outlining AI-based CRSs for guiding students through the course-selection decision-making process, including the search and evaluation phases, were derived. Four scenarios (Scenarios 1, 2, 3, and 4) closely reflect the representative AI-based CRSs identified in Section 2. The Serendipity-based CRS (Scenario 5) was created based on insights from the AI experts’ research. It is important to note that these five scenarios were not intended to comprehensively cover all AI-based CRSs in higher education or to systematically address all related topics. Rather, their purpose was to delve into students’ course-selection decision-making processes and their perceptions and expectations of AI-based CRSs.
We generated four-cut storyboards based on the scenarios presented in Table 2. As demonstrated in Figure 1, an example storyboard is shown, providing a visual depiction of a scenario with accompanying captions. To address concerns related to gender and ethnic biases and to enhance participants’ identification with the characters, we employed a uniform visual style characterized by flat cartoon shading [23,42]. For a comprehensive repository of AI-based CRS storyboards, please visit https://osf.io/bwg7c/?view_only=8b5122454ced444fa53fad3940f929d6 (accessed on 7 March 2024).

3.2. Speed Dating

3.2.1. Participants

Next, we conducted a speed dating activity using storyboards, for which we recruited 24 participants (see Table 3) from the university community. For diversity, we recruited students from different years of study (i.e., seven freshmen, nine sophomores, five juniors, and three seniors) and 16 different majors (e.g., electronics, nursing, education, economics, and computer science). We required participants to have at least two semesters of course-selection decision-making experience at the university to maintain consistency in expectations and experiences regarding course recommendations. However, we did not require any prior knowledge of AI from the participants, as our aim was to capture their unfiltered experiences and expectations for AI-based CRSs. Additionally, previous studies showed that the speed dating activity works well without prior knowledge of or experience with AI [42]. Prior to participation, informed consent was obtained from each individual, and participants were remunerated with a coffee voucher worth 10 USD for their time. The Institutional Review Board at Seoul National University of Science and Technology approved the entire process.

3.2.2. Procedure

Figure 2 illustrates the overall process of our experiment. We conducted semi-structured interviews with participants through a video conferencing platform (i.e., Zoom). We briefly introduced the purpose of this study and then started the speed dating activity using storyboards. We showed participants each storyboard in random order. Each storyboard describes a possible interaction in which an AI-based CRS supports a student’s course-selection decision. This setup allowed participants to indirectly experience how five distinct types of AI-based CRSs facilitate decision-making. After participants read each storyboard aloud, we asked the following questions to investigate their thoughts on the specific AI-based CRS represented in that storyboard: “How do you think this AI-based CRS will affect your course-selection decision-making?” and “Do you have any expectations or concerns about this AI-based CRS?” After participants had reviewed all five storyboards, wrap-up questions were asked to elicit general expectations for AI-based CRSs: “Which AI-based CRS do you think will help students’ course-selection decision-making?” and “How do you think AI-based CRSs will change students’ course-selection decision-making?” All the questions were open-ended. Furthermore, we asked follow-up questions such as “Why did you think this scenario would help your course-selection decision-making?” after initial responses were given. These probing questions aimed to deepen participant engagement and elicit more detailed insights. The entire interview lasted 30.1 min on average (SD = 6.9 min), with 4–6 min spent sharing each storyboard and probing participants on its specific implications.

3.2.3. Data Analysis

Each interview was audio recorded and then transcribed using software (i.e., NAVER CLOVA Note 1.9.0) for analysis. The transcribed interview data were analyzed by two authors (first author and corresponding author) using the thematic analysis approach [24]. After becoming familiar with the interview data, the first and corresponding authors generated an initial set of semantic codes associated with interesting statements or phrases from the data. The two authors then coded the entire interview dataset via Google Docs to identify additional patterns through an extended investigation of the dataset. Any conflicts that occurred during this coding process were resolved through discussions between the two authors. The two authors came up with 12 final codes through a total of four discussions. Then, all three authors had three iterative discussions to derive six final themes from 12 codes. These themes, which represent students’ perceptions and expectations of AI-based CRSs, were categorized according to the students’ phase of the course-selection decision-making process [1]. Specifically, in the course-search phase, students expressed expectations for (1) Subject-based course retrieval, (2) Social information filtering, and (3) Course-related text visualization from the AI-based CRS. Conversely, in the course-evaluation phase, students anticipated (4) Course suitability analysis, (5) Value proposition, and (6) Serendipity.

4. Results

Our findings revealed that students prefer to actively engage with the AI-based CRS, utilizing it for both information search and evaluation rather than merely receiving passive course recommendations [28,44]. Specifically, during the search phase, students anticipate that the AI-based CRS will effectively retrieve, filter, and visualize high-quality course-related information, thereby enhancing their personalized course searches. In the evaluation phase, students expect the AI-based CRS to quantitatively assess difficult-to-evaluate qualitative aspects such as course suitability, learning value, and serendipity. Table 4 provides a summary of students’ expectations and responses to the AI-based CRSs at each phase of the course-selection decision-making process.

4.1. Search Phase

During the course-selection decision-making process, the search phase involves students searching for information about courses from various sources such as syllabi, course descriptions, peer reviews, and word of mouth. In this phase, students anticipate that the AI-based CRS will assist them in efficiently locating desired courses, filtering out unreliable social information pertaining to the courses, and analyzing and visualizing textual information, as outlined below.

4.1.1. Subject-Based Course Retrieval

Students emphasized the necessity for ‘subject-based course retrieval’ to streamline the process of finding desired courses. Presently, most course enrollment systems rely on title-based course retrieval methods, which students find limiting. As one participant noted, “the title and content of the course often do not match (S23)”, making it challenging to identify courses solely based on titles. Another student (S24) expressed frustration, stating, “It is difficult to find a course of interest only by looking at the title alone”. Similarly, students S13 and S17 recounted instances where they mistakenly enrolled in courses with content vastly different from their expectations based on the course title. To avoid such errors, students S15 and S20 stressed the importance of thoroughly reading through the course syllabus. Consequently, most students (17 out of 24) expressed their expectation that an AI-based CRS offering subject-based course retrieval would significantly streamline the search process, saving them considerable time and effort in identifying courses of interest.
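As an illustration of how subject-based retrieval could work, the sketch below matches a free-text subject query against course descriptions rather than titles, using TF-IDF similarity from scikit-learn. The catalog entries are invented for this example; a real CRS would index actual syllabi and far richer course text.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical catalog where titles alone reveal little about content.
courses = {
    "Topics in Computing I": "neural networks, deep learning, and model evaluation",
    "Seminar in Modern Society": "social media analytics and online communities",
    "Special Lecture B": "reinforcement learning agents for games and robotics",
}

def search_by_subject(query, catalog, top_n=2):
    """Rank courses by TF-IDF cosine similarity between the query and descriptions."""
    names = list(catalog)
    vectorizer = TfidfVectorizer()
    doc_vectors = vectorizer.fit_transform([query] + [catalog[n] for n in names])
    scores = cosine_similarity(doc_vectors[0], doc_vectors[1:]).ravel()
    return sorted(zip(names, scores), key=lambda pair: pair[1], reverse=True)[:top_n]

print(search_by_subject("deep learning", courses))
```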

4.1.2. Social Information Filtering

Students expressed a desire for the AI-based CRS to effectively filter out unreliable social information pertaining to courses. While many students (20 out of 24) rely on social cues such as word of mouth or peer assessments when searching for courses, they harbor concerns regarding the reliability of such information. For instance, one student (S6) voiced apprehension, noting that while some students provide genuine course reviews, others may offer unhelpful or misleading information. Additionally, student S21 cautioned against bias in social information, highlighting instances where negative assessments might stem from personal grievances rather than objective evaluations. Students anticipated that if the AI-based CRS could effectively screen out untrustworthy social information, they would be less likely to be misled during their course searches.

4.1.3. Course-Related Text Visualization

Students expressed a desire for the AI-based CRS to utilize natural language processing technology to analyze course-related text information and present the results in a visual format. Many students (14 out of 24) voiced concerns about the overwhelming volume of textual information they encountered during the course-search process, including details such as course objectives, teaching methods, assessment criteria, and prerequisites. One student (S14) remarked, “It is difficult to comprehend course-related information because most of it consists of lengthy text passages”. In response to this challenge, another student (S20) suggested that if AI could transform complex textual information into visual representations such as tables or figures, it would significantly facilitate the course-search process.
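A minimal sketch of what turning course text into a tabular view might look like is shown below: a few recurring fields are pulled out of free-text descriptions and arranged as a table. The field names, example descriptions, and simple pattern matching are assumptions for illustration only; a production system would rely on more robust natural language processing.

```python
import re
import pandas as pd

# Hypothetical free-text course descriptions.
descriptions = {
    "Intro to Statistics": "Assessment: 40% exams, 60% projects. Prerequisite: none. Team project required.",
    "Applied NLP": "Assessment: 100% projects. Prerequisite: Intro to Programming. Weekly labs.",
}

def extract_fields(text):
    """Pull a couple of recurring fields out of a free-text course description."""
    assessment = re.search(r"Assessment:\s*([^.]+)", text)
    prerequisite = re.search(r"Prerequisite:\s*([^.]+)", text)
    return {
        "assessment": assessment.group(1).strip() if assessment else "n/a",
        "prerequisite": prerequisite.group(1).strip() if prerequisite else "n/a",
    }

# Rows are courses, columns are the extracted fields.
table = pd.DataFrame({name: extract_fields(text) for name, text in descriptions.items()}).T
print(table)
```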

4.2. Evaluation Phase

In the course-selection decision-making process, the evaluation phase involves students assessing the courses in which they intend to enroll and weighing the costs and benefits based on the information gathered during the search phase [1]. During this phase, students envisioned the AI-based CRS serving as a benchmark to quantitatively compare the suitability, value, and serendipitous elements of courses that are challenging to evaluate independently.

4.2.1. Course Suitability Analysis

Students expressed a desire to quantitatively compare the suitability of various courses selected during the search phase across multiple dimensions. Several students (11 out of 24) emphasized that they did not want a course to be recommended solely based on its popularity among peers or its high ratings from historical data. Instead, they wished for the AI-based CRS to quantitatively evaluate the suitability of courses for them based on factors such as their background knowledge (S20), the instructor’s characteristics (S19), preferred lecture style (S14), and personal learning preferences (S15). In particular, student S20 believed that a quantitative analysis of these qualitative factors would greatly assist in making objective decisions about which course to select from the available options.
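One simple way to operationalize the quantitative suitability comparison students asked for is a weighted score over the dimensions they mentioned (background knowledge, instructor fit, lecture style, and learning preferences). The dimensions, weights, and per-course scores in the sketch below are purely illustrative; in practice, each dimension score would come from an upstream model or from student-provided preferences.

```python
# Hypothetical per-dimension fit scores (0-1) for each candidate course.
candidate_courses = {
    "Data Structures":    {"background": 0.9, "instructor": 0.7, "style": 0.6, "preference": 0.8},
    "Machine Learning":   {"background": 0.5, "instructor": 0.9, "style": 0.8, "preference": 0.9},
    "Numerical Analysis": {"background": 0.8, "instructor": 0.6, "style": 0.5, "preference": 0.4},
}

# Weights a student (or the CRS) might assign to each dimension; they sum to 1.
weights = {"background": 0.3, "instructor": 0.2, "style": 0.2, "preference": 0.3}

def suitability(scores, weights):
    """Weighted sum of per-dimension fit scores."""
    return sum(weights[d] * scores[d] for d in weights)

ranked = sorted(candidate_courses.items(),
                key=lambda item: suitability(item[1], weights),
                reverse=True)
for name, scores in ranked:
    print(f"{name}: {suitability(scores, weights):.2f}")
```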

4.2.2. Value Proposition

Even if a course may not align perfectly with their immediate needs, students expressed a desire to receive recommendations for courses that hold value for their future careers or skill enhancement. Some students (9 out of 24) highlighted the challenge of quantitatively assessing the value of specific courses. Consequently, students emphasized the importance of AI-based CRSs being able to quantify the potential benefits of a particular course for their future career trajectories and skill development. For example, student S8 remarked, “Even if I have a clear career goal, it’s difficult to determine which courses are necessary for staying competitive. In this regard, it would be beneficial if AI could guide me on the competencies I can gain from specific courses”. Regarding skill enhancement, student S4 noted, “If AI can indicate the value of prerequisite courses essential for advanced upper-level courses, it would aid in selecting courses at lower levels”. Students anticipated that this value proposition would assist them in making informed decisions, even if the recommended courses were not their top preferences.

4.2.3. Serendipity

Students expressed a desire for the AI-based CRS to consider unexpected enjoyment, known as serendipity, when evaluating courses. Several students (13 out of 24) emphasized the importance of not only evaluating the suitability and value of a course but also considering serendipitous opportunities. For instance, students anticipated that serendipitous course recommendations could “provide an opportunity to expand one’s interests and explore new areas (S12)” or even facilitate “the discovery of previously unknown areas of interest (S16 and S21)”.
In conclusion, students anticipate that AI-based CRSs will assist them in navigating extensive course lists by offering personalized course searches through features like retrieving, filtering, and visualizing high-quality course-related information. Additionally, students envision AI-based CRSs playing a pivotal role in facilitating their final course-selection decisions by evaluating and comparing nuanced factors such as suitability, learning value, and serendipitous enjoyment of potential courses. The theoretical and practical implications of these findings will be elaborated upon in the following section.

4.3. Students’ Preferences

Table 5 illustrates the distribution of students’ preferences for various AI-based CRSs based on their responses to the question: “Which AI-based CRS do you think will help students’ course-selection decision-making?” Notably, approximately half of the students (11 out of 24) preferred the Career interest-based CRS, with the Community-based CRS ranking a close second, chosen by 10 students. Both the Academic interest-based CRS and Serendipity-based CRS were selected by 8 students each, whereas the Grade prediction-based CRS was the least favored, chosen by only 5 students. It is important to highlight that participants had the option to choose more than one CRS, resulting in a total of 42 preferences, which exceeds the actual number of participants (24).

5. Discussion and Conclusions

In this study, we explored how AI-based CRSs influence students’ course-selection process to identify the most effective support for their decision-making. In the search phase, students anticipated the AI-based CRS to retrieve, filter, and visualize course-related information to facilitate personalized course searches. In the evaluation phase, students expected the AI-based CRS to quantify difficult-to-evaluate qualitative information, including course suitability, learning value, and serendipity. These findings underscore the variability in students’ interactions with AI-based CRSs based on the particular phase of their course-selection decision-making process.
Interestingly, students anticipated a change in the role of the AI-based CRS corresponding to each phase of the course-selection decision-making process, namely the search and evaluation phases. For instance, research by Chang and colleagues [45] highlights a personalized hybrid CRS designed to dynamically adapt to the diverse needs of students throughout their course-selection journey. During the search phase, students envisioned the AI-based CRS playing an assistive role in gathering course-related information. This limited assistive role arose from concerns regarding the AI’s ability to fully grasp complex factors, such as academic interests and preferences, based solely on semiannual course history data. Students expressed skepticism about fully automated recommendations during this phase, fearing they might not accurately reflect their evolving academic contexts, reminiscent of the ‘cold-start problem’ in AI. Students believed that the search phase could be improved when the AI-based CRS assisted by retrieving, filtering, and visualizing relevant data from diverse, text-heavy sources, with students taking the lead in utilizing this information.
Conversely, during the evaluation phase, students envisioned the AI-based CRS taking on a more prominent role, serving as a benchmark for quantitatively analyzing difficult-to-evaluate information. This expectation stemmed from their difficulty in objectively weighing the advantages and disadvantages of various courses, which often eroded their confidence. Research by Chong and colleagues [46] suggests that individuals tend to lean more on AI recommendations when their confidence in decision-making wanes. Consequently, students who felt overwhelmed during the evaluation phase tended to rely more on AI suggestions, expecting the AI-based CRS to assume a primary role. Students believed that AI’s quantified assessments would assist them in overcoming uncertainties in assessing course suitability, learning value, and serendipity. In summary, to effectively and efficiently support students in course selection, the AI-based CRS must adapt its role throughout the students’ decision-making process, acting as an assistant during the search phase and as a leader in the evaluation phase.
As indicated in Table 5, the distribution of students’ preferences shows no discernible pattern, reflecting the diverse backgrounds and unique needs of each student. This observation underscores the significance of recognizing individual variations in preferences and needs when providing personalized course recommendations rather than adopting a one-size-fits-all approach. Consequently, this study emphasizes the necessity for future AI-based CRSs to incorporate these insights, enabling them to offer tailored course selections that align with the specific preferences and requirements of each student.

5.1. Theoretical Implications

This study offers theoretical insights into how students can cultivate a symbiotic relationship with AI-based CRSs throughout their course-selection decision-making process, encompassing both the search and evaluation phases [1]. While many studies have delved into the implementation of AI-based CRSs in university settings, most systems are tailored to the specific needs of individual institutions, leaving a gap in understanding the broader support required for student course-selection decisions. Setting itself apart from other technology-centric research, this study underscores the importance of AI-based CRSs adapting their function to align with each phase of the course-selection process, thus fostering effective collaboration between humans and AI [20]. Future research endeavors should explore how AI-based CRSs can recognize different stages in a student’s course-selection journey and adjust their role accordingly to enhance decision-making support.

5.2. Practical Implications

This study, taking into account advancements in LLMs like ChatGPT, presents several practical implications. Our research highlights the need for AI systems to go beyond merely reacting to explicit student requests and to engage proactively during both the search and evaluation phases to provide genuinely personalized support. First, in the course-search phase, AI-based CRSs could benefit from offering a chat-based interactive platform to assist students. Rather than simply presenting a list of courses, the system could engage in conversational interactions to elucidate course topics, thereby aiding students in autonomously discovering courses that align with their interests. Through these narrative-driven recommendations [17], the AI would serve an assistive role by helping students sift through information, correct misinformation, highlight crucial aspects of courses, and summarize key points, thus empowering students to navigate their course search autonomously.
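As a sketch of how such an assistive, chat-based turn might be assembled, the function below composes a prompt that asks an LLM to summarize candidate courses, flag questionable information, and pose a clarifying question rather than dictate a choice. The prompt wording is an assumption, and the client call in the final comment is hypothetical; the actual model invocation depends on the LLM provider an institution adopts.

```python
def build_search_prompt(student_query, course_snippets):
    """Compose an assistive (not prescriptive) prompt for one course-search chat turn.

    student_query: what the student typed, e.g. "I want something hands-on about NLP".
    course_snippets: dict mapping course titles to short syllabus excerpts.
    """
    catalog = "\n".join(f"- {title}: {text}" for title, text in course_snippets.items())
    return (
        "You are a course-search assistant. Do not pick a course for the student.\n"
        "Instead, summarize each candidate in one sentence, flag information that "
        "looks unreliable or missing, and ask one clarifying question.\n\n"
        f"Student request: {student_query}\n\nCandidate courses:\n{catalog}"
    )

# The prompt would then be sent to whichever LLM the institution uses, e.g.:
# reply = llm_client.complete(build_search_prompt(query, snippets))  # hypothetical client
```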
Second, during the evaluation phase, the AI-based CRS should take on a more leading role, prioritizing explainability. Providing transparent and comprehensible explanations throughout the AI-assisted decision-making process can mitigate uncertainties and bolster confidence in the system [47]. For instance, in the scenario involving a Career interest-based CRS, the AI could explain to students: “This course is recommended because it aligns with your career interests in machine learning engineering, which could significantly benefit your future career path based on your current academic and extracurricular profile”. As students interact with the AI-based CRS’s reasoning during evaluations, they will have the chance to develop their own perspectives and improve their capacity to assess and select courses autonomously [48].
Thirdly, AI-based CRSs should provide unexpected or serendipitous course suggestions. This strategy tackles the filter bubble problem by urging students, especially those uncertain about their academic path, to venture into new domains [48]. These recommendations have the potential to expand students’ perspectives, exposing them to unfamiliar subjects and revealing interests they may not have previously considered. Future research endeavors should strive to identify the key characteristics of serendipitous recommendations and assess the capacity of LLMs to deliver such suggestions reliably and precisely, avoiding the propagation of misleading or erroneous information.
There are several limitations that should be considered when interpreting our findings. First, the outcomes may vary based on participants’ socio-demographic factors, such as their university affiliations, countries of residence, or other specific conditions. Future research should explore how these factors influence students’ preferences and expectations regarding AI-based CRSs. Additionally, characteristics of the courses themselves, such as the academic domain, might affect the results. For instance, further studies could examine how CRS designs need to be customized for different fields, such as Engineering versus Medicine, to enhance their applicability and effectiveness. Second, while this research encompassed various types of AI-based CRS, further exploration into additional types of systems could provide a more comprehensive understanding of how students navigate their course-selection decisions. Thirdly, although speed dating with storyboards proved effective in assessing user experience, it may not fully capture the intricacies of real-time interactions with AI-based CRSs in authentic educational environments. Therefore, future research endeavors should focus on developing and empirically testing genuine AI-based CRS models. Lastly, this study does not address the evolution of students’ expectations regarding AI-based CRSs over time. As students become more accustomed to interacting with AI in their course-selection process, their expectations and reliance on AI may change. Future studies should investigate how the relationship between students and AI in the course-selection process evolves over extended durations.
Despite the outlined limitations, this study has yielded intriguing insights into students’ expectations regarding AI-based CRSs during their course-selection decision-making process, encompassing both the search and evaluation phases. During the search phase, students prefer the AI-based CRS to play an assisting role, aiding in retrieving, filtering, and visualizing complex data to streamline exhaustive search tasks. Conversely, in the evaluation phase, students envision the AI-based CRS assuming a more leading role, serving as a benchmark for quantifying difficult-to-evaluate factors such as course suitability, learning value, and the potential for serendipitous discoveries. These findings shed light on the dynamic nature of student interactions with AI-based CRSs, underscoring the importance of these systems adapting their roles to align with each phase of the course-selection decision-making process. Leveraging advanced LLMs, future AI-based CRSs can be tailored to meet these student expectations, potentially fostering a cohesive and effective partnership between students and AI in the realm of course-selection decision-making.

Author Contributions

Conceptualization, S.C. and K.S.; methodology, S.C. and K.S.; validation, S.C., M.L. and K.S.; investigation, S.C.; resources, K.S.; data curation, M.L.; writing—original draft preparation, S.C.; writing—review and editing, M.L. and K.S.; visualization, S.C.; supervision, K.S.; project administration, K.S. All authors have read and agreed to the published version of the manuscript.

Funding

This study was financially supported by Seoul National University of Science and Technology (grant number n/a).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of Seoul National University of Science and Technology on 17 January 2023, approval number 2022-0020-01.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The original contributions presented in the study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Othman, M.H.; Mohamad, N.; Barom, M.N. Students’ decision making in class selection and enrolment. Int. J. Educ. Manag. 2019, 33, 587–603. [Google Scholar] [CrossRef]
  2. Morrow, T. Personalizing Education with Algorithmic Course Selection; Missouri University of Science and Technology: Rolla, MO, USA, 2017. [Google Scholar]
  3. Mourey, J.A.; Markley, M.M.; Koernig, S.K. Dazzling descriptions and tantalizing titles: How simple versus complex course information influences course selection. J. Mark. Educ. 2022, 44, 100–112. [Google Scholar] [CrossRef]
  4. Li, Q.; Kim, J. A deep learning-based course recommender system for sustainable development in education. Appl. Sci. 2021, 11, 8993. [Google Scholar] [CrossRef]
  5. Kim, T.; Lim, J. Developing an Intelligent Recommendation System for Non-Information and Communications Technology Major University Students. Appl. Sci. 2023, 13, 12774. [Google Scholar] [CrossRef]
  6. Wu, B.; Liu, L. Personalized Hybrid Recommendation Algorithm for MOOCs Based on Learners’ Dynamic Preferences and Multidimensional Capabilities. Appl. Sci. 2023, 13, 5548. [Google Scholar] [CrossRef]
  7. Wu, Z.; Liang, Q.; Zhan, Z. Course Recommendation Based on Enhancement of Meta-Path Embedding in Heterogeneous Graph. Appl. Sci. 2023, 13, 2404. [Google Scholar] [CrossRef]
  8. Guruge, D.B.; Kadel, R.; Halder, S.J. The state of the art in methodologies of course recommender systems—A review of recent research. Data 2021, 6, 18. [Google Scholar] [CrossRef]
  9. Boratto, L.; Fenu, G.; Marras, M. The effect of algorithmic bias on recommender systems for massive open online courses. In European Conference on Information Retrieval; Springer: Cham, Switzerland, 2019; pp. 457–472. [Google Scholar]
  10. Natarajan, S.; Vairavasundaram, S.; Natarajan, S.; Gandomi, A.H. Resolving data sparsity and cold start problem in collaborative filtering recommender system using linked open data. Expert Syst. Appl. 2020, 149, 113248. [Google Scholar] [CrossRef]
  11. Lika, B.; Kolomvatsos, K.; Hadjiefthymiades, S. Facing the cold start problem in recommender systems. Expert Syst. Appl. 2014, 41, 2065–2073. [Google Scholar] [CrossRef]
  12. Bang, J.; Lee, B.T.; Park, P. Examination of Ethical Principles for LLM-Based Recommendations in Conversational AI. In Proceedings of the 2023 International Conference on Platform Technology and Service (PlatCon), Busan, Republic of Korea, 16–18 August 2023; pp. 109–113. [Google Scholar]
  13. Acharya, A.; Singh, B.; Onoe, N. LLM Based Generation of Item-Description for Recommender system. In Proceedings of the 17th ACM Conference on Recommender Systems, Singapore, 18–22 September 2023; pp. 1204–1207. [Google Scholar]
  14. Gao, Y.; Sheng, T.; Xiang, Y.; Xiong, Y.; Wang, H.; Zhang, J. Chat-rec: Towards interactive and explainable LLM-augmented recommender system. arXiv 2023, arXiv:2303.14524. [Google Scholar]
  15. Yin, B.; Xie, J.; Qin, Y.; Ding, Z.; Feng, Z.; Li, X.; Lin, W. Heterogeneous Knowledge Fusion: A Novel Approach for Personalized Recommendation via LLM. In Proceedings of the 17th ACM Conference on Recommender Systems, Singapore, 18–22 September 2023; pp. 599–601. [Google Scholar]
  16. Sanner, S.; Balog, K.; Radlinski, F.; Wedin, B.; Dixon, L. Large Language Models are Competitive Near Cold-start Recommenders for Language- and Item-based Preferences. In Proceedings of the 17th ACM Conference on Recommender Systems, Singapore, 18–22 September 2023; pp. 890–896. [Google Scholar]
  17. Mysore, S.; McCallum, A.; Zamani, H. Large Language Model Augmented Narrative Driven Recommendations. arXiv 2023, arXiv:2306.02250. [Google Scholar]
  18. Rastogi, C.; Tulio Ribeiro, M.; King, N.; Nori, H.; Amershi, S. Supporting human-ai collaboration in auditing LLM with LLM. In Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society, Montréal, QC, Canada, 8–10 August 2023; pp. 913–926. [Google Scholar]
  19. Wu, T.; Terry, M.; Cai, C.J. Ai chains: Transparent and controllable human-ai interaction by chaining large language model prompts. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 30 April–5 May 2022; pp. 1–22. [Google Scholar]
  20. Holstein, K.; Aleven, V. Designing for human–AI complementarity in K-12 education. AI Mag. 2022, 43, 239–248. [Google Scholar] [CrossRef]
  21. Rivera, A.C.; Tapia-Leon, M.; Lujan-Mora, S. Recommendation systems in education: A systematic mapping study. In International Conference on Information Technology & Systems; Springer: Cham, Switzerland, 2018; pp. 937–947. [Google Scholar]
  22. Zimmerman, J.; Forlizzi, J. Speed dating: Providing a menu of possible futures. She Ji J. Des. Econ. Innov. 2017, 3, 30–50. [Google Scholar] [CrossRef]
  23. Jin, S.H.; Im, K.; Yoo, M.; Roll, I.; Seo, K. Supporting students’ self-regulated learning in online learning using artificial intelligence applications. Int. J. Educ. Technol. High. Educ. 2023, 20, 37. [Google Scholar] [CrossRef]
  24. Nowell, L.S.; Norris, J.M.; White, D.E.; Moules, N.J. Thematic analysis: Striving to meet the trustworthiness criteria. Int. J. Qual. Methods 2017, 16, 1609406917733847. [Google Scholar] [CrossRef]
  25. Babad, E. Students’ course selection: Differential considerations for first and last course. Res. High. Educ. 2001, 42, 469–492. [Google Scholar] [CrossRef]
  26. Babad, E.; Tayeb, A. Experimental analysis of students’ course selection. Br. J. Educ. Psychol. 2003, 73, 373–393. [Google Scholar] [CrossRef] [PubMed]
  27. Towers, A.; Towers, N. Re-evaluating the postgraduate students’ course selection decision making process in the digital era. Stud. High. Educ. 2020, 45, 1133–1148. [Google Scholar] [CrossRef]
  28. Seo, K.; Dodson, S.; Harandi, N.M.; Roberson, N.; Fels, S.; Roll, I. Active learning with online video: The impact of learning context on engagement. Comput. Educ. 2021, 165, 104132. [Google Scholar] [CrossRef]
  29. Coleman, J.; McKeachie, W.J. Effects of instructor/course evaluations on student course selection. J. Educ. Psychol. 1981, 73, 224. [Google Scholar] [CrossRef]
  30. Deschênes, M. Recommender systems to support learners’ Agency in a Learning Context: A systematic review. Int. J. Educ. Technol. High. Educ. 2020, 17, 50. [Google Scholar] [CrossRef]
  31. Zawacki-Richter, O.; Marín, V.I.; Bond, M.; Gouverneur, F. Systematic review of research on artificial intelligence applications in higher education–where are the educators? Int. J. Educ. Technol. High. Educ. 2019, 16, 39. [Google Scholar] [CrossRef]
  32. Xu, L.; Pardos, Z.A.; Pai, A. Convincing the Expert: Reducing Algorithm Aversion in Administrative Higher Education Decision-making. In Proceedings of the Tenth ACM Conference on Learning @ Scale, Copenhagen, Denmark, 20–22 July 2023; pp. 215–225. [Google Scholar]
  33. Jena, K.K.; Bhoi, S.K.; Malik, T.K.; Sahoo, K.S.; Jhanjhi, N.Z.; Bhatia, S.; Amsaad, F. E-learning course recommender system using collaborative filtering models. Electronics 2023, 12, 157. [Google Scholar] [CrossRef]
  34. Niu, Y.; Lin, R.; Xue, H. Research on Learning Resource Recommendation Based on Knowledge Graph and Collaborative Filtering. Appl. Sci. 2023, 13, 10933. [Google Scholar] [CrossRef]
  35. Chen, Y.; Fu, A.; Lee, J.J.L.; Tomasik, I.W.; Kizilcec, R.F. Pathways: Exploring Academic Interests with Historical Course Enrollment Records. In Proceedings of the Ninth ACM Conference on Learning @ Scale, New York, NY, USA, 1–3 June 2022; pp. 222–233. [Google Scholar]
  36. Morsomme, R.; Alferez, S.V. Content-Based Course Recommender System for Liberal Arts Education. In Proceedings of the 12th International Conference on Educational Data Mining, Montreal, QC, Canada, 2–5 July 2019; pp. 748–753. [Google Scholar]
  37. Clemente, J.; Yago, H.; de Pedro-Carracedo, J.; Bueno, J. A proposal for an adaptive Recommender System based on competences and ontologies. Expert Syst. Appl. 2022, 208, 118171. [Google Scholar] [CrossRef]
  38. Yago, H.; Clemente, J.; Rodriguez, D. Competence-based recommender systems: A systematic literature review. Behav. Inf. Technol. 2018, 37, 958–977. [Google Scholar] [CrossRef]
  39. Premalatha, M.; Viswanathan, V.; Čepová, L. Application of semantic analysis and LSTM-GRU in developing a personalized course recommendation system. Appl. Sci. 2022, 12, 10792. [Google Scholar] [CrossRef]
  40. Pardos, Z.A.; Jiang, W. Designing for serendipity in a university course recommender system. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge, Frankfurt am Main, Germany, 25–27 March 2020; pp. 350–359. [Google Scholar]
  41. Lee, J.S.; Moon, K.B.; Han, S.Y.; Lee, S.K.; Kwon, H.J.; Han, J.H.; Kim, G.T. Development and application of an AI-Powered adaptive course recommender system in higher education: An example from K university. J. Educ. Technol. 2021, 37, 267–307. [Google Scholar] [CrossRef]
  42. Seo, K.; Tang, J.; Roll, I.; Fels, S.; Yoon, D. The impact of artificial intelligence on learner–instructor interaction in online learning. Int. J. Educ. Technol. High. Educ. 2021, 18, 54. [Google Scholar] [CrossRef]
  43. Linsey, J.S.; Becker, B. Effectiveness of brainwriting techniques: Comparing nominal groups to real teams. In Design Creativity 2010; Springer: Berlin/Heidelberg, Germany, 2011; pp. 165–171. [Google Scholar]
  44. Park, M.; Seo, K. Passive vs. Active: Which Behavior Is Better for Predicting At-Risk Students in Online Learning? In Proceedings of the Korean HCI Society Annual Conference, Kangwon, Republic of Korea, 1–3 February 2023; pp. 815–821. [Google Scholar]
  45. Chang, H.T.; Lin, C.Y.; Jheng, W.B.; Chen, S.H.; Wu, H.H.; Tseng, F.C.; Wang, L.C. AI, Please Help Me Choose a Course. Educ. Technol. Soc. 2023, 26, 203–217. [Google Scholar]
  46. Chong, L.; Zhang, G.; Goucher-Lambert, K.; Kotovsky, K.; Cagan, J. Human confidence in artificial intelligence and in themselves: The evolution and impact of confidence on adoption of AI advice. Comput. Hum. Behav. 2022, 127, 107018. [Google Scholar] [CrossRef]
  47. Zhang, Y.; Liao, Q.V.; Bellamy, R.K. Effect of confidence and explanation on accuracy and trust calibration in AI-assisted decision making. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, Barcelona, Spain, 27–30 January 2020; pp. 295–305. [Google Scholar]
  48. Ma, B.; Lu, M.; Taniguchi, Y.; Konomi, S.I. CourseQ: The impact of visual and interactive course recommendation in university environments. Res. Pract. Technol. Enhanc. Learn. 2021, 16, 18. [Google Scholar] [CrossRef] [PubMed]
Figure 1. A storyboard example of Scenario 4, Grade prediction-based CRS in Table 2.
Figure 2. Overview of the experimental procedure.
Table 1. Three phases of the course-selection decision-making process of university students, adapted from Othman et al. (2019) [1].

Course-Selection Decision-Making Process | Description
Search phase | Students search for information regarding the course through their peers, seniors, and word of mouth
Evaluation phase | Students evaluate which course they will enroll in, considering the costs and benefits based on the information they have gathered in the search phase
Choice/decision phase | Students make choices depending on the information that they have evaluated in the evaluation phase
Table 2. AI-based CRS scenario titles and summaries.

ID | Scenario Title | Scenario Summary
1 | Community-based CRS | AI recommends courses using implicit and explicit feedback from students’ assessments of the courses
2 | Academic interest-based CRS | AI recommends courses whose content most closely matches students’ academic interests
3 | Career interest-based CRS | AI recommends pertinent skills and relevant courses based on students’ career interests
4 | Grade prediction-based CRS | AI recommends top-n courses in which students would perform well by predicting grades in the next semester based on their course enrollment patterns
5 | Serendipity-based CRS | AI recommends serendipitous courses that are novel or unexpected but still relevant to individual students’ interests
Table 3. Summary of the students’ information.

ID | Major | Year of Study | Age | Gender
S1 | Computer Science | 1 | 20 | M
S2 | Biochemical Engineering | 4 | 24 | F
S3 | Electronics Engineering | 2 | 23 | M
S4 | Information Engineering | 4 | 26 | M
S5 | Computer Science | 1 | 20 | F
S6 | Electronics | 2 | 24 | M
S7 | Information Science | 1 | 22 | F
S8 | Information Engineering | 4 | 25 | M
S9 | Electronics | 2 | 24 | M
S10 | Electronics | 1 | 21 | F
S11 | Physics | 1 | 20 | M
S12 | Statistics | 2 | 23 | M
S13 | Russian | 1 | 20 | F
S14 | Education | 3 | 22 | F
S15 | Media and Arts | 2 | 22 | F
S16 | Computer Science | 1 | 22 | M
S17 | Airlines Service | 2 | 22 | F
S18 | Information Security | 3 | 22 | F
S19 | Economics | 3 | 22 | F
S20 | Media Engineering | 3 | 22 | F
S21 | Management Engineering | 2 | 22 | F
S22 | Education | 2 | 22 | F
S23 | Law | 3 | 22 | F
S24 | Nursing | 2 | 21 | F
Table 4. Students’ expectations and responses to the AI-based CRSs in each phase of the course-selection decision-making process.

Course-Selection Decision-Making Process | Students’ Expectations for AI-Based CRS | Summary of Students’ Responses
Search phase | Subject-based course retrieval | Students reported the need for ‘subject-based course retrieval’ to make it easier to find courses they want to search for
Search phase | Social information filtering | Students responded that they would like the CRS to filter out untrustworthy social information related to courses
Search phase | Course-related text visualization | Students wanted the CRS to use natural language processing technology to analyze course-related text information and visualize the results
Evaluation phase | Course suitability analysis | Students hoped to be able to quantitatively compare courses suitable for them in various aspects among the courses selected through the search phase
Evaluation phase | Value proposition | Students wanted to be recommended a course if it was valuable to them for their future careers or to improve their skills
Evaluation phase | Serendipity | Students wanted the CRS to consider unexpected fun (i.e., serendipity) when evaluating courses
Table 5. Students’ preferences for different types of AI-based CRS.

ID | Scenario Title | Preferred Students | Total
1 | Community-based CRS | S1, S2, S3, S5, S7, S11, S13, S16, S22, S24 | 10
2 | Academic interest-based CRS | S1, S7, S8, S12, S14, S15, S22, S23 | 8
3 | Career interest-based CRS | S2, S3, S4, S5, S6, S11, S13, S14, S16, S18, S23 | 11
4 | Grade prediction-based CRS | S1, S7, S10, S17, S18 | 5
5 | Serendipity-based CRS | S6, S9, S10, S16, S19, S20, S21, S24 | 8

