
Generative AI and DCU Business School


 

Institution: Dublin City University

Discipline: Business

Authors: Robert Gillanders, Shadi Karazi, Silvia Rocchetta, Gary Sinclair, Suzanne Stone

GenAI tool(s) used: ChatGPT

 

Situation / Context

DCU Business School encompasses a wide variety of disciplines, including economics, management, marketing, finance, accounting, entrepreneurship, human resources, data analytics and more. We asked students from three different DCU Business School modules to develop a context-specific question for ChatGPT and to review the essay work it produced.

Two modules were from the economics discipline with relatively small numbers of students (fewer than 30), and one was a large marketing class with 80 students. The economics modules were final-year advanced classes, while the marketing module was a second-year course.

Task / Goal

The emerging literature argues that integrating Generative Artificial Intelligence (GenAI) into a business school curriculum has the potential to enhance learning by enabling students to pose questions, engage in reflective thinking, challenge existing knowledge, and foster discussions about answers. This integration can lead to increased student engagement and interaction with the subject matter.

Artificial Intelligence (AI) poses threats to the integrity of many commonly used forms of continuous assessment. Alternative assessments such as oral examinations and reflexive work, which are frequently flagged as solutions, are simply not feasible at the scale required for a large number of modules with high student-staff ratios. Hence, we needed to think about how to use the technology in ways that suited disciplinary needs and large class sizes.

We were trying to “build in” the new technology and design an authentic assessment that would engage students and meet the learning goals. DCU Business School has a strong commitment to developing critical thinking and creativity skills, and we also hoped that the assessment would build on and help develop these skills further by requiring students to think carefully about the course material and arrive at questions that pushed beyond the boundaries of the set curriculum.

Actions / Implementation

Students were tasked with posing questions to ChatGPT and evaluating the output from the artificial intelligence software. This varied somewhat by discipline. The economics students were told to set a question that ChatGPT would answer. Examples of the prompts were given such as “write a long essay on ___________. Include 10 in-text academic references. List the references at the end.” The question was submitted via our online learning platform with a 500-word explanation of why the question was important and worth asking. This explanation was to draw on their knowledge of economics in general and of the module in question in particular.

Half of the grade for the assignment was for an interesting and well-motivated question. Examples were provided at the start of the module and throughout the lectures. The rest of the grade was awarded for an assessment of the AI's output. The output was generated by the professor taking the class and sent to students, to ensure that there were no privacy concerns. Students were told to assess the output on grounds of accuracy, relevance, referencing, and depth of argument. Written comments were to be added to the Word file and submitted online. Examples of output were provided at the start of the course, with annotated critiques of the output and an overall assessment and grade for the essay.

Students in the second-year marketing module were instructed to use ChatGPT to develop an introductory understanding of the concept of social marketing. They were asked to write a one-page reflection on their use of ChatGPT. This assessment was set relatively quickly after ChatGPT was released; consequently, for most students it was their first experience of using the tool. Following submission, students were asked to share their experiences in class. Emphasis was placed on the weaknesses of the tool, when it was appropriate to use it for assessment, and how best to develop knowledge of a particular topic. Students were provided with guidance on how to use it in future assessments for the module.

Outcomes

We collected both quantitative and qualitative data from participants through an anonymous online survey designed using the Google Forms survey tool. Ethics approval was granted by the DCU Research Ethics Committee. The survey comprised quantitative Likert-scale questions and open questions designed to gather qualitative data. Participants were invited to complete the survey by email through a gatekeeper (Teaching Enhancement Unit) to avoid any pressure on students to participate, in line with research ethics guidelines. The survey was designed to minimise the time commitment for participants, with an approximate completion time of 15 minutes. Twenty-nine students completed the survey.

Eighty-six percent of survey respondents agreed that the assessment helped them understand how ChatGPT works, but opinions were mixed on whether it deepened their understanding of course material (52%) and developed their critical thinking skills (59%). Other findings were that:

  • 86% liked that ChatGPT was integrated into the module;
  • 86% agreed it helped them understand how ChatGPT works;
  • 62% said that they were more inclined to use ChatGPT in other modules after this assessment;
  • 83% agreed it was an engaging mode of assessment;
  • 52% agreed it deepened their understanding of course material;
  • 45% agreed it developed essay writing skills;
  • 59% agreed it developed critical thinking skills.

The qualitative responses were also mixed, reflecting a desire to learn more about the technology but also some anxiety about when and how GenAI could be used:

  • “It was helpful in that I developed a better understanding of what lecturers look for when marking assignments.”
  • “I think getting the opportunity to grade a paper may really improve our own essay writing skills as we learn what and what not to do.”
  • “It was fun for it to be integrated with the module, because its proactively moving with the technology era we are in, and I do believe it is important that all students know about the AI, so that nobody has an advantage.”
  • “Interesting to see where technology is but important to note that it isn’t perfect.”
  • “I liked that I could get help from an innovative AI tool to write my essay better and to gather useful articles that it suggested. What I don’t like is the morals behind using this service, in particular with students. Students will take advantage of using ChatGPT and this is unfair to other students who are not using it.”
  • “I disliked that [I didn’t know to] what degree I could use it, I didn’t want to overuse it in fear of being told I had used it too much and would be penalised.”

Reflections

These results are encouraging, as it is important that students learn about the limitations of GenAI, even if more work is needed to ensure that the technology deepens understanding and develops critical thinking skills.

What was clear from this research is that students need to develop better prompt engineering skills. Cotton et al. (2024) suggest that such questions need to be personalised, and Lim et al. (2023) place emphasis on ensuring students engage in fact-checking and verification of information produced by GenAI. We learned that toolkits need to be developed to guide students on the use of prompts and the follow-up fact-checking processes required to use such tools appropriately and effectively.

In terms of the quality of the work in the economics modules, many of the questions were of extremely poor quality. Many were very poorly written, while others were not grounded in theory or the class material in any meaningful way. A small but appreciable number were simply questions that had already been covered in class.

The evaluation of the output, however, was generally very detailed and went beyond noting the false references. The conclusion was that students were good at evaluating the quality of an argument but needed additional guidance in formulating and motivating a novel question that pushed beyond what they had been taught.

In subsequent years, the assessment was used again in economics modules but was rebalanced so that the weight for the question was two-thirds of the grade and the evaluation one-third. More time in class was given to discussing what constituted a good question and students were strongly encouraged to discuss their plans with professors during office hours. This led to a significant increase in the quality of the questions.

Overall, guidance cannot be left to individual module coordinators. The approach to generative AI within the classroom has to be formalised at an institutional level to avoid confusion amongst students and faculty. Clear policies and toolkits need to be formulated.

 

Further Reading

Cotton, D. R., Cotton, P. A., & Shipway, J. R. (2024). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International, 61(2), 228–239.

Lim, W. M., Gunasekara, A., Pallant, J. L., Pallant, J. I., & Pechenkina, E. (2023). Generative AI and the future of education: Ragnarök or reformation? A paradoxical perspective from management educators. The International Journal of Management Education, 21(2), 100790.

 

Author Biographies

Robert Gillanders is a Professor of Economics at Dublin City University Business School and Co-Director of the DCU Anti-corruption Research Centre (DCU ARC). His main research focus is the causes and consequences of corruption. He is also interested in how institutions influence enterprises and the business environment. He holds a senior fellowship from Advance HE.

Shadi Karazi is a Senior Learning Technologist at Dublin City University Business School, specializing in the strategic development and enhancement of technology-enabled learning (TEL) to advance the school’s mission. With a strong focus on emerging learning technologies, TEL innovation, and educational development, he brings a research-driven approach to transforming digital education. He holds a PhD in Engineering, an MSc in eLearning, and is a Fellow of Advance HE.

Silvia Rocchetta is an Assistant Professor of Economics at DCU Business School. Her current projects mainly focus on the regional knowledge space evolution, adaptive resilience, technological change, industry lifecycle, innovation, and inequality.

Gary Sinclair is an Associate Professor of Marketing and the Associate Dean of Internationalisation at Dublin City University Business School. His research mostly focuses on consumer behaviour, specifically online harms in sport and consumer culture.

Dr Suzanne Stone is Universal Design for Learning Lead at the University of Limerick. Her research interests include: the scholarship of teaching and learning; universal design for learning; virtual learning environments; digital wellbeing; and digital assessment.

License


Using GenAI in Teaching, Learning and Assessment in Irish Universities Copyright © 2025 by Dr Ana Elena Schalk Quintanar (Editor) and Dr Pauline Rooney (Editor) is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.