Fusion or Confusion? Co-Creating Assessments with GenAI and Students

Gerhard Kristandl

As an educator with nearly two decades of experience, I have witnessed numerous changes in higher education. The recent integration of Generative AI (GenAI) into our learning and teaching practices stands out as particularly transformative. In this post, I will share my experiences from implementing GenAI-assisted collaborative assessment creation in my two undergraduate modules during the 2023/24 academic year. Fusion or Confusion? Let’s find out!

Rethinking My Assessments: A Scholarly Approach

The idea to co-create assessments with GenAI tools emerged from a critical evaluation of my 2022/23 modules, accompanied by the question of whether I was still ‘assessing what I meant to assess’ (Villarroel et al., 2019). This reassessment revealed that some parts of my existing assessments had a low barrier to AI misuse; chatbot responses were not ‘perfect’ but sufficient to pass. This realization triggered a deeper examination of thought leadership on assessment design and AI integration in education, focusing on authentic assessment, co-creation, and assessing both process and final artifact (Ellis & Lodge, 2024).

Authentic Assessment

Authentic assessment aims to evaluate students’ abilities in ‘real-world’ contexts. It requires students to use the same competencies, or combinations of knowledge and skills, that they will need to apply in their professional lives (Villarroel et al., 2017). In the context of AI-assisted learning, it can involve tasks where students critically analyze AI-generated content, mirroring the way they might interact with AI tools in their future careers (Zou et al., 2024).

For example, in my redesigned assessments, students described a sales process in a local business, then tasked the AI to do the same, and finally evaluated the AI-generated responses to improve their initial description. This approach tests their subject knowledge and critical engagement with AI tools, a competency that is becoming essential in many professions.

Co-creation

Co-creation in education has gained significant attention recently. Bovill and Woolmer (2019) argue that co-creation can enhance student engagement and lead to more meaningful learning experiences. Involving students in the design process can create a sense of ownership and responsibility for their learning.

In my modules, I implemented this by involving students in designing parts of their own assessment tasks. For instance, students selected a local small business they frequented regularly, described its sales process, and used AI tools to generate a business perspective, which they then critically evaluated. This process not only involved students in co-creating their learning materials but also developed their critical thinking skills in relation to AI-generated content.

Assessing Both Process and Final Artifact

The concept of assessing both the process and the final outcome, or “artifact,” is well-supported in educational literature and not a ‘new’ idea in response to the rise of GenAI. Bloxham and Boyd (2007) highlight the importance of considering the learning journey alongside the end product as well as the opportunities to provide feedback on both process and task level (Hattie & Timperley, 2007). This approach aligns with the idea of “assessment as learning,” where the assessment process itself becomes a learning opportunity.

In my modules, I implemented this concept by making explicit the steps students needed to take from task instructions to the final artifact. This approach helped students appreciate how the output of one step serves as the input to the next, while allowing me to evaluate where AI tools could facilitate learning. For example, in a costing methods case study, students engaged in AI interaction, critical discussion, project planning, and final outcome presentation, allowing comprehensive assessment of their skills and understanding. By enabling students to learn with AI and about AI concurrently, this approach helped showcase that AI-generated responses should never be accepted or used without human intervention.

This raises the barrier to AI misuse significantly, as I was able to gain insight into the process and not just see the final result. It further allowed me to support students in their learning journey in a much more targeted fashion, enabling me to provide ongoing feedback, and address issues before students completed the assessment. Overall, this approach provided a more comprehensive view of each student’s learning and skill development throughout.

By integrating these three concepts – authentic assessment, co-creation, and process-artifact assessment – into my redesigned modules, I aimed to create a more engaging, relevant, and effective learning experience for my students. This approach not only addresses many of the challenges posed by AI in higher education but also leverages its potential to enhance learning and learning outcomes.

A Before-and-After Look

Let’s summarize how a specific assessment on the ‘5 Generic Elements of an Accounting Information System’ (AIS) has changed thus:

Aspect | Before (2022/23) | After (2023/24)
Task Focus | Fact-based questions about AIS elements | AI-enhanced reflective case
Student Role | Passive recipient of predetermined questions | Active co-creation of assessment and elements
Use of AI | Not explicitly addressed, with low barrier to misuse for ‘sufficient’ responses | Integrated as a tool for initial responses and critical evaluation thereof
Critical Thinking | Limited | Enhanced through AI-generated content analysis
Learning Outcomes | Understanding AIS elements | Understanding AIS elements & AI capabilities and limitations

This new approach:

  • Makes the process explicit, focusing on both the outcome and how students reach it.
  • Gives students agency and relevance, letting them engage with new technology meaningfully.
  • Boosts critical thinking by requiring students to evaluate and improve AI-generated content.
  • Provides a chance to learn both with and about AI.

How I Did It: Two Module Examples

I implemented this approach in two undergraduate modules: Accounting Information Systems and SAP ERP. I have briefly introduced my process earlier, but here it is step-by-step:

  1. I started with learning outcomes and identified the final artifact.
  2. I reflected on and outlined the steps towards the final artifact.
  3. I identified opportunities for meaningful (and ethical) AI usage within the assessment process.
  4. I provided clear instructions and robust support for students.

One thing is important to note here – the AI usage needs to be ‘meaningful’; its inclusion should not give the impression of having been shoehorned in for the sake of appearing ‘trendy’, but rather show that it is the right (or at least most adequate) tool to attempt a task or problem. At the same time, it can highlight to students when AI might not be useful, or might even cause problems due to bias, hallucinations, or uncritical acceptance of generated responses.

What Worked Well & Challenges I Faced

My redesign had several positive outcomes:

  • Increased engagement: Students were initially sceptical, fearing trouble for using AI tools. After guided exposure, they found the process interesting and career-relevant.
  • Better understanding: Students said making the process explicit helped them grasp business concepts better. One student stated, “I had much more confidence attempting the tasks that way.”
  • Skill development: Critical thinking, digital literacy, and AI interaction skills improved.

These outcomes, while anecdotal, are encouraging and map a way forward for assessment redesign through the GenAI-lens. A highly engaged student shared, “I learned a lot because I didn’t have to guess what I needed to do, but also because I had a chance to learn how AI can help me complete my tasks, and I was in control of my work.”

However, I also encountered some hurdles:

  • Misconceptions about ‘digital natives’: Only a handful of students had tried GenAI tools before 2023/24, chiming with Chan & Lee (2023).
  • Students needed more support and guidance when using AI in their work.
  • Designing assessments required more upfront work.
  • Marking individualized outcomes (process steps and final artifact) took more effort.

Concluding Thoughts

It has become clear in 2024 that GenAI will play a bigger role in education moving forward. My experiments and experiences in co-creating assessments show that when used thoughtfully, these tools can lower the risk of misuse, improve student learning, increase engagement, and prepare students for a future where working with AI is common. Returning to the title of this blog post, ‘fusion’ between the creativity of educators and students, facilitated by GenAI, can lead to meaningful learning opportunities – and avoid the ‘confusion’.

Personally, I am on an ongoing exploration of how best to use GenAI in my teaching practices, but the potential benefits make it worth the effort. As educators, we need to expose ourselves to this technology without fear, but with caution, so that we can guide its integration into our practices, making sure technology enhances rather than replaces human creativity and critical thinking.

References

Bloxham, S. & Boyd, P. (2007). Developing Effective Assessment in Higher Education: A Practical Guide. McGraw-Hill Education (UK).

Bovill, C. & Woolmer, C. (2019). How conceptualisations of curriculum in higher education influence student-staff co-creation in and of the curriculum. Higher Education, 78(3), 407-422. https://doi.org/10.1007/s10734-018-0349-8

Chan, C.K.Y. & Lee, K.K.W. (2023). The AI generation gap: Are Gen Z students more interested in adopting generative AI such as ChatGPT in teaching and learning than their Gen X and Millennial Generation teachers? Smart Learning Environments, 10, 60. https://doi.org/10.1186/s40561-023-00269-3

Ellis, C. & Lodge, J.M. (2024). Stop looking for evidence of cheating with AI and start looking for evidence of learning. LinkedIn, 8 July. Available at: https://www.linkedin.com/pulse/stop-looking-evidence-cheating-ai-start-learning-cath-ellis-h0zzc/?trackingId=rjd1iXM9SU%2Bh4wTSGqszHw%3D%3D.

Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81-112. https://doi.org/10.3102/003465430298487

Villarroel, V., Bloxham, S., Bruna, D., Bruna, C., & Herrera-Seda, C. (2017). Authentic assessment: creating a blueprint for course design. Assessment & Evaluation in Higher Education, 43(5), 840-854. https://doi.org/10.1080/02602938.2017.1412396

Villarroel, V., Boud, D., Bloxham, S., Bruna, D., & Bruna, C. (2019). Using principles of authentic assessment to redesign written examinations and tests. Innovations in Education and Teaching International, 57(1), 38-49. https://doi.org/10.1080/14703297.2018.1564882

Zou, X., Su, P., Li, L., & Fu, P. (2024). AI-generated content tools and students’ critical thinking: Insights from a Chinese university. IFLA Journal, 50(2), 228-241. https://doi.org/10.1177/03400352231214963


Dr Gerhard Kristandl is a National Teaching Fellow and an award-winning Associate Professor at Greenwich Business School (GBS), renowned for integrating technology in higher education. His 18-year career across the UK, Canada, and Austria has been marked by advances in blended, hybrid, and hy-flex teaching methods, and a focus on Generative AI in higher education. His role as GBS Learning Technologist until August 2023 led to several key technology-based educational initiatives.

A key member of the University of Greenwich AI Task Force and the Online Assessment Workgroup, Dr Kristandl contributed significantly to these crucial areas. As an Empowered Presenter and the university lead for Mentimeter since January 2021, he brings unique insights into interactive learning. He was awarded a National Teaching Fellowship in 2024 (the first from GBS), Senior Fellowship in Higher Education in 2018, and has been influential in HE Learning and Teaching in close collaboration with ALE since November 2023.

Dr Kristandl’s research spans learning technology and generative AI, extending from his work in intellectual capital. His background in management consulting with Accenture Germany enriches his academic pursuits, which began in 2010.

X handle: @DrKristandl