Evaluating Assessment Practices in Team-Based Computing Capstone Projects

Hooshangi, Sara, Shakil, Asma, Riddle, Steve, Aydin, Ilknur, Nasir, Nayla, Parupudi, Tejasvi, Rehman, Attiqa, Scott, Michael (ORCID: https://orcid.org/0000-0002-6803-1490), Vahrenhold, Jan, Weerasinghe, Amali and Wu, Xi (2026) Evaluating Assessment Practices in Team-Based Computing Capstone Projects. In: Proceedings of the 2025 Working Group Reports on Innovation and Technology in Computer Science Education. Association for Computing Machinery, New York, NY, USA. ISBN 979-8-4007 (In Press)

Text (Accepted Manuscript): iticse_2025_wg_preprint.pdf - Accepted Version
Available under License Creative Commons Attribution.

Abstract / Summary

Team-based capstone projects are vital in preparing computer science students for real-world work by developing teamwork, communication, and industry-relevant technical skills. Their assessment, however, is challenging, requiring alignment between academic criteria and external stakeholder expectations, fair evaluation of individual contributions, recognition of diverse skills, and clarity on external partners’ involvement in the evaluation process. The high stakes of these projects further demand transparent and equitable assessment methods that are perceived as fair by all involved. Our working group (WG) addresses the challenges of capstone project assessment by examining the perspectives of instructors, students, and external stakeholders to support fair and effective evaluation. Building on insights from our previous WG and a comprehensive review of the literature, we used a mixed-methods approach combining online surveys (quantitative) and in-depth interviews (qualitative) with instructors, students, and external stakeholders. In total, we collected 66 survey responses and conducted 30 interviews across multiple countries and institutions, capturing a diverse range of global perspectives on capstone course assessment. Insights from instructors and students revealed several commonalities, for example in the types of components assessed and in the challenges of identifying and addressing non-contributing group members. Our findings also revealed clear variation between instructor and student perspectives on how contributions are measured and weighted. Instructors were reluctant to rely heavily on peer or self-evaluation due to concerns about reliability, preferring scaffolded assessments and early-warning systems to gather contribution data and moderate team dynamics. They viewed contribution-based grading as positive but resource-intensive.
Students, in contrast, emphasized the need for more transparency, formative feedback, and accurate recognition of individual contributions. They also expressed concerns about the lack of recognition for hidden labor (e.g., project management, team coordination), assessor inconsistency, and a reluctance to critique peers. Instructors treated peer input as supplementary evidence, whereas students perceived it as high-stakes and socially risky. Stakeholder involvement in assessment was generally limited to providing formative feedback and participating in final showcase events. We also identified generative AI as a rapidly evolving challenge, with both students and instructors seeking guidance on acceptable use and exploring opportunities to automate aspects of assessment. Our results offer actionable evidence-based guidance for designing transparent and equitable assessment practices in team-based computing capstones.

Item Type: Book Section
Additional Information: "© {Owner/Author} {2025}. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record will be published in Proceedings of the 2025 Working Group Reports on Innovation and Technology in Computer Science Education, http://dx.doi.org/10.1145/{number}."
ISBN: 979-8-4007
Subjects: Computing & Data Science
Education
Research
Social Sciences
Department: Games Academy
Depositing User: Michael Scott
Date Deposited: 08 Jan 2026 14:26
Last Modified: 08 Jan 2026 14:26
URI: https://repository.falmouth.ac.uk/id/eprint/6306