AI tool helps make trustworthy, explainable scheduling decisions

William Yeoh and Stylianos Vasileiou developed TRACE-cs, a hybrid system that combines symbolic reasoning with large language models to solve course scheduling problems

Shawn Ballard 
TRACE-cs tool can help students reliably optimize course scheduling and explain why it makes the recommendations it does. (Image: iStock)

As artificial intelligence (AI) becomes increasingly integrated into our daily lives, the next significant challenges involve ensuring effective collaboration between AI systems and human users and fostering trust in this technology, say computer scientists William Yeoh and Stylianos Vasileiou at the McKelvey School of Engineering at Washington University in St. Louis.

“Going forward, it’s going to be important for humans to have an appropriate level of trust in what AI can do for them,” Yeoh said. “One way to achieve that is to have the system explain why it’s doing what it’s doing in an interactive and understandable way.”

To address these challenges, Yeoh, associate professor of computer science & engineering, and Vasileiou, a graduate student in Yeoh’s lab, developed TRACE-cs, a novel hybrid tool that tackles the concrete problem of course scheduling. The tool efficiently generates accurate explanations in a quickly digestible format, an advance in the field. Vasileiou presented TRACE-cs on Feb. 28 at the 2025 AAAI Conference on Artificial Intelligence.

TRACE-cs combines symbolic reasoning with the natural language capabilities of large language models (LLMs) to provide trustworthy, easily understandable assistance with complex decision-making tasks. TRACE-cs ensures accuracy by incorporating user verification and allowing users to ask follow-up questions early in the process, and it ensures usability by keeping its explanations of recommended schedules concise.

TRACE-cs builds on an explanation algorithm Vasileiou developed to respond to contrastive queries in scheduling problems. It can answer questions like, “Why was course X selected over course Y?” based on encoded scheduling constraints, while an LLM processes interactive user queries and refines the output into coherent natural language explanations. This hybrid approach ensures users receive clear, concise and reliable explanations while maintaining the robustness and correctness of the AI system.
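To make the idea of a contrastive query concrete, here is a minimal sketch of how a symbolic component might answer “Why was course X selected over course Y?” by swapping the alternative into the schedule and reporting the constraints that would break. This is an illustrative toy, not the authors’ actual TRACE-cs implementation; the catalog, constraint (no two courses in the same time slot) and all names are hypothetical.

```python
# Hypothetical toy catalog: each course has a meeting slot.
# Constraint: no two chosen courses may share a time slot.
CATALOG = {
    "CSE511": {"slot": "Mon 10:00"},
    "CSE514": {"slot": "Mon 10:00"},  # conflicts with CSE511
    "CSE547": {"slot": "Wed 14:00"},
}

def violated_constraints(schedule):
    """Return human-readable descriptions of constraints the schedule breaks."""
    reasons, slots = [], {}
    for course in schedule:
        slot = CATALOG[course]["slot"]
        if slot in slots:
            reasons.append(f"{course} and {slots[slot]} both meet at {slot}")
        slots[slot] = course
    return reasons

def contrastive_explanation(schedule, chosen, alternative):
    """Answer 'Why was `chosen` selected over `alternative`?' by testing
    the counterfactual schedule where the alternative replaces it."""
    counterfactual = [alternative if c == chosen else c for c in schedule]
    reasons = violated_constraints(counterfactual)
    if reasons:
        return f"Replacing {chosen} with {alternative} violates: " + "; ".join(reasons)
    return f"{alternative} is also feasible; {chosen} was preferred for other criteria."

print(contrastive_explanation(["CSE511", "CSE547"], "CSE547", "CSE514"))
# Reports the time-slot clash that makes the swap infeasible.
```

In the full system, a symbolic output like this would be handed to the LLM, which rephrases it as a fluent, conversational explanation and handles the user’s follow-up questions.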

Vasileiou and Yeoh tested TRACE-cs on real-world academic course scheduling for WashU’s Department of Computer Science & Engineering. The system outperformed traditional LLM-only methods, achieving 100% accuracy in explanation correctness, compared with 44% and 49% for the LLM-only methods, and it delivered explanations concisely, averaging 46 words per explanation, compared with 113.3 and 59 words for those methods.

“Integrating AI systems with human users was a big motivation for this work,” Vasileiou said. “Course scheduling is just one example where AI systems need to be able to explain what they’re doing to assist users in making optimal decisions. These tools aren’t replacing human effort; they’re enabling us to solve more complex problems than a human or AI could solve alone.” 

The research underscores the potential of combining symbolic reasoning with LLMs to develop AI systems that are accurate and comprehensible to human users. TRACE-cs sets the stage for future advancements in explainable AI, offering a model that can be applied to class scheduling at other institutions and potentially extended to numerous decision-making tasks in fields such as health care and logistics.


Vasileiou S, Yeoh W. TRACE-cs: A synergistic approach to explainable course scheduling using LLMs and logic. AAAI Conference on Artificial Intelligence, Philadelphia, Feb. 25-March 4, 2025. arXiv: https://arxiv.org/abs/2409.03671

This work was supported by a National Science Foundation grant for collaborative research on end-to-end learning of fair and explainable schedules for court systems (2232055).

