The increasing incidence of adverse events in healthcare, such as perioperative acute kidney injury (AKI), highlights the urgent need for disruptive technologies and innovative approaches to improve patient safety and outcomes. Among these, AI-driven analytical algorithms, particularly machine learning (ML), have shown significant potential in risk-based decision-making. Integrating these technologies into clinical practice, however, presents challenges, especially regarding explainability and trust. This dissertation explores, through a systems thinking perspective, how disruptive technologies, specifically explainable ML models, can support risk-based decision-making to reduce the incidence of adverse events such as perioperative AKI.

To address these challenges, the research applies systems thinking principles to develop an explainable ML-based predictive framework for perioperative AKI. The framework introduces a tree-based model that predicts AKI risk with high accuracy while interpreting its predictions through Shapley additive explanation values. These explanations provide both global insights into the factors contributing to AKI risk and local interpretations for individual patient assessments. By emphasizing model transparency, this work seeks to bridge the gap between advanced predictive tools and their practical application in clinical settings.

The dissertation also examines the broader implications of integrating ML models into clinical workflows, focusing on the role of disruptive technologies in transforming healthcare operations. A key contribution is the development and validation of a user-friendly interface, designed with input from domain experts, for deploying the explainable AKI prediction model. The interface combines predictive and explainability features to support clinicians in making informed, data-driven decisions, thereby enhancing both patient safety and clinical workflows. The model's robustness is validated on local data from the UAE, ensuring that the framework is relevant and applicable to real-world clinical practice.

By addressing key gaps in the literature, such as the need for validated predictive models, the impact of AKI on patient outcomes, and the dynamics of human-AI collaboration, this research provides valuable insights into the integration of explainable ML in healthcare. It contributes to the fields of healthcare operations and predictive analytics by offering a comprehensive framework for deploying explainable ML models in perioperative care, and it confirms the importance of systems thinking, transparency, trust, and collaboration between human experts and AI systems in improving patient safety, operational efficiency, and healthcare outcomes.
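The pairing of a tree-based risk model with Shapley additive explanations can be sketched as follows. This is a minimal illustration only, not the dissertation's actual model or data: the features, synthetic outcome, and brute-force Shapley computation (enumerating feature coalitions against a mean background, which is tractable only for a handful of features) are all assumptions made for demonstration; in practice a dedicated tool such as the `shap` library's tree explainer would be used.

```python
from itertools import combinations
from math import factorial

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Hypothetical perioperative-style features (illustrative, not the study's data),
# e.g. age, baseline creatinine, surgery duration, intraoperative hypotension.
X = rng.normal(size=(500, 4))
# Synthetic binary AKI outcome loosely driven by the features.
logits = 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2] + 0.3 * X[:, 3]
y = (logits + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = GradientBoostingClassifier(random_state=0).fit(X, y)
background = X.mean(axis=0)  # reference input used to "switch off" features

def masked_risk(subset, x):
    """Predicted AKI probability with features outside `subset` set to the background mean."""
    z = background.copy()
    idx = list(subset)
    z[idx] = x[idx]
    return model.predict_proba(z.reshape(1, -1))[0, 1]

def shapley_values(x):
    """Exact Shapley value of each feature for one prediction, by enumerating coalitions."""
    n = x.shape[0]
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for S in combinations(others, size):
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                phi[i] += weight * (masked_risk(S + (i,), x) - masked_risk(S, x))
    return phi

patient = X[0]
phi = shapley_values(patient)                      # local explanation for one patient
base_risk = masked_risk((), patient)               # prediction at the background input
full_risk = masked_risk(tuple(range(4)), patient)  # actual prediction for this patient
# Efficiency property of Shapley values: base_risk + phi.sum() equals full_risk.
```

Averaging the magnitudes of such per-patient attributions across a cohort yields the global feature-importance view the abstract describes, while each individual vector `phi` supports the local, patient-level interpretation.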
| Date of Award | 4 Dec 2024 |
|---|---|
| Original language | American English |
| Supervisor | Mecit Simsekler (Supervisor) |
- Operational efficiency
- Healthcare operations
- Predictive modeling
- Data analytics
- Decision support
- Acute kidney injury
- Perioperative care
The Role of Disruptive Technologies and Risk-Based Decision-Making in Healthcare through Systems Thinking View of Analytics
Al Absi, D. (Author). 4 Dec 2024
Student thesis: Doctoral Thesis