School of Computer Science, Nanjing University of Information Science and Technology, Nanjing 210044, China.
International Journal of Science and Research Archive, 2026, 19(01), 839-847
Article DOI: 10.30574/ijsra.2026.19.1.0782
Received on 06 March 2026; revised on 18 April 2026; accepted on 21 April 2026
The shift to Electronic Health Records (EHR) has overwhelmed intensive care units with a “data blizzard” of over 24,000 data points per bed per hour, causing alarm fatigue (85–99% of alerts are false) and cognitive overload. Advanced machine learning can predict deterioration but typically operates as an opaque black box, eroding clinical trust. This research develops an interpretable framework combining Extreme Gradient Boosting (XGBoost) with SHapley Additive exPlanations (SHAP). Using 18,452 adult patients from MIMIC-IV, the model predicts a composite deterioration endpoint (mortality, unplanned intubation, or vasopressor need) within 24 hours. It achieves strong discrimination (AUROC 0.844, AUPRC 0.672) and full transparency via global and local SHAP explanations. Key drivers include lactate, renal disease, heart rate volatility, and age. This “glass box” paradigm bridges accuracy and clinical utility, reducing preventable errors and alarm fatigue.
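The modeling pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost, fully synthetic data in place of MIMIC-IV, and hypothetical feature names and effect sizes for the four key drivers the abstract names (lactate, renal disease, heart rate volatility, age). It shows how a gradient-boosted classifier is fit to a binary deterioration label and evaluated with the same discrimination metrics the paper reports (AUROC and AUPRC); in practice SHAP values would then be computed on the fitted model for global and local explanations.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score, average_precision_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-ins for the abstract's key predictors (hypothetical scales).
X = np.column_stack([
    rng.normal(1.5, 1.0, n),    # lactate (mmol/L)
    rng.integers(0, 2, n),      # renal disease (0/1 flag)
    rng.normal(10.0, 4.0, n),   # heart-rate volatility (bpm SD)
    rng.normal(65.0, 15.0, n),  # age (years)
])

# Assumed risk model: deterioration odds rise with each predictor.
logit = 0.8 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2] + 0.03 * X[:, 3] - 4.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Gradient-boosted trees (stand-in for XGBoost) on the composite endpoint.
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
p = model.predict_proba(X_te)[:, 1]

# Discrimination metrics matching those reported in the abstract.
auroc = roc_auc_score(y_te, p)
auprc = average_precision_score(y_te, p)
print(f"AUROC={auroc:.3f} AUPRC={auprc:.3f}")
```

On real data, the fitted model would be passed to a tree SHAP explainer to rank features globally and to attribute each patient-level prediction to individual inputs, which is what makes the framework a "glass box" rather than a black box.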
Explainable AI; Machine Learning; EHR; Clinical Prediction; SHAP
Jeruelle Rubenne Bafounguila-Bakaloukila. Interpretable predictions for clinical outcomes using electronic health records (EHR). International Journal of Science and Research Archive, 2026, 19(01), 839-847. Article DOI: https://doi.org/10.30574/ijsra.2026.19.1.0782