## Hybrid AI for Maternal Health: A Clinician-Validated Approach

Machine learning shows promise for maternal health risk prediction, but clinical adoption, especially in resource-constrained settings, is held back by a lack of explainability and trust. A recent study presented a hybrid explainable AI (XAI) framework that combines ante-hoc fuzzy logic with post-hoc SHAP explanations, validated through systematic clinician feedback in Bangladesh.

## Model Details and Results

The researchers developed a fuzzy-XGBoost model on 1,014 maternal health records, achieving 88.67% accuracy (ROC-AUC: 0.9703). A validation study with 14 healthcare professionals revealed a strong preference for hybrid explanations (71.4% across three clinical cases), and 54.8% expressed trust in the system for clinical use. SHAP analysis identified healthcare access as the primary predictor. Clinicians valued the integrated clinical parameters but identified critical gaps: missing obstetric history, missing gestational age, and connectivity barriers. This work demonstrates that combining interpretable fuzzy rules with feature-importance explanations enhances both utility and trust, offering practical guidance for XAI deployment in maternal healthcare.
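To make the "hybrid" idea concrete, here is a minimal sketch of such a pipeline: raw inputs are augmented with interpretable fuzzy membership degrees (the ante-hoc component) before training an XGBoost classifier, and SHAP then attributes predictions to both raw and fuzzy features (the post-hoc component). The feature names, fuzzy breakpoints, and synthetic data below are illustrative assumptions, not the study's actual rule base or dataset; only the record count (1,014) comes from the summary above.

```python
"""Sketch of a fuzzy-XGBoost pipeline with SHAP explanations.

The fuzzy sets, feature names, and labels here are hypothetical
placeholders chosen for illustration.
"""
import numpy as np
import pandas as pd
import shap
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier


def tri_membership(x, a, b, c):
    """Triangular fuzzy membership: rises from a to b, falls from b to c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)


# Synthetic stand-in data; n matches the record count reported in the study.
rng = np.random.default_rng(0)
n = 1014
X = pd.DataFrame({
    "systolic_bp": rng.normal(120, 15, n),
    "blood_sugar": rng.normal(8, 2, n),
    "age": rng.integers(15, 50, n).astype(float),
})
# Synthetic binary risk labels, for demonstration only.
y = ((X["systolic_bp"] > 135) | (X["blood_sugar"] > 11)).astype(int)

# Ante-hoc step: add graded, clinically readable fuzzy features
# (e.g., degree to which blood pressure counts as "high").
X["bp_high"] = tri_membership(X["systolic_bp"], 120, 140, 180)
X["bs_high"] = tri_membership(X["blood_sugar"], 7, 11, 19)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
model.fit(X_tr, y_tr)

# Post-hoc step: SHAP attributes each prediction to raw and fuzzy
# features, supplying the second half of the hybrid explanation.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
mean_abs = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(mean_abs.sort_values(ascending=False))
```

The design intent is that the fuzzy memberships carry clinician-legible concepts ("high blood pressure" as a matter of degree), while SHAP quantifies how much each concept drove a given prediction; the two views together are what the surveyed clinicians preferred over either alone.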