Regional Statistics Conference 2026

AI and Advanced Statistical Modelling in Digital Health. Applications to brain tumors (Y-BIS/ISBIS)

Organiser

Ozan Kocadagli

Participants

  • Dr. Nihan Acar Denizli (Presenter/Speaker)
    Decoding the Content of Working Memory with Functional Data Analysis

  • Zeynep Filiz Eren (Presenter/Speaker)
    A Lightweight and Interpretable Machine Learning Approach versus Deep Learning in Medical Image Classification

  • Prof. Dr. Ozan Kocadagli (Presenter/Speaker)
    Detection of EGFR Gene Mutations in Brain Tumors: Leveraging Information Complexity for AI-Based Decision Support Systems

  • Prof. Matilde Bini (Discussant)

Proposal Description

    This session will spotlight cutting-edge research at the intersection of artificial intelligence, advanced statistical modelling, and digital health technologies, with a strong focus on brain tumor imaging and EEG-based neurological monitoring. EEG systems are increasingly validated as affordable, scalable solutions for brain activity analysis, while MRI-based imaging remains central to oncology diagnostics.
    The session features three complementary contributions:

    Decoding the Content of Working Memory with Functional Data Analysis (FDA): FDA treats densely sampled signals such as EEG recordings as smooth functions rather than long vectors of discrete measurements, which makes it well suited to complex, high-dimensional digital health data and has made it increasingly popular for health monitoring. This presentation introduces an application of FDA to real-life EEG data for decoding the content of working memory.
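
    To make the idea concrete, here is a minimal, self-contained sketch (not the presenter's actual method or data): simulated EEG-like trials are projected onto a Fourier basis, and the resulting functional coefficients are used to decode a binary "memory content" label. The libraries (numpy, scikit-learn), variable names, and simulated data are assumptions made purely for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_samples = 200, 256                    # EEG epochs x time points (assumed)
    t = np.linspace(0.0, 1.0, n_samples)              # one-second epoch

    # Simulated two-class data: class 1 carries extra 10 Hz oscillatory power.
    y = rng.integers(0, 2, n_trials)
    X = rng.normal(0.0, 1.0, (n_trials, n_samples))
    X += np.outer(y, 0.8 * np.sin(2 * np.pi * 10 * t))

    # Functional representation: least-squares projection onto a Fourier basis.
    freqs = np.arange(1, 13)                          # basis frequencies in Hz
    basis = [np.ones_like(t)]
    for k in freqs:
        basis += [np.sin(2 * np.pi * k * t), np.cos(2 * np.pi * k * t)]
    design = np.column_stack(basis)                   # (n_samples, n_basis)
    coefs, *_ = np.linalg.lstsq(design, X.T, rcond=None)   # (n_basis, n_trials)

    # Decode the working-memory label from the functional coefficients.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, coefs.T, y, cv=5)
    print("cross-validated decoding accuracy:", round(scores.mean(), 3))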

    Brain Tumor Mutation Classification and Segmentation with Deep Learning: Leveraging architectures such as VGG-16, Inception V3, and ResNet-50, this presentation explores automated tumor detection and segmentation from MRI scans, with an emphasis on interpretability and lightweight deployment for clinical adoption.
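
    As a rough illustration of the kind of transfer-learning setup these architectures enable (not the presenter's actual pipeline), the sketch below adapts a ResNet-50 backbone to a two-class MRI slice classification task. The class count, input size, optimizer, and dummy data are assumptions; in practice one would fine-tune from pretrained weights on real, preprocessed MRI data.

    import torch
    import torch.nn as nn
    from torchvision.models import resnet50

    num_classes = 2                                   # e.g. mutation present / absent (assumed)

    # ResNet-50 backbone; weights=None keeps the sketch offline-runnable.
    # In practice one would load ImageNet-pretrained weights and fine-tune.
    model = resnet50(weights=None)
    model.fc = nn.Linear(model.fc.in_features, num_classes)   # replace the classification head

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # Dummy batch standing in for preprocessed MRI slices (3 channels, 224x224).
    images = torch.randn(4, 3, 224, 224)
    labels = torch.randint(0, num_classes, (4,))

    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print("one training step, loss =", float(loss))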

    Interpretable Machine Learning vs. Deep Learning in Medical Imaging: While deep learning excels at predictive accuracy, its black-box nature raises trust concerns. Interpretable ML approaches offer transparency, supporting explainable decisions in brain imaging and EEG analysis, and systematic reviews highlight the growing role of explainable AI in clinical workflows.
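
    The contrast can be illustrated with a small, purely synthetic example (not drawn from the talk itself): an interpretable linear classifier and a black-box ensemble are compared on invented "image-derived" features, and the linear model's coefficients are read off as a global explanation. Feature names, data, and model choices are assumptions for illustration.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for radiomic / EEG-derived features.
    X, y = make_classification(n_samples=500, n_features=8, n_informative=4, random_state=0)
    feature_names = [f"feature_{i}" for i in range(X.shape[1])]   # hypothetical names

    interpretable = LogisticRegression(max_iter=1000)
    black_box = RandomForestClassifier(n_estimators=200, random_state=0)

    for name, model in [("logistic regression", interpretable), ("random forest", black_box)]:
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(f"{name}: cross-validated accuracy = {acc:.3f}")

    # The linear model's coefficients give a direct, global view of its decision rule.
    interpretable.fit(X, y)
    ranked = sorted(zip(feature_names, interpretable.coef_[0]), key=lambda p: -abs(p[1]))
    for fname, coef in ranked:
        print(f"{fname}: {coef:+.2f}")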

    Together, these talks illustrate how AI and advanced statistical methods are transforming brain health analytics, from EEG signal processing to oncology imaging, while addressing critical needs for accuracy, interpretability, and clinical integration.