AI-based gait analysis using deep learning and signal processing for motion understanding.
This thesis explores a video-based approach to identifying Autism Spectrum Disorder (ASD) by analysing how individuals walk, with the aim of developing a more accessible and objective screening method. Instead of relying on specialised clinical equipment, the study uses markerless pose estimation to extract body movements from ordinary walking videos and convert them into structured gait data. These motion patterns are then processed and analysed using both traditional machine learning models and a deep learning approach. A key focus of the work is understanding which aspects of movement, particularly lower-limb joint behaviour, trunk motion, and centre-of-mass dynamics, help distinguish between ASD and typically developing individuals. The results show that meaningful patterns can be captured from video alone, with model performance comparable to that reported in existing studies, even under a stricter evaluation setup using a held-out test set. While the findings are promising, the study also highlights the limitations of small datasets and emphasises the need for further validation on larger and more diverse populations. Overall, the work demonstrates the potential of combining computer vision and machine learning to support early, non-invasive screening tools for ASD.
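To give a flavour of the gait-feature extraction described above, here is a minimal sketch of computing a lower-limb joint angle from 2D keypoints of the kind a markerless pose estimator produces. The keypoint names and coordinates are hypothetical placeholders, not values from the study.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) formed by segments b->a and b->c,
    e.g. the knee angle from hip-knee-ankle keypoints."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical normalised (x, y) keypoints for a single video frame
hip, knee, ankle = (0.50, 0.40), (0.52, 0.60), (0.50, 0.80)
knee_angle = joint_angle(hip, knee, ankle)
```

Tracking such angles frame by frame yields the time-series gait signals that can then be fed to machine learning or deep learning models.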
A production-grade Retrieval-Augmented Generation (RAG) chatbot with multi-query retrieval, evidence...
An evidence-based AI system using RAG to generate ATS-optimized resumes from GitHub and research dat...
Time series forecasting of Bitcoin prices comparing multiple neural network architectures including ...