Specialising in causal inference, ML explainability, anomaly detection, and AI-powered analytics pipelines — turning raw operational data into decisions that actually get made.
Forensic ML debugging console — investigates why a model got a prediction wrong using SHAP attribution, nearest-neighbour forensics, confidence decomposition, and an AI-written failure report across credit risk, medical, and equipment datasets.
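The nearest-neighbour forensics step can be sketched in plain NumPy (function name and toy data are mine, not the tool's): for a suspect prediction, look at the query point's closest training rows and check whether their labels agree with what the model predicted.

```python
import numpy as np

def neighbour_forensics(X_train, y_train, x_query, y_pred, k=3):
    """Inspect the k nearest training neighbours of a suspect prediction.
    Low agreement between neighbour labels and the model's prediction
    suggests the point sits in a conflicting region of feature space
    (label noise or decision-boundary overlap)."""
    dists = np.linalg.norm(X_train - x_query, axis=1)  # Euclidean distance to each training row
    idx = np.argsort(dists)[:k]                        # indices of the k closest rows
    labels = y_train[idx]
    agreement = float(np.mean(labels == y_pred))       # fraction of neighbours agreeing with the model
    return idx, labels, agreement

# Toy example: the model predicted class 1, but the point lies inside the class-0 cluster.
X = np.array([[0.0, 0.0], [0.1, 0.1], [0.2, 0.0], [5.0, 5.0], [5.1, 4.9]])
y = np.array([0, 0, 0, 1, 1])
idx, labels, agree = neighbour_forensics(X, y, np.array([0.05, 0.05]), y_pred=1)
```

Zero agreement here is exactly the kind of evidence the failure report would cite: the model's answer contradicts every nearby training example.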
Production-grade anomaly detection across 5 correlated IoT signals using triple-layer detection (Z-score / IQR / CUSUM) and Granger causality for root-cause ranking. Features an evidence timeline and an AI-written incident report.
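The triple-layer idea can be sketched in a few lines (thresholds here are illustrative, not the project's tuned values). Each layer catches what the others miss: Z-scores flag large spikes, Tukey/IQR fences are robust to heavy tails, and CUSUM accumulates small shifts that point-wise tests never trip on.

```python
import numpy as np

def zscore_flags(x, thresh=3.0):
    # Layer 1: global z-score, flags points far from the series mean.
    mu, sd = x.mean(), x.std()
    return np.abs(x - mu) / sd > thresh

def iqr_flags(x, k=1.5):
    # Layer 2: Tukey fences, robust to heavy tails and skew.
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def cusum_flags(x, drift=0.5, thresh=5.0):
    # Layer 3: one-sided CUSUM, accumulates sustained upward shifts
    # (slow drifts) that the point-wise layers miss.
    mu, sd = x.mean(), x.std()
    z = (x - mu) / sd
    s, flags = 0.0, np.zeros(len(x), dtype=bool)
    for i, zi in enumerate(z):
        s = max(0.0, s + zi - drift)
        flags[i] = s > thresh
    return flags

# A single spike trips the point-wise layers; a sustained level shift trips CUSUM only.
x_spike = np.concatenate([np.zeros(50), [10.0], np.zeros(49)])
x_drift = np.concatenate([np.zeros(50), np.full(50, 2.0)])
```

In the real pipeline a point flagged by multiple layers earns a higher place on the evidence timeline than one flagged by a single layer.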
Bayesian structural time series tool that answers "did this business decision actually work?" — models a statistical counterfactual and measures the true causal lift of interventions with 95% credible intervals across 4 real business scenarios.
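The counterfactual logic, stripped to its core: model the pre-intervention period, project it forward as "what would have happened anyway," and report the posterior of the gap. The sketch below replaces the full structural time series with a constant-mean Gaussian model, so it shows the idea only, not the project's actual method.

```python
import numpy as np

def causal_lift(series, t0, n_draws=10_000, seed=0):
    """Toy stand-in for a BSTS counterfactual. Pre-period is modelled as
    constant mean + Gaussian noise; the posterior of that mean (flat prior)
    becomes the counterfactual, and the lift is the observed post-period
    mean minus counterfactual draws."""
    rng = np.random.default_rng(seed)
    pre, post = series[:t0], series[t0:]
    sigma = pre.std(ddof=1)
    # Posterior of the pre-period mean: Normal(mean, sigma / sqrt(n)).
    mu_draws = rng.normal(pre.mean(), sigma / np.sqrt(len(pre)), n_draws)
    lift_draws = post.mean() - mu_draws
    lo, hi = np.percentile(lift_draws, [2.5, 97.5])  # 95% credible interval
    return lift_draws.mean(), (lo, hi)

# Synthetic check: the intervention at t0=100 adds ~3 units.
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(10, 1, 100), rng.normal(13, 1, 50)])
lift, (lo, hi) = causal_lift(y, t0=100)
```

The real model also carries trend and seasonality components forward, which is what lets it answer the question for business series that were never flat to begin with.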
Natural language interface over a multi-table relational database using live schema injection and LLM inference. Validates SQL via DuckDB EXPLAIN before execution and returns results with an AI business explanation.
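The validate-before-execute pattern is the safety core of that pipeline. The project does it with DuckDB's EXPLAIN; the sketch below shows the same pattern with stdlib sqlite3 (chosen here only so the example is self-contained): EXPLAIN forces the engine to plan the query, catching syntax errors and unknown tables or columns, without touching any rows.

```python
import sqlite3

def validate_then_run(conn, sql):
    """Validate LLM-generated SQL by planning it first, then execute only
    if planning succeeds. On failure, return the engine's error message
    so it can be fed back to the LLM for a retry."""
    try:
        conn.execute("EXPLAIN " + sql)            # plan only, no execution
    except sqlite3.Error as e:
        return None, f"rejected: {e}"
    return conn.execute(sql).fetchall(), None     # plan succeeded, safe to run

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 9.5), (2, 20.0)")

rows, err = validate_then_run(conn, "SELECT SUM(amount) FROM orders")
bad, err2 = validate_then_run(conn, "SELECT amount FROM no_such_table")
```

Feeding the rejection message back into the prompt is what turns a one-shot text-to-SQL call into a self-correcting loop.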
Auto-charting pipeline that profiles any CSV, infers column types, routes each variable to the right Plotly chart, and uses a local LLM to generate plain-English business narratives per chart — zero cloud API required.
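The profiling-and-routing step can be sketched with the stdlib alone (function name, thresholds, and chart mapping are illustrative, not the pipeline's actual ones): classify each column as numeric, categorical, or free text, then route it to a chart family the way the pipeline routes variables to Plotly figures.

```python
import csv, io

def infer_and_route(csv_text, cat_threshold=10):
    """Classify each CSV column and pick a chart family for it.
    Heuristics: all values parse as float -> numeric; few distinct
    values -> categorical; otherwise free text."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    routing = {}
    for col in rows[0]:
        values = [r[col] for r in rows if r[col] != ""]   # ignore blanks
        try:
            [float(v) for v in values]
            kind = "numeric"
        except ValueError:
            kind = "categorical" if len(set(values)) <= cat_threshold else "text"
        chart = {"numeric": "histogram", "categorical": "bar", "text": "word_count"}[kind]
        routing[col] = (kind, chart)
    return routing

sample = "region,revenue,notes\nnorth,120.5,ok\nsouth,98.0,late shipment\nnorth,101.2,ok\n"
routing = infer_and_route(sample)
```

Each routed chart then gets its own LLM prompt containing the column's summary statistics, which is where the plain-English narrative comes from.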
Open to full-time roles in Data Science, Data Analysis, and BI. Reach out via any of the channels below.