Publications by T. Moudiki
Conformalized adaptive (online/streaming) learning using learningmachine in Python and R
The model presented here is a frequentist (conformalized) version of the Bayesian one presented last week in #152. The model is implemented in learningmachine, both in Python and R. Model explanations are given as sensitivity analyses. 0 – install packages. For R: utils::install.packages(c("rmarkdown", "reticulate", "remotes")) Installing packag...
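The conformalization idea behind this post can be illustrated with a minimal split conformal regression sketch: fit on a training split, compute absolute residuals on a calibration split, and use their empirical quantile as an interval half-width. This is only a toy illustration (with a deliberately trivial "model"), not learningmachine's actual API.

```python
import random

# Split conformal prediction sketch (illustrative; not learningmachine's API).
random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(300)]
train, calib, test = data[:150], data[150:250], data[250:]

# "Model": predict the training mean (kept deliberately simple).
point = sum(train) / len(train)

# Nonconformity scores = absolute calibration residuals.
scores = sorted(abs(v - point) for v in calib)

# Conformal quantile for a (1 - alpha) prediction interval.
alpha = 0.1
rank = min(len(scores) - 1, int((1 - alpha) * (len(scores) + 1)))
q = scores[rank]
lower, upper = point - q, point + q

coverage = sum(lower <= v <= upper for v in test) / len(test)
print(round(coverage, 2))  # empirical coverage should be near 0.90
```

The same recipe applies with any point predictor in place of the training mean, which is what makes the frequentist (conformal) version model-agnostic.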
Auto XGBoost, Auto LightGBM, Auto CatBoost, Auto GradientBoosting
I’ve always wanted a minimal unified interface to XGBoost, CatBoost, LightGBM and sklearn's GradientBoosting, without worrying about the libraries' different parameter name aliases. So, I had a lot of fun creating unifiedbooster (which is not part of Techtonique, but a personal Swiss Army knife tool, under the MIT License). In unifiedbooster, there ...
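The parameter-alias problem the post mentions can be sketched with a small translation table: unified names on one side, each engine's native names on the other. The names below are assumptions for illustration only, not unifiedbooster's actual implementation.

```python
# Sketch of unifying hyperparameter aliases across boosting libraries
# (hypothetical mapping; not unifiedbooster's actual code).
ALIASES = {
    "xgboost":          {"n_estimators": "n_estimators", "learning_rate": "learning_rate", "max_depth": "max_depth"},
    "lightgbm":         {"n_estimators": "n_estimators", "learning_rate": "learning_rate", "max_depth": "max_depth"},
    "catboost":         {"n_estimators": "iterations",   "learning_rate": "learning_rate", "max_depth": "depth"},
    "gradientboosting": {"n_estimators": "n_estimators", "learning_rate": "learning_rate", "max_depth": "max_depth"},
}

def translate(engine, **unified):
    """Map unified hyperparameter names to an engine's native names."""
    mapping = ALIASES[engine]
    return {mapping[k]: v for k, v in unified.items()}

print(translate("catboost", n_estimators=100, max_depth=3))
# -> {'iterations': 100, 'depth': 3}
```

With such a table, one set of hyperparameters can be dispatched to any of the four engines without remembering that CatBoost calls them `iterations` and `depth`.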
Copulas for uncertainty quantification in time series forecasting
On Friday (2024-07-26), I presented nnetsauce (“Probabilistic Forecasting with nnetsauce (using Density Estimation, Bayesian inference, Conformal prediction and Vine copulas)”) version 0.23.0 at an sktime (a unified interface for machine learning with time series) meetup. The news for 0.23.0 are: a method cross_val_score: time series cross-valid...
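The time series cross-validation behind a method like cross_val_score can be sketched as rolling-origin splits: the training window grows (or rolls) forward in time, and the test window always lies strictly after it. The helper name below is hypothetical, not nnetsauce's actual implementation.

```python
def rolling_origin_splits(n, initial_window, horizon, step=1):
    """Yield (train_indices, test_indices) for rolling-origin time series CV.

    Hypothetical helper illustrating the idea behind a time series
    cross_val_score; not nnetsauce's actual code.
    """
    start = initial_window
    while start + horizon <= n:
        train = list(range(0, start))          # all observations before the origin
        test = list(range(start, start + horizon))  # the next `horizon` points
        yield train, test
        start += step

splits = list(rolling_origin_splits(n=10, initial_window=6, horizon=2, step=2))
print(splits)
```

Unlike random K-fold splits, every test index here comes after every training index, which is what makes the score an honest estimate of out-of-sample forecasting performance.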
Forecasting uncertainty: sequential split conformal prediction + Block bootstrap (web app)
This post was first submitted to the Applied Quantitative Investment Management group on LinkedIn. It illustrates a recipe implemented in the Python package nnetsauce for time series forecasting uncertainty quantification (through simulation): sequential split conformal prediction + block bootstrap. Underlying algorithm: split data into training set, ca...
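The recipe named in this entry can be sketched end to end: split the series sequentially into training and calibration parts, compute calibration residuals, then block-bootstrap those residuals (to preserve their autocorrelation) and add them to the point forecast to simulate future paths. This is a toy illustration with a naive point forecast, not nnetsauce's exact implementation.

```python
import random

# Sequential split + block-bootstrapped calibration residuals
# (illustrative sketch; not nnetsauce's exact code).
random.seed(1)
series = [10 + 0.1 * t + random.gauss(0, 1) for t in range(120)]
train, calib = series[:80], series[80:]

# "Model": last observed training value (naive forecast, kept simple).
point = train[-1]
residuals = [v - point for v in calib]

def block_bootstrap(resids, horizon, block_size, rng):
    """Resample contiguous blocks of residuals to preserve autocorrelation."""
    out = []
    while len(out) < horizon:
        start = rng.randrange(0, len(resids) - block_size + 1)
        out.extend(resids[start:start + block_size])
    return out[:horizon]

rng = random.Random(2)
horizon, n_sims = 12, 200
sims = [[point + e for e in block_bootstrap(residuals, horizon, 5, rng)]
        for _ in range(n_sims)]

# Pointwise 80% predictive band from the simulated paths.
lower = [sorted(s[h] for s in sims)[int(0.10 * n_sims)] for h in range(horizon)]
upper = [sorted(s[h] for s in sims)[int(0.90 * n_sims)] for h in range(horizon)]
print(lower[0] < upper[0])
```

Sampling residuals in blocks rather than one at a time is the key design choice: it keeps short-range serial dependence in the simulated forecast errors.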
learningmachine for Python (new version)
Last week, I (re)introduced learningmachine, an R package for Machine Learning that includes uncertainty quantification for regression and classification, and explainability through sensitivity analysis. This week, I talk about learningmachine for Python. The Python version is a port of the R package, which means: it’s faster to install if R is al...
My presentation at ISF 2024 conference (slides with nnetsauce probabilistic forecasting news)
Here are the slides of my presentation at the International Symposium on Forecasting today: https://www.researchgate.net/publication/381957724_Probabilistic_Forecasting_with_RandomizedQuasi-Randomized_networks_presentation_at_the_International_Symposium_on_Forecasting_2024 I discussed probabilistic forecasting with quasi-randomized networks, sequent...
10 uncertainty quantification methods in nnetsauce forecasting
This week, I released (Python) version 0.22.4 of nnetsauce. nnetsauce now contains 10 uncertainty quantification methods for time series forecasting: gaussian: simple, fast, but assumes stationarity of Gaussian in-sample residuals and independence in the multivariate case; kde: based on Kernel Density Estimation of in-sample residuals; bootstrap: based...
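The first method in the list, gaussian, can be sketched in a few lines: estimate the standard deviation of in-sample residuals and build symmetric intervals around the point forecast under a Gaussian assumption. This is an illustration of the idea, not nnetsauce's exact code.

```python
import math
import random

# Sketch of the 'gaussian' uncertainty quantification method: intervals from
# the standard deviation of in-sample residuals, assuming they are Gaussian
# and stationary (illustrative; not nnetsauce's implementation).
random.seed(3)
residuals = [random.gauss(0, 1.5) for _ in range(200)]  # stand-in residuals

mu = sum(residuals) / len(residuals)
sd = math.sqrt(sum((r - mu) ** 2 for r in residuals) / (len(residuals) - 1))

point_forecast = 100.0
z = 1.96  # ~95% two-sided Gaussian quantile
lower, upper = point_forecast - z * sd, point_forecast + z * sd
print(lower < point_forecast < upper)
```

This explains the trade-off stated above: the method is simple and fast, but its coverage degrades when residuals are non-Gaussian or serially dependent, which is what the kde and bootstrap alternatives address.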
Forecasting with XGBoost embedded in Quasi-Randomized Neural Networks
Next week, I’ll present nnetsauce’s (univariate and multivariate probabilistic) time series forecasting capabilities at the 44th International Symposium on Forecasting (ISF) 2024. ISF is the premier forecasting conference, attracting the world’s leading forecasting researchers (I don’t only do forecasting though), practitioners, and ...
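The general mechanism for embedding a tabular learner such as XGBoost in a forecasting loop can be sketched as: build lagged features, fit the regressor on them, then forecast recursively by feeding each prediction back in as a new lag. The helper and the stand-in model below are hypothetical illustrations, not the post's exact code.

```python
# Forecasting with a tabular regressor via lagged features and recursive
# prediction (illustrative sketch; hypothetical helper names).
def make_lagged(series, n_lags):
    """Build (X, y) where each row of X holds the n_lags previous values."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(series[t])
    return X, y

class MeanRegressor:
    """Stand-in for a boosted-tree regressor: predicts the mean of the lags."""
    def fit(self, X, y):
        return self
    def predict_one(self, x):
        return sum(x) / len(x)

series = [float(t) for t in range(20)]
n_lags, horizon = 3, 5
X, y = make_lagged(series, n_lags)
model = MeanRegressor().fit(X, y)

history = list(series)
forecasts = []
for _ in range(horizon):              # recursive multi-step forecasting
    x = history[-n_lags:]             # most recent n_lags values
    f = model.predict_one(x)
    forecasts.append(f)
    history.append(f)                 # feed the prediction back as a lag
print(forecasts[0])  # mean of [17.0, 18.0, 19.0] = 18.0
```

Swapping the stand-in model for XGBoost (or any regressor with fit/predict) leaves the loop unchanged, which is what makes this embedding generic.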
Forecasting Monthly Airline Passenger Numbers with Quasi-Randomized Neural Networks
This post is about forecasting airline passenger numbers with quasi-randomized neural networks, and more specifically using nnetsauce’s class MTS. MTS stands for ‘Multivariate Time Series’, but MTS can also be used for univariate time series as shown in this post. The data used here is the famous AirPassengers dataset, a time series of monthly...
Automated hyperparameter tuning using any conformalized surrogate
Bayesian optimization (BO) is widely used for Machine Learning hyperparameter tuning. BO relies mainly on a probabilistic model of the objective function, called the surrogate (generally a Gaussian process that approximates the objective function), improved sequentially, and on an acquisition function used to select the next point to eva...
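Replacing the Gaussian process with a conformalized surrogate can be sketched as follows: fit any cheap surrogate on evaluated points, calibrate a conformal interval half-width on held-out evaluations, and use the conformal lower bound as an optimistic acquisition function for minimization. All names here are hypothetical illustrations, not the post's exact algorithm.

```python
import random

# Hyperparameter tuning with a conformalized surrogate (illustrative sketch;
# hypothetical names, not the post's actual implementation).
random.seed(4)
objective = lambda x: (x - 0.3) ** 2          # "expensive" function to minimize

# Points evaluated so far, split into surrogate-training and calibration sets.
X_obs = [random.random() for _ in range(20)]
y_obs = [objective(x) for x in X_obs]
train_X, train_y = X_obs[:12], y_obs[:12]
cal_X, cal_y = X_obs[12:], y_obs[12:]

def surrogate(x, xs, ys, k=3):
    """k-nearest-neighbour mean as a cheap surrogate model."""
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x))[:k]
    return sum(ys[i] for i in nearest) / k

# Conformal half-width from calibration residuals (roughly 90% level).
scores = sorted(abs(surrogate(x, train_X, train_y) - y)
                for x, y in zip(cal_X, cal_y))
q = scores[int(0.9 * (len(scores) + 1)) - 1]

# Acquisition: smallest conformal lower bound among candidates. (With a
# constant half-width this ranks like the surrogate mean; locally adaptive
# conformal widths would make the exploration term informative.)
candidates = [i / 100 for i in range(101)]
best = min(candidates, key=lambda x: surrogate(x, train_X, train_y) - q)
print(0.0 <= best <= 1.0)
```

The appeal of this setup is that the surrogate can be any regressor at all; the conformal calibration step supplies the uncertainty estimate that a Gaussian process would otherwise provide.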