Publications by T. Moudiki
Automated hyperparameter tuning using any conformalized surrogate
Bayesian optimization (BO) is widely used for Machine Learning hyperparameter tuning. BO relies mainly on two components: a probabilistic model of the objective function called the surrogate (generally a Gaussian process approximating the objective), which is improved sequentially, and an acquisition function used to select the next point to eva...
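The surrogate/acquisition loop described above can be sketched with the classical (non-conformalized) setup: a Gaussian-process surrogate from scikit-learn and a lower-confidence-bound acquisition. Everything here (the toy objective, the grid, the 1.96 multiplier, the iteration count) is illustrative, not the post's actual conformalized method:

```python
# Minimal sketch of Bayesian optimization for minimizing a 1-D toy objective.
# A Gaussian process plays the role of the surrogate; the conformalized
# surrogates discussed in the post would replace the GP below.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def objective(x):
    return np.sin(3 * x) + 0.1 * x**2  # toy function to minimize

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(5, 1))   # initial design points
y = objective(X).ravel()

grid = np.linspace(-3, 3, 200).reshape(-1, 1)  # candidate points
for _ in range(15):
    # refit the surrogate on all evaluations so far
    gp = GaussianProcessRegressor().fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    # lower-confidence-bound acquisition: favor low mean and high uncertainty
    acq = mu - 1.96 * sigma
    x_next = grid[np.argmin(acq)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

best = X[np.argmin(y)]
print(f"best x found: {best[0]:.3f}, objective: {y.min():.3f}")
```

In a hyperparameter-tuning context, `objective` would be a cross-validation loss and `grid` the hyperparameter search space.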
Recognizing handwritten digits with Ridge2Classifier
This post is about Ridge2Classifier, a classifier that I presented 5 years ago in this document. It’s now possible to choose starting values for the (likelihood) optimization algorithm that are solutions from least squares regression. Not always better, but this can be seen as a new hyperparameter. Also, Ridge2Classifier used to fail miserably on digi...
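For a rough feel of the task, scikit-learn's plain `RidgeClassifier` on the digits data can stand in here; it is only an illustration, not Ridge2Classifier itself (which lives in the author's own package):

```python
# Illustrative stand-in: scikit-learn's RidgeClassifier on the handwritten
# digits data set, to show the kind of task the post refers to.
from sklearn.datasets import load_digits
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)  # 8x8 grayscale digit images, 10 classes
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
clf = RidgeClassifier(alpha=1.0).fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.3f}")
```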
Forecasting the Economy
Keep in mind that there’s no hyperparameter tuning in these examples; hyperparameter tuning must be used in practice. I’m looking for reticulate and rpy2 experts to discuss speedups for the installation and loading of this R package (a port of the stable Python version). There’s still room for improvement in this R port, especially in terms of data struct...
A detailed introduction to Deep Quasi-Randomized ‘neural’ networks
A few weeks ago, in #112 and #120, I presented a few Python examples of Deep Quasi-Randomized ‘neural’ networks (QRNs). In this post, I will provide a detailed introduction to this new family of models, with examples in Python and R, and a preprint.
Link to the preprint
Link to a Jupyter Python notebook with a benchmark on 14 data sets: https://g...
Probability of having a credit; using learningmachine
In this post, I examine a data set available in the UCI Machine Learning repository: German Credit. This data set contains the characteristics of 1000 bank clients: explanatory variables, plus a variable indicating whether the client has good or bad chances of obtaining a credit. We will use Machine Learning models available in R package learningmachine....
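The "probability of having a credit" setup (a binary good/bad outcome for 1000 clients, with predicted probabilities per client) can be sketched in Python with a logistic model on synthetic data; this is only a stand-in, not the post's learningmachine code nor the actual German Credit file:

```python
# Sketch of estimating a per-client probability of a binary credit outcome,
# on a synthetic stand-in for the German Credit data (1000 clients).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
proba = model.predict_proba(X_test)[:, 1]  # estimated P(good credit) per client
print(f"mean predicted probability: {proba.mean():.3f}")
```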
mlsauce’s `v0.18.2`: various examples and benchmarks with dimension reduction
This week’s post is about mlsauce (again), and LSBoost in particular. No new working paper (still working on it), but:
- An updated R version, working at least on Linux and macOS (Windows users: if it doesn’t work on your machine, give the Windows Subsystem for Linux, WSL, a try)
- A new, updated documentation page
- My first StackOverflow question ever (sti...
mlsauce’s `v0.17.0`: boosting with Elastic Net, polynomials and heterogeneity in explanatory variables
Last week in #135, I talked about mlsauce’s v0.13.0, and LSBoost in particular. When using LSBoost, it’s now possible to:
- Obtain prediction intervals for regression, notably by employing Split Conformal Prediction.
- Take into account a priori heterogeneity in explanatory variables through clustering.
In v0.17.0, I added 2 new features to LSBoo...
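Split Conformal Prediction, mentioned above, is a generic recipe: hold out a calibration set, compute absolute residuals on it, and use a quantile of those residuals as the interval half-width. A minimal sketch with a plain scikit-learn regressor on made-up data (not mlsauce's LSBoostRegressor):

```python
# Minimal split conformal prediction intervals for regression.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=600, n_features=10, noise=10.0, random_state=0)
# split the data: proper training set vs calibration set
X_train, X_cal, y_train, y_cal = train_test_split(
    X, y, test_size=0.5, random_state=0
)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# conformity scores: absolute residuals on the calibration set
scores = np.abs(y_cal - model.predict(X_cal))
alpha = 0.1  # target 90% coverage
n = len(scores)
# finite-sample-corrected quantile of the scores
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)

# prediction interval for new points: point prediction +/- q
pred = model.predict(X[:5])
lower, upper = pred - q, pred + q

# sanity check: fraction of calibration points inside their interval
coverage = np.mean(scores <= q)
print(f"calibration coverage: {coverage:.3f}")
```

Under exchangeability, this construction guarantees at least `1 - alpha` coverage on new points, whatever the underlying regressor.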
mlsauce’s `v0.13.0`: taking into account inputs heterogeneity through clustering
Last week in #134, I talked about mlsauce’s v0.12.0, and LSBoost in particular. As shown in that post, it’s now possible to obtain prediction intervals for the regression model, notably by employing Split Conformal Prediction. Right now (I’m looking for ways to fix this), the best way to install the package is to use the development version: pip instal...
mlsauce’s `v0.12.0`: prediction intervals for LSBoostRegressor
Many of you (> 2600 reads so far) are reading this document on LSBoost, a gradient boosting algorithm for penalized nonlinear least squares. This never ceases to amaze me, because that document is quite… empty 🙂
mlsauce’s v0.12.0 includes prediction intervals for the LSBoostRegressor in particular. These prediction intervals are obtained thro...