Publications by T. Moudiki

Maximizing your tip as a waiter (Part 2)

09.07.2020

In Part 1 of “Maximizing your tip as a waiter”, I talked about a target-based categorical encoder for Statistical/Machine Learning, first introduced in this post. The tips example dataset was used for that purpose, and we’ll use the same dataset again today. Here is a snippet of tips: Based on this information, how would you maximize your t...

5234 sym Python (1834 sym/4 pcs) 18 img
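The post above relies on target-based (mean) encoding of categorical variables. As a minimal sketch of the idea — not the post's actual encoder, and with illustrative column values mimicking the tips dataset — each category is replaced by the mean of the target within that category:

```python
# Minimal sketch of target-based (mean) encoding for one categorical feature.
# The "day"/"tip" values below are made up to mimic the tips dataset.
from collections import defaultdict

def target_encode(categories, targets):
    """Replace each category by the mean of the target within that category."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for c, t in zip(categories, targets):
        sums[c] += t
        counts[c] += 1
    means = {c: sums[c] / counts[c] for c in sums}
    return [means[c] for c in categories], means

days = ["Sun", "Sun", "Sat", "Sat", "Sun"]
tips = [3.0, 4.0, 2.0, 3.0, 5.0]
encoded, mapping = target_encode(days, tips)
print(mapping)  # {'Sun': 4.0, 'Sat': 2.5}
```

In practice one would also regularize these per-category means (e.g. smoothing toward the global mean) to avoid leakage on rare categories.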

nnetsauce version 0.5.0, randomized neural networks on GPU

16.07.2020

nnetsauce is a general-purpose tool for Statistical/Machine Learning, in which pattern recognition is achieved using quasi-randomized networks. A new version, 0.5.0, is out on PyPI and for R: Install using pip (stable version): pip install nnetsauce --upgrade Install from GitHub (development version): pip install git+https://github.com/...

2755 sym R (238 sym/5 pcs) 2 img
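The core idea behind quasi-randomized networks can be sketched in a few lines: a hidden layer with fixed random (untrained) weights produces engineered features, and only a regularized linear model is fitted on top. This is an illustrative sketch of the concept, not nnetsauce's actual API:

```python
# Sketch of a quasi-randomized network: a fixed random hidden layer feeds
# engineered features into a regularized least-squares fit.
# Illustrative only; not nnetsauce's implementation.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))                       # toy inputs
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

W = rng.normal(size=(3, 10))                        # random hidden weights, never trained
H = np.maximum(X @ W, 0.0)                          # ReLU hidden features
Z = np.hstack([X, H])                               # original + randomized features

lam = 0.1                                           # ridge penalty
coef = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
pred = Z @ coef
mse = float(np.mean((pred - y) ** 2))
print(round(mse, 4))
```

Because only the output layer is estimated, training reduces to a single linear solve, which is what makes these models cheap enough to run at scale (including on GPU).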

LSBoost: Explainable ‘AI’ using Gradient Boosted randomized networks (with examples in R and Python)

23.07.2020

Disclaimer: I have no affiliation with The Next Web (cf. online article). A few weeks ago I read this interesting and accessible article about explainable AI, discussing more specifically the issues around self-explainable AI. I’m no longer sure there’s a mandatory need for AI models that explain themselves, as there are model-agnostic tools...

3277 sym R (7575 sym/10 pcs) 4 img
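LSBoost combines gradient boosting in the LS_Boost style with randomized-network weak learners. The sketch below illustrates that combination under simplifying assumptions (tanh random features, ridge-fitted output weights, made-up data); it is in the spirit of LSBoost, not the package's actual implementation:

```python
# Sketch of LS_Boost-style gradient boosting where each weak learner is a
# randomized network fitted by regularized least squares on the residuals.
# Illustrative assumptions throughout; not the LSBoost package's code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]                 # toy nonlinear target

def fit_weak(X, r, rng, n_hidden=20, lam=1.0):
    """One weak learner: fixed random hidden layer, ridge fit on residuals r."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    H = np.tanh(X @ W)
    coef = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ r)
    return W, coef

nu = 0.5                                            # learning rate
pred = np.zeros_like(y)
learners = []
for _ in range(50):
    W, coef = fit_weak(X, y - pred, rng)            # fit current residuals
    pred += nu * np.tanh(X @ W) @ coef              # add shrunken correction
    learners.append((W, coef))

mse = float(np.mean((y - pred) ** 2))
print(round(mse, 4))
```

Each boosting round fits the current residuals, so the ensemble's error decreases monotonically on the training set; the regularization of each weak learner is what keeps the model's components interpretable and stable.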

Explainable ‘AI’ using Gradient Boosted randomized networks Pt2 (the Lasso)

30.07.2020

This post is about LSBoost, an Explainable ‘AI’ algorithm that uses Gradient Boosted randomized networks for pattern recognition. As discussed last week, LSBoost is a cousin of GFAGBM’s LS_Boost. In LSBoost, more specifically, the so-called weak learners from LS_Boost are based on randomized neural networks’ components and var...

1481 sym R (5375 sym/4 pcs) 6 img
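Part 2 swaps the ridge penalty for the Lasso, whose defining ingredient is the soft-thresholding operator: it shrinks coefficients toward zero and sets the small ones exactly to zero, which is how sparsity arises in each weak learner. A minimal sketch (standard Lasso machinery, not the package's code):

```python
# Sketch: soft-thresholding, the proximal operator behind the Lasso penalty.
# Standard textbook operator; shown here to illustrate why Lasso-fitted
# weak learners end up with sparse coefficients.
import numpy as np

def soft_threshold(z, lam):
    """Shrink each entry of z toward 0 by lam, zeroing entries with |z| <= lam."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

shrunk = soft_threshold(np.array([3.0, -0.5, 1.2]), 1.0)
print(shrunk)
```

Here the middle coefficient (magnitude 0.5, below the threshold 1.0) is zeroed out, while the others are shrunk by exactly 1.0 — the mechanism that prunes uninformative randomized features from each learner.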

Forecasting lung disease progression

01.10.2020

In the OSIC Pulmonary Fibrosis Progression competition on Kaggle, participants are tasked with determining the likelihood of recovery (prognosis) for several patients affected by a lung disease. For each patient, the maximum volume of air they can exhale after a maximum inhalation (FVC, Forced Vital Capacity) is measured over the weeks, for approximately...

2833 sym R (2861 sym/4 pcs) 10 img
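Since FVC is measured repeatedly over the weeks, a natural baseline for this task is a per-patient linear trend: fit FVC against week number and extrapolate. A minimal sketch with made-up measurements (not the post's model or data):

```python
# Sketch: per-patient linear trend of FVC (mL) over weeks, a simple baseline
# for forecasting lung disease progression. Measurements below are invented
# for illustration.
import numpy as np

weeks = np.array([0.0, 4.0, 8.0, 12.0])
fvc = np.array([2800.0, 2750.0, 2700.0, 2650.0])    # hypothetical FVC readings

slope, intercept = np.polyfit(weeks, fvc, 1)        # least-squares line
print(round(slope, 2), round(intercept, 2))         # -12.5 2800.0 (mL per week, baseline)

forecast_week_20 = slope * 20 + intercept           # extrapolate to week 20
print(round(forecast_week_20, 1))                   # 2550.0
```

The competition additionally asks for a confidence on each prediction, so in practice the point forecast above would be paired with an uncertainty estimate rather than used alone.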