Publications by Jessica McPhaul (student, MSDS 6372, Jacob Turner)

Module 9 Condensed

03.03.2025

Study Guide: Support Vector Machines (SVMs) – Module 9 1. Introduction to Support Vector Machines Support Vector Machines (SVMs) are powerful supervised learning algorithms primarily used for classification tasks. The goal of SVM is to find an optimal separating hyperplane that best divides different classes of data points. Key Concepts: Li...

19667 sym Python (1997 sym/4 pcs)
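
To make the separating-hyperplane idea in this entry concrete, here is a minimal sketch, assuming synthetic 2-D blob data and scikit-learn's linear SVC; the dataset and parameter values are illustrative and not taken from the study guide.

```python
# Minimal sketch of the SVM idea above: fit a linear SVM on synthetic
# 2-D data and inspect the separating hyperplane it learns.
# (Illustrative assumptions: make_blobs data, C=1.0.)
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two well-separated clusters, so a linear hyperplane exists
X, y = make_blobs(n_samples=200, centers=2, cluster_std=1.0, random_state=0)

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# The learned hyperplane is w . x + b = 0
w, b = clf.coef_[0], clf.intercept_[0]
print("hyperplane: %.3f*x1 + %.3f*x2 + %.3f = 0" % (w[0], w[1], b))
print("number of support vectors:", clf.support_vectors_.shape[0])
```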

QTW Module 9 Vector Machines

02.03.2025

Study Guide: Support Vector Machines (SVMs) – Module 9 1. Introduction to Support Vector Machines SVM is a powerful supervised learning algorithm primarily used for classification tasks. The main goal of SVM is to find an optimal separating hyperplane that best divides different classes of data points. Key Concepts: Linear Separability: Wh...

24548 sym Python (2818 sym/2 pcs)
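
Since this entry highlights linear separability, here is a small hedged sketch contrasting a linear SVM with an RBF-kernel SVM on data that no straight line can separate; make_circles and the parameter choices are illustrative assumptions, not from the published guide.

```python
# Contrast linear separability with the kernel trick: the same SVM
# interface handles non-separable data once an RBF kernel is used.
# (Illustrative assumptions: concentric-circles data, C=1.0, gamma="scale".)
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Concentric circles: no straight line separates the two classes
X, y = make_circles(n_samples=300, noise=0.05, factor=0.5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ("linear", "rbf"):
    clf = SVC(kernel=kernel, C=1.0, gamma="scale")
    clf.fit(X_tr, y_tr)
    print(kernel, "test accuracy:", round(clf.score(X_te, y_te), 3))
```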

training stuff

25.02.2025

Step 1: Reinstall Graphormer Since your environment has inconsistencies, let’s properly clean and reinstall Graphormer. 1A. Remove Previous Installations pip uninstall -y graphormer fairseq fairseq2 fairseq2n torch torchvision torchaudio pip cache purge 1B. Install Dependencies pip install torch torchvision torchaudio pip install numpy netwo...

9746 sym R (2375 sym/7 pcs)
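
A small optional verification sketch for the reinstall described above (my addition, not part of the original post): after the pip commands finish, confirm the key packages import cleanly and report their versions. The package list mirrors the entry; whether each installed build exposes __version__ is an assumption.

```python
# Verify the reinstalled environment: try importing each package named in
# the reinstall steps and print its version (or the import error).
import importlib

for name in ("torch", "torchvision", "torchaudio", "fairseq", "graphormer"):
    try:
        mod = importlib.import_module(name)
        print(f"{name}: {getattr(mod, '__version__', 'no __version__ attribute')}")
    except ImportError as exc:
        print(f"{name}: NOT importable ({exc})")

# CUDA availability matters for Graphormer training runs
try:
    import torch
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    pass
```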

fuckkkkkkkkk

24.02.2025

Alright, here’s the step-by-step fix to reinstall Graphormer and start training for your Capstone project. Step 1: Reinstall Graphormer Since your environment is messy after hours of installing Fairseq, let’s clean install Graphormer properly. 1A. Remove Any Old Graphormer Installation Run this first to remove old installations: pip uninst...

6085 sym Python (1268 sym/4 pcs)
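
Because this entry repeats the same cleanup commands, here is a hedged convenience sketch (my addition, not the author's script) that wraps the quoted pip uninstall and cache-purge steps in Python via subprocess, so the cleanup can be re-run in one step. The package list mirrors the entry above.

```python
# Re-runnable wrapper around the pip cleanup commands quoted in the entry.
import subprocess
import sys

PACKAGES = ["graphormer", "fairseq", "fairseq2", "fairseq2n",
            "torch", "torchvision", "torchaudio"]

def run(cmd):
    """Echo and run a command with the current interpreter's pip."""
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=False)

run([sys.executable, "-m", "pip", "uninstall", "-y", *PACKAGES])
run([sys.executable, "-m", "pip", "cache", "purge"])
```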

boostiing1

24.02.2025

Below is the first section of the study guide on Boosting and a Boosting Walk-through. I drew from the Elements of Statistical Learning (ESL) and the Module 8 Asynchronous transcripts you provided. I am writing in first person and keeping the material copy/paste friendly. I include references to the text (ESL, The Elements of Statistical Learning) ...

97553 sym

QTW - Mod 8

24.02.2025

Section 1: Boosting for the study guide. Boosting Study Guide 1. Introduction to Boosting Boosting is a machine learning ensemble technique that converts weak learners into a strong learner by iteratively adjusting their weights based on errors from previous models. Unlike bagging, which works with independent models, boosting models are train...

128174 sym Python (22256 sym/90 pcs) 11 tbl
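
As a minimal illustration of the weight-adjusting behaviour described in this entry, the sketch below fits AdaBoost, whose default weak learner in scikit-learn is a depth-1 decision stump that gets combined with re-weighted successors; the synthetic dataset and hyperparameters are illustrative assumptions, not values from the study guide.

```python
# Boosting sketch: AdaBoost re-weights training examples after each weak
# learner so that later learners focus on earlier mistakes.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Default base learner is a depth-1 decision tree (a "stump")
boost = AdaBoostClassifier(n_estimators=100, random_state=0)
boost.fit(X_tr, y_tr)
print("AdaBoost test accuracy:", round(boost.score(X_te, y_te), 3))
```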

JMcPhaul_caseStudy3_spamAssassin_QTW

18.02.2025

Case Study 3: Building a Spam Classifier Using Naïve Bayes and Clustering Jessica McPhaul SMU - 7333 - Quantifying the World Date: February 17, 2025 Naïve Bayes Formula Given an email with words \(w_1, w_2, ..., w_n\), the probability that it belongs to spam (\(S\)) is computed as: \[ P(S | w_1, w_2, ..., w_n) = \frac{P(S) \prod_{i=1}^{n} P(w...

103322 sym Python (23380 sym/1 pcs) 2 img 12 tbl
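
Here is a minimal sketch of the quoted Naïve Bayes formula in code, assuming a bag-of-words representation and a tiny toy corpus (the case study itself works with the SpamAssassin emails): CountVectorizer supplies the word counts w_1 ... w_n and MultinomialNB estimates P(S) and P(w_i | S).

```python
# Toy Naive Bayes spam classifier illustrating the posterior formula above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

emails = [
    "win a free prize now", "cheap meds limited offer",   # spam
    "meeting agenda for monday", "lunch with the team",   # ham
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = ham

vec = CountVectorizer()
X = vec.fit_transform(emails)          # word counts w_1 ... w_n per email
clf = MultinomialNB().fit(X, labels)   # estimates P(S) and P(w_i | S)

test = vec.transform(["free offer, claim your prize"])
print("P(ham), P(spam):", clf.predict_proba(test)[0].round(3))
```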

spam dupe?

18.02.2025

Predicting Email Spam: A Study Guide with Mathematical and Coding Representation 1. Introduction to Spam Detection Definition Spam detection is a binary classification problem where emails are categorized as either spam (junk) or ham (legitimate). The problem is solved using machine learning techniques, leveraging statistical patterns in emai...

10532 sym Python (1438 sym/2 pcs)
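
The SpamAssassin entries here quote the Naïve Bayes posterior as a product of word likelihoods. A standard, hedged rewriting (my addition, not taken from the original write-up) evaluates it in log space so that products of many small probabilities do not underflow; H denotes the ham class.

```latex
% Log-space form of the Naive Bayes rule quoted in the case-study entries.
\[
\log P(S \mid w_1, \dots, w_n)
  = \log P(S) + \sum_{i=1}^{n} \log P(w_i \mid S) - \log P(w_1, \dots, w_n)
\]
% The last term is identical for spam and ham, so classification only
% compares the class log-scores:
\[
\hat{y} = \arg\max_{c \in \{S, H\}}
          \Big( \log P(c) + \sum_{i=1}^{n} \log P(w_i \mid c) \Big)
\]
```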

CaseStudy3_SpamAssassin

18.02.2025

Case Study 3: Building a Spam Classifier Using Naïve Bayes and Clustering Jessica McPhaul SMU - 7333 - Quantifying the World Date: February 17, 2025 Naïve Bayes Formula Given an email with words \(w_1, w_2, ..., w_n\), the probability that it belongs to spam (\(S\)) is computed as: \[ P(S | w_1, w_2, ..., w_n) = \frac{P(S) \prod_{i=1}^{n} P(w...

56314 sym 2 img 11 tbl
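
The case-study title pairs Naïve Bayes with clustering; as a hedged sketch of the clustering half (my illustration, not the author's pipeline), the snippet below groups TF-IDF email vectors with k-means. The toy corpus and k = 2 are assumptions.

```python
# Cluster bag-of-words email vectors with k-means and inspect assignments.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

emails = [
    "win a free prize now", "cheap meds limited offer",
    "meeting agenda for monday", "lunch with the team",
]
X = TfidfVectorizer().fit_transform(emails)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster assignments:", km.labels_)
```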

bagging

17.02.2025

Bagging: A Study Guide with Mathematical and Coding Representation 1. Introduction to Bagging Definition Bagging (Bootstrap Aggregating) is an ensemble learning technique that improves model stability and accuracy by training multiple models on different random subsets of data and then aggregating their predictions. Bagging is widely used in ...

8321 sym Python (1476 sym/2 pcs)
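
To make the bootstrap-and-aggregate idea in this entry concrete, here is a minimal sketch assuming a synthetic dataset and scikit-learn's BaggingClassifier, whose default base learner is a decision tree; the parameter values are illustrative, not from the study guide.

```python
# Bagging sketch: train many trees on bootstrap resamples and aggregate
# their votes, then compare against a single tree.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
bagged = BaggingClassifier(
    n_estimators=100,
    bootstrap=True,      # sample training rows with replacement
    random_state=0,
).fit(X_tr, y_tr)        # default base learner is a decision tree

print("single tree accuracy:", round(single.score(X_te, y_te), 3))
print("bagged trees accuracy:", round(bagged.score(X_te, y_te), 3))
```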