
📉 Week 4: Gradient Descent for Sales Prediction

🧠 Topic: Calculus + Probability Foundations in Machine Learning

This week, I implemented gradient descent from scratch to predict product sales from advertising spending across TV, Radio, and Newspaper channels.
This project demonstrates how calculus (gradients) and probability/statistics (error minimization) combine to form the backbone of machine learning optimization.


📂 Dataset

Advertising Dataset (UCI Repository)
Each record includes:

  • TV — advertising budget spent on TV
  • Radio — advertising budget spent on Radio
  • Newspaper — advertising budget spent on Newspaper
  • Sales — resulting sales value

Source: UCI ML Repository
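
As a minimal sketch of what the raw data looks like once loaded with pandas — the rows below are made-up illustrative values following the schema above, and the file name `Advertising.csv` in the comment is an assumption:

```python
import pandas as pd

# Illustrative rows matching the dataset's schema; these values are made up.
df = pd.DataFrame({
    "TV":        [230.1, 44.5, 17.2],
    "Radio":     [37.8, 39.3, 45.9],
    "Newspaper": [69.2, 45.1, 69.3],
    "Sales":     [22.1, 10.4, 9.3],
})

# In the actual project the frame would instead come from something like:
# df = pd.read_csv("Advertising.csv")

print(df.describe())  # quick sanity check of ranges before normalization
```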


🧮 Project Steps

  1. Loaded the Advertising dataset using pandas.
  2. Normalized features to improve gradient descent convergence.
  3. Initialized random weights and bias.
  4. Implemented gradient descent manually:
    • Predicted sales
    • Computed cost using Mean Squared Error (MSE)
    • Updated weights and bias using gradients
  5. Plotted cost reduction over iterations.
  6. Visualized regression line comparing predictions vs actual sales.
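
Steps 2–4 can be sketched as follows. This is an illustrative implementation, not the project's exact code; the learning rate and iteration count are assumed defaults:

```python
import numpy as np

def normalize(X):
    """Step 2: z-score normalization, column-wise, to speed up convergence."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Fit y ≈ X @ w + b by minimizing MSE with batch gradient descent."""
    rng = np.random.default_rng(0)
    n_samples, n_features = X.shape
    w = rng.normal(scale=0.01, size=n_features)   # step 3: random weights
    b = 0.0                                       # step 3: bias
    costs = []
    for _ in range(n_iters):
        y_pred = X @ w + b                        # step 4: predict sales
        error = y_pred - y
        costs.append(np.mean(error ** 2))         # step 4: MSE cost
        grad_w = (2 / n_samples) * X.T @ error    # step 4: gradients of MSE
        grad_b = 2 * np.mean(error)
        w -= lr * grad_w                          # step 4: update weights
        b -= lr * grad_b                          # step 4: update bias
    return w, b, costs
```

Because the features are normalized first, a single learning rate works for all three channels; without normalization, the differing budget scales would force a much smaller step size.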

📊 Results

🔹 Gradient Descent Convergence

Shows how the model error (cost) decreases with each iteration, indicating that the optimization converged.

Gradient Descent Convergence

🔹 Actual vs Predicted Sales Visualization

This plot compares the model’s predicted sales to the actual sales values from the test dataset.

Each point represents one observation.

The closer the points are to the diagonal dashed line, the better the model’s predictions.

A perfectly accurate model would have all points exactly on that line.

Sales Prediction


💡 Insights & Observations

  • The cost vs iteration plot shows smooth convergence, indicating a well-tuned learning rate.
  • TV and Radio had stronger predictive influence on sales than Newspaper.
  • Gradient Descent effectively optimized weights to minimize prediction error.
  • This exercise reinforces how derivatives (gradients) guide learning in models.
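
The last point can be made concrete with a small check (not part of the original project): the analytic MSE gradient used in the update rule agrees with a numerical finite-difference estimate.

```python
import numpy as np

def mse(w, X, y):
    """Mean squared error of the linear model X @ w against targets y."""
    return np.mean((X @ w - y) ** 2)

def mse_grad(w, X, y):
    """Analytic gradient: d/dw mean((Xw - y)^2) = (2/n) X^T (Xw - y)."""
    return (2 / len(y)) * X.T @ (X @ w - y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = rng.normal(size=50)
w = rng.normal(size=3)

# Central finite differences, one coordinate at a time.
eps = 1e-6
numeric = np.array([
    (mse(w + eps * e, X, y) - mse(w - eps * e, X, y)) / (2 * eps)
    for e in np.eye(3)
])

print(np.allclose(numeric, mse_grad(w, X, y), atol=1e-6))
```

If the derivative were wrong, the two estimates would disagree and the weight updates would point in the wrong direction — this kind of gradient check is a standard debugging step when implementing optimization by hand.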

🧰 Requirements

Install all dependencies using:

```shell
pip install -r requirements.txt
```

