Comprehensive implementations of Deep Learning, Quantum AI, Reinforcement Learning, and classical AI algorithms using TensorFlow and Qiskit. Projects include medical image classification, clustering, GANs, QNNs, and search algorithms. All experiments are presented with videos, slides, and runnable Colab notebooks.
- Part I: Skin disease classification (Nevus, Melanoma, Carcinoma) using VGG16 and ResNet50, with and without data augmentation.
- Part II: Implementation of DCGAN on CIFAR-10, with optimization comparison (Gradient Ascent vs. Descent).
- Part III: Transfer Learning strategies for small datasets (both similar and dissimilar to pre-trained domains).
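The flip-based data augmentation used alongside the VGG16/ResNet50 experiments can be sketched in plain NumPy (a stand-in for the TensorFlow augmentation pipeline; the array shapes here are illustrative, not the actual image dimensions):

```python
import numpy as np

def augment_batch(images):
    """Double a batch by appending horizontally flipped copies.

    images: array of shape (N, H, W, C).
    """
    flipped = images[:, :, ::-1, :]  # flip along the width axis
    return np.concatenate([images, flipped], axis=0)

# toy "images": 2 samples of shape (2, 3, 1)
batch = np.arange(12).reshape(2, 2, 3, 1).astype(float)
aug = augment_batch(batch)
print(aug.shape)  # batch size doubles from 2 to 4
```

In the real pipeline the same effect comes from augmentation layers applied during training rather than a precomputed doubled batch.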
- Part I: Clustering algorithms (Hierarchical, DBSCAN, K-Means) applied to investor profiles (age, income, risk).
- Part II: Decision Trees & Random Forests — comparing Gini vs. Entropy, and performing regression on the Boston Housing dataset.
- Part III: Quantum AI using Qiskit — includes Basis/Amplitude encoding and Quantum Neural Networks (QNNs).
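The K-Means step from Part I reduces to alternating two updates: assign points to the nearest centroid, then move each centroid to the mean of its points. A minimal NumPy sketch on toy investor profiles (the values are hypothetical):

```python
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    """Plain K-Means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centroid (Euclidean distance)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# toy investor profiles: (age, income in k$, risk score)
X = np.array([[25, 40, 8], [28, 45, 9], [60, 90, 2], [62, 95, 3]], float)
labels, _ = kmeans(X, k=2)
print(labels)  # the two young/high-risk and two older/low-risk investors separate
```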
- Part I: Policy Iteration, Value Iteration, and Deep Q-Learning for environments like CartPole and FrozenLake.
- Part II: RNN and LSTM models — intuition, backpropagation, and performance metrics (RMSE, R²).
- Part III: Classic search and CSP algorithms including A*, Min-Max, DFS/BFS/UCS, and Map Coloring.
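Value iteration from Part I is the repeated Bellman optimality backup, V(s) ← max_a [R(s,a) + γ Σ_s' P(s'|s,a) V(s')]. A sketch on a hypothetical 2-state MDP (toy transitions and rewards, not FrozenLake itself):

```python
import numpy as np

# Toy MDP: 2 states, 2 actions. P[s, a, s'] = transition probs, R[s, a] = rewards.
P = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.0, 1.0], [1.0, 0.0]]])
R = np.array([[0.0, 1.0],
              [2.0, 0.0]])
gamma = 0.9

V = np.zeros(2)
for _ in range(200):
    # Bellman optimality backup over all (state, action) pairs at once
    Q = R + gamma * P @ V        # shape (2, 2): value of each (s, a)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmax(axis=1)        # greedy policy w.r.t. the converged values
print(V, policy)
```

The same backup structure underlies Deep Q-Learning, except the table `Q` is replaced by a neural network trained on sampled transitions.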
- Focus on optimizers: SGD + Momentum, Adagrad, RMSprop
- Apply SMOTE to balance imbalanced datasets
- DNN classification with visual performance analysis (loss/accuracy plots)
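The three optimizer update rules can be compared side by side on the toy objective f(w) = w² (learning rates and iteration counts below are illustrative choices, not tuned values):

```python
import numpy as np

grad = lambda w: 2.0 * w  # gradient of f(w) = w^2

def sgd_momentum(w, lr=0.1, beta=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)           # accumulate a velocity term
        w = w - lr * v
    return w

def adagrad(w, lr=0.5, eps=1e-8, steps=1000):
    g2 = 0.0
    for _ in range(steps):
        g = grad(w)
        g2 += g * g                      # lifetime sum of squared gradients
        w = w - lr * g / (np.sqrt(g2) + eps)
    return w

def rmsprop(w, lr=0.1, rho=0.9, eps=1e-8, steps=200):
    g2 = 0.0
    for _ in range(steps):
        g = grad(w)
        g2 = rho * g2 + (1 - rho) * g * g  # exponential moving average instead
        w = w - lr * g / (np.sqrt(g2) + eps)
    return w

results = {opt.__name__: opt(5.0) for opt in (sgd_momentum, adagrad, rmsprop)}
for name, w in results.items():
    print(name, round(w, 6))
```

The contrast to notice: Adagrad's denominator only grows (steps shrink forever), while RMSprop's moving average lets the step size stay roughly `lr`, so it hovers near the minimum rather than settling exactly on it.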
Goal: Understand and implement core ML algorithms with detailed examples and explanations.
- K-Nearest Neighbors (KNN)
  - Manual Euclidean distance examples
  - Weighted and unweighted KNN predictions
  - Comparison of k=1 vs. k=3 neighbors
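The Euclidean-distance computation and the k=1 vs. k=3 majority vote can be sketched directly, along with the inverse-distance-weighted variant (toy 2-D points; all values illustrative):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k):
    """Predict the label of x by majority vote among its k nearest neighbors."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

def knn_weighted(X_train, y_train, x, k):
    """Weight each neighbor's vote by 1/distance instead of counting equally."""
    dists = np.linalg.norm(X_train - x, axis=1)
    votes = {}
    for i in np.argsort(dists)[:k]:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / (dists[i] + 1e-12)
    return max(votes, key=votes.get)

X = np.array([[1.0, 1.0], [1.5, 1.2], [5.0, 5.0], [5.2, 4.8], [4.9, 5.3]])
y = np.array([0, 0, 1, 1, 1])
x_new = np.array([4.5, 4.5])

pred1 = knn_predict(X, y, x_new, k=1)
pred3 = knn_predict(X, y, x_new, k=3)
print(pred1, pred3)  # both vote for the nearby class-1 cluster
```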
- Support Vector Machines (SVM)
  - Hard vs. Soft Margin classification
  - Use of kernels (especially RBF kernel)
  - Confidence scoring, margin visualization, and nonlinear transformations
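The RBF kernel at the heart of the nonlinear SVM is a one-liner: similarity decays exponentially with squared Euclidean distance, which is what lets the classifier carve out nonlinear boundaries. A sketch (the `gamma` value is an illustrative choice):

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    """RBF kernel: k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

a = np.array([0.0, 0.0])
b = np.array([0.1, 0.0])   # close to a
c = np.array([3.0, 4.0])   # far from a

near, far = rbf_kernel(a, b), rbf_kernel(a, c)
print(near, far)  # near-identical points score close to 1, distant ones near 0
```

In an SVM, these kernel values replace the dot products in the decision function, so the margin is computed in an implicit high-dimensional feature space.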
- Gradient Boosted Trees (GBT)
  - Sequential boosting with shallow trees
  - Residual error correction
  - Iterative training for minimizing prediction errors
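The residual-correction loop can be sketched with depth-1 trees (stumps) on a toy 1-D regression problem: each round fits a stump to whatever the ensemble still gets wrong (learning rate and round count are illustrative):

```python
import numpy as np

def fit_stump(x, residual):
    """Find the threshold split on x that best fits the residuals (squared error)."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = np.sum((residual - pred) ** 2)
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda q: np.where(q <= t, lv, rv)

# toy 1-D regression target with two plateaus
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.0, 1.2, 0.9, 3.0, 3.1, 2.9])

lr, pred = 0.5, np.zeros_like(y)
for _ in range(50):
    residual = y - pred              # what the ensemble still gets wrong
    stump = fit_stump(x, residual)
    pred += lr * stump(x)            # each stump corrects part of the residual

print(np.round(pred, 2))
```

After enough rounds the summed stumps closely match `y`, which is the "sequential boosting with shallow trees" idea in miniature.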
- Extreme Gradient Boosting (XGBoost)
  - Parallel tree boosting
  - Histogram-based bucketing
  - Built-in regularization and optimized memory usage
  - Handling sparse data and out-of-core computation
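Histogram-based bucketing replaces exact split search over every feature value with a scan over a small number of bin boundaries. A simplified NumPy sketch of the idea (bin count is arbitrary, and the gain formula is a plain variance-reduction score, simpler than XGBoost's regularized objective):

```python
import numpy as np

def histogram_best_split(x, grad, n_bins=8):
    """Bucket feature values into bins, then scan bin edges for the split
    that maximizes variance reduction of the per-sample gradients."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    bins = np.digitize(x, edges[1:-1])                     # bin index 0..n_bins-1
    g_sum = np.bincount(bins, weights=grad, minlength=n_bins)
    cnt = np.bincount(bins, minlength=n_bins)
    G, N = g_sum.sum(), cnt.sum()
    best_gain, best_edge = -np.inf, None
    gl, nl = 0.0, 0
    for b in range(n_bins - 1):                            # only n_bins-1 candidates
        gl += g_sum[b]; nl += cnt[b]
        nr = N - nl
        if nl == 0 or nr == 0:
            continue
        gain = gl**2 / nl + (G - gl)**2 / nr - G**2 / N    # variance reduction
        if gain > best_gain:
            best_gain, best_edge = gain, edges[b + 1]
    return best_edge, best_gain

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
grad = np.where(x < 4, -1.0, 1.0) + rng.normal(0, 0.1, 200)  # true split near 4
edge, gain = histogram_best_split(x, grad)
print(round(edge, 2), round(gain, 2))
```

The payoff is that per-bin sums are cheap to build in one pass and to aggregate across threads, which is what enables the parallel, memory-efficient training listed above.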
- AI/ML: CNN, GAN, Transfer Learning, Clustering, QNN, RL, RNN/LSTM
- Tools: TensorFlow, Qiskit, Google Colab
- Mathematics: Bellman Equation, Entropy/Gini, Quantum Encoding, Backpropagation, Policy Iteration