# New ML Algorithm‐01
## How to Build the Prototype's New ML Algorithm From Scratch
- Create a Population: Generate a list of many random potential solutions to your problem.
- Write a Fitness Function: This is crucial. You code a function that scores how good each solution in your population is.
- Implement Selection: Write logic to select the "fittest" individuals from the population to be parents for the next generation.
- Implement Crossover and Mutation: Code a function that takes two parent solutions and combines their parts to create a child (crossover). Then, randomly tweak a small part of that child's solution (mutation).
- Repeat: Wrap this all in a loop that runs for hundreds or thousands of generations, until the solutions stop improving.
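The steps above can be sketched as a minimal genetic algorithm in Python. The problem here is hypothetical (maximize the number of 1-bits in a fixed-length bitstring, often called OneMax), and the population size, mutation rate, and generation count are illustrative choices, not values from the original text:

```python
import random

random.seed(0)  # reproducible runs

TARGET_LEN = 20      # hypothetical problem size: 20-bit strings
POP_SIZE = 50
GENERATIONS = 200
MUTATION_RATE = 0.01

def fitness(individual):
    # OneMax: score is the number of 1-bits; higher is better.
    return sum(individual)

def random_individual():
    return [random.randint(0, 1) for _ in range(TARGET_LEN)]

def select(population):
    # Tournament selection: the fitter of two random picks becomes a parent.
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover: splice the parents at a random cut point.
    point = random.randint(1, TARGET_LEN - 1)
    return p1[:point] + p2[point:]

def mutate(child):
    # Flip each bit with a small probability.
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in child]

# Create a population, then repeat selection, crossover, and mutation.
population = [random_individual() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [
        mutate(crossover(select(population), select(population)))
        for _ in range(POP_SIZE)
    ]

best = max(population, key=fitness)
print(fitness(best))  # typically at or near TARGET_LEN
```

Tournament selection and single-point crossover are just one common choice for each step; any selection and recombination scheme fits the same loop.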
This is a powerful method for optimization problems where the ideal path isn't clear.
The second approach, probabilistic modeling, centers on using probability theory to handle uncertainty. Instead of giving a definite "yes" or "no," the model gives the probability of an outcome.
The Philosophy: Model the world and its relationships using statistics and probability distributions.

How You'd Build It From Scratch:
For a model like a Naive Bayes Classifier, you would write code to:

- Calculate the prior probability of each class (e.g., P(Alert)).
- Calculate the conditional probability of each feature given a class (e.g., P(IP_Address | Alert)).
- Combine these probabilities using Bayes' Theorem to make a final prediction.
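As a sketch, those three steps might look like the following in Python. The training set, feature names (`ip`, `port`), and class labels are invented for illustration; add-one (Laplace) smoothing is included so an unseen feature value doesn't zero out the whole product:

```python
import math
from collections import Counter, defaultdict

# Hypothetical training set: (features, class label) pairs.
data = [
    ({"ip": "10.0.0.1", "port": "22"}, "Alert"),
    ({"ip": "10.0.0.1", "port": "80"}, "Alert"),
    ({"ip": "10.0.0.2", "port": "80"}, "Normal"),
    ({"ip": "10.0.0.3", "port": "443"}, "Normal"),
]

# Step 1: prior probability of each class, P(class).
class_counts = Counter(label for _, label in data)
priors = {c: n / len(data) for c, n in class_counts.items()}

# Step 2: conditional probability of each feature value given a class,
# P(feature=value | class), with add-one smoothing.
value_counts = defaultdict(Counter)   # (class, feature) -> Counter of values
feature_values = defaultdict(set)     # feature -> set of observed values
for features, label in data:
    for name, value in features.items():
        value_counts[(label, name)][value] += 1
        feature_values[name].add(value)

def likelihood(label, name, value):
    counts = value_counts[(label, name)]
    return (counts[value] + 1) / (sum(counts.values()) + len(feature_values[name]))

# Step 3: combine with Bayes' Theorem (log-space sums avoid underflow).
def predict(features):
    scores = {
        c: math.log(priors[c]) + sum(
            math.log(likelihood(c, name, value))
            for name, value in features.items()
        )
        for c in priors
    }
    return max(scores, key=scores.get)

print(predict({"ip": "10.0.0.1", "port": "22"}))  # Alert
```

Working in log space turns the product of probabilities into a sum, which is the standard way to keep many small factors from underflowing to zero.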
In short, the method you asked about is for building algorithms where the logic is already well-defined. Neural Networks and Evolutionary Algorithms are for creating systems that can learn or discover the logic on their own.