Optimization Methods
The key to finding the best solution to any problem.
Optimization is the practice of finding the best solution to a problem given a set of constraints. While that sounds simple, many problems are extremely complex and can't be solved in a straightforward fashion. If there are many parameters, or if the problem space is dynamic and changes over time, it's often infeasible to brute-force every possible solution. That's why there are many different optimization methods that can be employed to find good candidate solutions to a problem.
There are hundreds of different optimization techniques. Three popular categories of optimization are:
Differentiable Optimization: In cases where the rate of change can be calculated at a given point in the input space, the derivative can be used to find minima and maxima. Examples: Linear Solvers, Gradient Descent. (See the gradient descent sketch after this list.)
Stochastic Algorithms: Algorithms that sample the domain and use randomness to find and compare solutions. These algorithms may be naive and blind to the data, or use smart heuristics to home in on the best answer. Examples: Grid Search, Simulated Annealing, Bayesian Optimization. (See the simulated annealing sketch after this list.)
Population Algorithms: Algorithms that create a pool of candidate solutions and use it to search the problem space for the optimal solution. Like other stochastic algorithms they use randomness to explore more solutions, but with a large pool they can be more robust. Population algorithms may also use evolutionary techniques, combining fit solutions to produce better ones. Examples: Genetic Programming, Particle Swarm. (See the genetic algorithm sketch after this list.)
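To make the differentiable case concrete, here is a minimal gradient descent sketch in Python. The quadratic objective f(x) = (x - 3)², the learning rate, and the step count are illustrative assumptions rather than values from any particular tool.

```python
# Minimal gradient descent sketch on an illustrative objective f(x) = (x - 3)**2.

def f(x):
    return (x - 3) ** 2

def df(x):
    # Analytic derivative of f; in practice this might come from autodiff.
    return 2 * (x - 3)

x = 0.0              # starting guess
learning_rate = 0.1  # illustrative step size

for step in range(100):
    x -= learning_rate * df(x)  # move against the gradient

print(round(x, 4))  # converges towards the minimum at x = 3
```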
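For the stochastic category, below is a small simulated annealing sketch over the same illustrative objective. The temperature schedule, step size, and random seed are arbitrary choices made for demonstration.

```python
import math
import random

# Simulated annealing sketch: accept worse moves with a probability that
# shrinks as the temperature cools, to escape local minima early on.

def f(x):
    return (x - 3) ** 2

random.seed(0)
current = random.uniform(-10, 10)
best = current
temperature = 1.0

for step in range(1000):
    candidate = current + random.uniform(-1, 1)  # random neighbour
    delta = f(candidate) - f(current)
    # Always accept improvements; occasionally accept worse moves while hot.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
    if f(current) < f(best):
        best = current
    temperature *= 0.995  # cool down gradually

print(round(best, 3))  # close to the minimum at x = 3
```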
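And for the population category, a simple genetic-style algorithm that evolves a pool of candidate values. The population size, selection rule, crossover, and mutation scale are all assumptions chosen to keep the example short.

```python
import random

# Genetic algorithm sketch: keep the fittest half of the population each
# generation, then breed children via crossover (averaging) plus mutation.

def f(x):
    return (x - 3) ** 2

random.seed(0)
population = [random.uniform(-10, 10) for _ in range(20)]

for generation in range(50):
    # Rank candidates by fitness (lower f is better) and keep the top half.
    population.sort(key=f)
    parents = population[:10]
    children = []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        child = (a + b) / 2 + random.gauss(0, 0.5)  # crossover + mutation
        children.append(child)
    population = parents + children

best = min(population, key=f)
print(round(best, 3))  # close to the minimum at x = 3
```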
Many of these techniques may also be used for hyperparameter optimization.