Proximal Gradient Method
The Proximal Gradient Method is an optimization algorithm for composite convex problems of the form minimize f(x) + g(x), where f is smooth and g is non-smooth but has an inexpensive proximal operator. Each iteration takes a gradient step on the smooth part followed by a proximal step on the non-smooth part, x_{k+1} = prox_{t g}(x_k - t ∇f(x_k)), which makes the method particularly effective for sparsity-inducing regularizers such as the l1 penalty in Lasso regression, where the proximal step reduces to soft-thresholding.
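As a minimal sketch of the idea, the following Python code applies the method to the Lasso problem (1/2)||Ax - b||^2 + lam*||x||_1; the function names (soft_threshold, proximal_gradient_lasso), the fixed step size 1/L with L the Lipschitz constant of the smooth gradient, and the iteration count are illustrative choices, not part of the original text.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step=None, n_iters=500):
    """Minimize (1/2)||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    m, n = A.shape
    if step is None:
        # Step size 1/L, where L = ||A^T A||_2 is the Lipschitz constant of the gradient.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                          # gradient step on the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the non-smooth part
    return x

# Usage: recover a sparse coefficient vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

Because the l1 term is handled entirely by the closed-form soft-thresholding step, each iteration costs little more than a plain gradient step while still producing exactly sparse iterates.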