Nonlinear Programming (NLP): Subgradient Optimization
Subgradient optimization (the subgradient method) is an iterative algorithm for minimizing convex functions, used predominantly in nondifferentiable optimization for functions that are convex but not differentiable. It is often slower than Newton's method when applied to convex differentiable functions, but it can be applied to convex nondifferentiable functions where Newton's method will not converge. It was first developed by Naum Z. Shor in the Soviet Union in the 1960s.
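As an illustration of the method described above, here is a minimal sketch in Python, applied to the nondifferentiable convex function f(x) = |x|. The function names, starting point, and diminishing step rule 1/(k+1) are illustrative choices, not part of the deposited work; note that subgradient steps need not decrease the objective, so the best iterate seen so far is tracked separately.

```python
def f(x):
    # Nondifferentiable convex objective: absolute value.
    return abs(x)

def subgrad(x):
    # A subgradient of |x|: the sign of x away from zero;
    # at x = 0 any value in [-1, 1] is a valid subgradient.
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0

def subgradient_method(x0, iters=2000):
    x = x0
    best_x, best_f = x0, f(x0)
    for k in range(iters):
        step = 1.0 / (k + 1)           # diminishing step sizes: sum diverges,
        x = x - step * subgrad(x)      # squares are summable (classical choice)
        if f(x) < best_f:              # the iterates are not monotone in f,
            best_x, best_f = x, f(x)   # so keep the best point visited
    return best_x, best_f
```

For example, `subgradient_method(5.0)` returns a point with objective value close to the minimum at 0, though convergence is slow compared with gradient methods on smooth problems.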
- Last modified: 11/30/2018
Items
| Title | Date Uploaded | Visibility |
|---|---|---|
| 8_Subgradient_optimization_-_optimization.pdf | 2018-11-30 | Public |