Abstract: To improve efficiency on large-scale nonsmooth optimization problems and to avoid the large storage requirements and complex computations of existing algorithms, a modified HS conjugate gradient algorithm for nonsmooth optimization is proposed. A new search direction based on the classical HS conjugate gradient method is given, and the algorithm is designed by combining the Moreau-Yosida regularization technique with an Armijo-type line search. The search direction satisfies the sufficient descent condition and a trust region property. Under suitable conditions, global convergence of the new algorithm is proved. Preliminary numerical experiments show that the new algorithm is more efficient than the LMBM method on nonsmooth unconstrained optimization problems. The proposed algorithm is thus effective for solving nonsmooth optimization problems, combining good convergence properties with good numerical performance.
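For context, the Moreau-Yosida regularization mentioned above replaces the nonsmooth objective $f$ by a continuously differentiable surrogate; a minimal sketch in standard notation (the symbols $F$, $\lambda$, and $p(x)$ below are generic and not necessarily those used in the paper) is
\[
F(x) \;=\; \min_{y \in \mathbb{R}^n} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\}, \qquad \lambda > 0,
\]
with gradient $\nabla F(x) = \big(x - p(x)\big)/\lambda$, where $p(x)$ denotes the unique minimizer of the inner problem. The conjugate gradient iterations can then be applied to the smooth function $F$ rather than to $f$ directly.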