Abstract: Almost all sparse stochastic algorithms are developed in the online setting, where only a convergence rate for the averaged output can be obtained, and the optimal rate for strongly convex optimization problems cannot be attained either. In this paper, stochastic optimization algorithms are studied directly, rather than through online-to-batch conversion. First, by incorporating an L2 regularizer into the L1-regularized sparse optimization problem, strong convexity is obtained. Then, by introducing the random step-size strategy from black-box optimization methods into the state-of-the-art composite objective mirror descent (COMID) algorithm, a sparse stochastic optimization algorithm based on random step-size hybrid regularized mirror descent (RS-HMD) is obtained. Finally, based on an analysis of the soft-threshold method for solving the L1-regularized problem, the optimal individual convergence rate is proved. Experimental results demonstrate that the sparsity of RS-HMD is better than that of COMID.
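The abstract does not spell out the RS-HMD update itself; the following minimal Python sketch shows one plausible way the pieces fit together, assuming a Euclidean mirror map and an elastic-net-style hybrid regularizer lam1*||w||_1 + (lam2/2)*||w||^2. The function name rs_hmd, the uniformly drawn random index k, and the step-size schedule eta = c/(lam2*k) are illustrative assumptions, not the paper's exact rule. What the sketch does illustrate faithfully is the soft-threshold structure the abstract relies on: the COMID-style proximal step for the hybrid regularizer has a closed form, a soft threshold followed by a multiplicative shrinkage, which is what makes each individual iterate sparse.

import numpy as np

def soft_threshold(z, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def rs_hmd(grad_oracle, dim, T, lam1=1e-3, lam2=1e-3, c=1.0, rng=None):
    # Illustrative sketch only: the randomized step-size rule below is an
    # assumption, not the scheme proved optimal in the paper.
    rng = np.random.default_rng() if rng is None else rng
    w = np.zeros(dim)
    for t in range(1, T + 1):
        g = grad_oracle(w)            # stochastic gradient of the loss part
        k = rng.integers(1, t + 1)    # assumed: uniform random step-size index
        eta = c / (lam2 * k)          # O(1/t)-type schedule for strong convexity
        z = w - eta * g               # gradient step on the loss
        # Closed-form prox of lam1*||w||_1 + (lam2/2)*||w||^2:
        # soft threshold, then shrink by 1/(1 + eta*lam2).
        w = soft_threshold(z, eta * lam1) / (1.0 + eta * lam2)
    return w                          # last (individual) iterate, not an average

As a usage example, grad_oracle could return the gradient of a single randomly sampled least-squares term. Because the proximal step is applied at every iteration and the last iterate is returned, each individual output is sparse, in contrast to averaged outputs whose coordinates are rarely exactly zero.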