Unlike conventional neural network theories and implementations, Huang et al. [Universal approximation using incremental constructive feedforward networks with random hidden n...
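The excerpt is truncated, but the random-hidden-node idea it points to can be sketched briefly. The NumPy snippet below is written in the spirit of an extreme learning machine rather than as a reproduction of Huang et al.'s incremental construction: the hidden-node weights and biases are drawn at random and only the linear output layer is fitted, by least squares. The function names, the sigmoid activation, and the toy regression task are illustrative choices, not taken from the paper.

```python
import numpy as np

def random_hidden_layer(X, n_hidden, rng):
    """Map inputs through a hidden layer whose weights and biases are drawn at random."""
    n_features = X.shape[1]
    W = rng.normal(size=(n_features, n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random biases (never trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden activations
    return H

def fit_output_weights(H, y):
    """Fit only the linear output layer, by least squares on the hidden activations."""
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return beta

# Toy usage: regress a noisy sine curve with 50 random hidden nodes.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

H = random_hidden_layer(X, n_hidden=50, rng=rng)
beta = fit_output_weights(H, y)
print("training MSE:", np.mean((y - H @ beta) ** 2))
```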
We extend the well-known BFGS quasi-Newton method and its memory-limited variant LBFGS to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by ge...
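The abstract is cut off above, so the sketch below is not the rigorous generalization it refers to; it is only a simplified illustration of the ingredients: the standard LBFGS two-loop recursion driven by subgradients of a nonsmooth convex objective (L1-regularized least squares), with a small fixed step size standing in for a proper line search. All function names and parameter values are illustrative.

```python
import numpy as np
from collections import deque

def lbfgs_direction(g, s_hist, y_hist):
    """Standard LBFGS two-loop recursion: returns the quasi-Newton direction -B^{-1} g,
    where B is the implicit Hessian approximation built from the curvature pairs (s, y)."""
    q = g.copy()
    cache = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):   # newest pair first
        rho = 1.0 / (y @ s)
        alpha = rho * (s @ q)
        q -= alpha * y
        cache.append((alpha, rho, s, y))
    if s_hist:                                             # scale by the most recent pair
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)
    for alpha, rho, s, y in reversed(cache):               # oldest pair first
        beta = rho * (y @ q)
        q += (alpha - beta) * s
    return -q

def subgradient(w, A, b, lam):
    """One subgradient of f(w) = 0.5 * ||A w - b||^2 + lam * ||w||_1."""
    return A.T @ (A @ w - b) + lam * np.sign(w)

# Simplified nonsmooth quasi-Newton loop: subgradients drive the two-loop
# recursion, and a small fixed step replaces a proper (Wolfe-type) line search.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 20))
b = rng.normal(size=100)
lam, memory, step = 0.1, 10, 1e-3

w = np.zeros(20)
s_hist, y_hist = deque(maxlen=memory), deque(maxlen=memory)
g = subgradient(w, A, b, lam)
for _ in range(200):
    d = lbfgs_direction(g, list(s_hist), list(y_hist))
    w_new = w + step * d
    g_new = subgradient(w_new, A, b, lam)
    s, y = w_new - w, g_new - g
    if s @ y > 1e-10:                 # keep only pairs with positive curvature
        s_hist.append(s)
        y_hist.append(y)
    w, g = w_new, g_new

print("final objective:", 0.5 * np.sum((A @ w - b) ** 2) + lam * np.sum(np.abs(w)))
```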
We discuss the use in machine learning of a general type of convex optimisation problem known as semi-definite programming (SDP) [1]. We intend to argue that SDPs arise quite n...
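To make the object concrete, the sketch below formulates one standard SDP, the semi-definite relaxation of max-cut on a 5-node cycle, using the cvxpy modelling library: a symmetric matrix variable is constrained to the positive semi-definite cone with unit diagonal, and a linear function of it is maximised. The choice of example and the cvxpy calls are illustrative additions here, not drawn from reference [1].

```python
import numpy as np
import cvxpy as cp

# Adjacency matrix of a small graph (a 5-cycle).
n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

# Semi-definite relaxation of max-cut:
#   maximise  (1/4) * sum_{ij} A_ij * (1 - X_ij)
#   subject to diag(X) = 1,  X positive semi-definite.
X = cp.Variable((n, n), PSD=True)            # matrix variable constrained to the PSD cone
objective = cp.Maximize(0.25 * (A.sum() - cp.sum(cp.multiply(A, X))))
constraints = [cp.diag(X) == 1]              # unit diagonal
problem = cp.Problem(objective, constraints)
problem.solve()

print("SDP upper bound on the max-cut value:", problem.value)
print("relaxed Gram matrix X:\n", np.round(X.value, 3))
```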