Many point estimation problems in robotics, computer vision and machine learning are formulated as instances of the general problem of minimizing a sparse nonlinear sum-of-squares objective function. For inference problems of this type, each input datum gives rise to a summand in the objective, and therefore performing online inference corresponds to solving a sequence of sparse nonlinear least-squares problems in which additional summands are added to the objective over time. In this paper we present Robust Incremental least-Squares Estimation (RISE), an incrementalized version of the Powell's Dog-Leg numerical optimization method that is suitable for use in online sequential sparse least-squares minimization. As a trust-region method, RISE is naturally robust to nonlinearity and numerical ill-conditioning, and is provably globally convergent for a broad class of loss functions (twice-continuously differentiable functions with bounded sublevel sets). Consequently, RISE maintains the speed of current state-of-the-art online sparse least-squares methods while providing superior reliability.
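To make the underlying trust-region update concrete, the following is a minimal sketch of a single batch Powell's Dog-Leg step for an objective of the form (1/2)||r(x)||^2; this is an illustrative helper (the function name `dogleg_step` and the dense linear algebra are assumptions for exposition), not the paper's incrementalized RISE algorithm, which additionally exploits sparsity and reuses computation across updates.

```python
import numpy as np

def dogleg_step(J, r, Delta):
    """One Powell's Dog-Leg step for minimizing 0.5 * ||r(x)||^2,
    given the Jacobian J and residual r at the current iterate.
    Illustrative dense sketch only, not the paper's RISE method."""
    g = J.T @ r                              # gradient of 0.5 * ||r||^2
    # Gauss-Newton step (assumes J^T J is invertible; sparse solvers
    # would be used in practice for large problems)
    h_gn = np.linalg.solve(J.T @ J, -g)
    if np.linalg.norm(h_gn) <= Delta:
        return h_gn                          # full GN step lies inside the trust region
    Jg = J @ g
    alpha = (g @ g) / (Jg @ Jg)              # step length to the Cauchy point
    h_sd = -alpha * g                        # steepest-descent (Cauchy) step
    nsd = np.linalg.norm(h_sd)
    if nsd >= Delta:
        return (Delta / nsd) * h_sd          # truncated steepest-descent step
    # Otherwise interpolate along the dog-leg path h_sd + beta * (h_gn - h_sd),
    # choosing beta in (0, 1) so the step lands on the trust-region boundary.
    d = h_gn - h_sd
    a = d @ d
    b = 2.0 * (h_sd @ d)
    c = nsd ** 2 - Delta ** 2
    beta = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return h_sd + beta * d
```

The three branches correspond to the standard dog-leg cases: accept the Gauss-Newton step when it fits, fall back to a scaled gradient step when even the Cauchy point exceeds the trust-region radius Delta, and otherwise interpolate between the two; Delta is then grown or shrunk based on how well the quadratic model predicted the actual cost decrease.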