
Unified High-Probability Analysis of Stochastic Variance-Reduced Estimation

This work establishes a unified theoretical framework for analyzing the convergence of stochastic variance-reduced optimization methods, providing tight high-probability bounds that hold across multiple algorithmic variants.
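The abstract does not spell out which estimators the framework covers; as one concrete member of that family, the sketch below implements the classical SVRG-style control-variate gradient estimator on illustrative least-squares components. The quadratic losses, step size, and epoch schedule are assumptions for demonstration, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))        # illustrative least-squares data
b = rng.normal(size=n)

def grad_i(x, i):
    # gradient of the i-th component f_i(x) = 0.5 * (A_i x - b_i)^2
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
for epoch in range(20):
    x_snap = x.copy()              # snapshot point
    mu = full_grad(x_snap)         # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # control-variate estimator: unbiased for full_grad(x), and its
        # variance shrinks as x and x_snap approach the minimizer
        g = grad_i(x, i) - grad_i(x_snap, i) + mu
        x -= 0.05 * g
```

High-probability analyses of such estimators typically control the deviation of the accumulated noise terms via martingale concentration rather than in-expectation arguments.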

First-Order Softmax-Weighted Switching Gradient Method for Distributed Stochastic Minimax Optimization with Stochastic Constraints

This research introduces a novel Softmax-Weighted Switching Gradient method for distributed stochastic minimax optimization with stochastic constraints in federated learning environments. Using a single-loop, primal-only switching mechanism, the approach offers a stable way to optimize worst-case client performance without maintaining dual variables. The work establishes convergence guarantees under both full and partial client participation while relaxing standard boundedness assumptions. The analysis culminates in a unified error decomposition that yields a sharp high-probability convergence guarantee with only logarithmic dependence on the confidence level.
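The abstract does not define the update rule, so the following is a hypothetical sketch of the two ingredients it names: a softmax weighting over client losses that emphasizes the worst-off clients (a smooth surrogate for the minimax objective), and a primal-only switch that takes a constraint-reduction step whenever the constraint estimate exceeds a tolerance. The quadratic client losses, the norm constraint, and the parameters eta, tol, and tau are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(vals, tau):
    z = np.asarray(vals) / tau
    z -= z.max()                   # numerical stability
    w = np.exp(z)
    return w / w.sum()

# Hypothetical setup: quadratic client losses and the norm constraint
# g(x) = ||x||^2 - 1 <= 0 stand in for the paper's stochastic objectives.
m, d = 4, 3
centers = rng.normal(size=(m, d))

def client_loss_grad(x, i):
    diff = x - centers[i]
    return 0.5 * diff @ diff, diff

def constraint_val_grad(x):
    return x @ x - 1.0, 2.0 * x

x = rng.normal(size=d)
eta, tol, tau = 0.05, 1e-3, 0.5
for _ in range(500):
    g_val, g_grad = constraint_val_grad(x)
    if g_val > tol:
        x -= eta * g_grad          # switch: reduce constraint violation
    else:
        losses, grads = zip(*(client_loss_grad(x, i) for i in range(m)))
        w = softmax(losses, tau)   # weights emphasize worst-off clients
        x -= eta * (w @ np.array(grads))   # softmax-weighted objective step
```

The switching rule keeps the iteration single-loop and primal-only: no dual variable is updated, which is the stability advantage the abstract highlights over primal-dual schemes.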

Structural Properties, Cycloid Trajectories and Non-Asymptotic Guarantees of EM Algorithm for Mixed Linear Regression

We derive explicit EM updates for two-component mixed linear regression (2MLR) across all signal-to-noise-ratio (SNR) regimes and characterize their structural properties. In the noiseless setting the updates follow a cycloid trajectory, and in finite high-SNR regimes we bound the deviation from this trajectory. This trajectory-based analysis reveals the population-level convergence orders: linear when the iterate is nearly orthogonal to the ground-truth regression vector, and quadratic when the angle between them is small. Our framework provides non-asymptotic guarantees by tightening the statistical error bounds between finite-sample and population updates, linking statistical accuracy to the sub-optimality angle and establishing finite-sample convergence from arbitrary initialization.
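As a concrete reference point, here is the textbook finite-sample EM recursion for the symmetric 2MLR model, assuming the standard data model y = s * <beta*, x> + noise with a latent sign s in {-1, +1}: tanh responsibilities in the E-step and a closed-form weighted least-squares M-step. The noise level, dimensions, and iteration count are illustrative; the paper's explicit updates and trajectory analysis concern the population counterpart of this recursion.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, sigma = 2000, 2, 0.1
beta_star = np.array([1.0, 0.0])   # ground-truth regression vector
X = rng.normal(size=(n, d))
s = rng.choice([-1.0, 1.0], size=n)            # latent component labels
y = s * (X @ beta_star) + sigma * rng.normal(size=n)

beta = rng.normal(size=d)                      # arbitrary initialization
for _ in range(25):
    # E-step: tanh responsibilities of the symmetric two-component model
    w = np.tanh((X @ beta) * y / sigma**2)
    # M-step: closed-form weighted least squares
    beta = np.linalg.solve(X.T @ X, X.T @ (w * y))

# sub-optimality angle between the final iterate and the ground truth
cos = abs(beta @ beta_star) / (np.linalg.norm(beta) * np.linalg.norm(beta_star))
print(f"angle to ground truth: {np.arccos(min(1.0, cos)):.2e} rad")
```

The angle printed at the end is the sub-optimality measure the abstract refers to: the analysis tracks how this angle contracts along the (cycloid-like) trajectory of the iterates, up to the sign ambiguity inherent in the symmetric model.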