Abstract: Differentially Private Stochastic Gradient Descent (DP-SGD) is a widely adopted algorithm for privately training machine learning models. An inherent feature of this algorithm is the ...
Abstract: In this paper, a distributed online sequential zero-gradient-sum algorithm is proposed, which is based on the discrete-time zero-gradient-sum algorithm for directed networks. It also ...