Input: Machine learning algorithm L; client set C = {C_1, C_2, …, C_n}; local datasets D = {D_1, D_2, …, D_n}; server set S = {S_1, S_2, …, S_m}; collusion threshold t′ (2 ≤ t′ < m), the maximum number of colluding server nodes that can be tolerated; minimum number of clients K whose parameters must be collected per round; total number of training rounds R |
Output: Trained model M |
01: | S_1 initializes the global model M |
02: | for r ← 1 to R |
03: | for C_i ∈ C do |
04: | if r = 1 |
05: | Download the initial global model M from S_1 |
06: | else |
07: | Download the shares M_{*,j}^{r-1} from S_j, j ∈ {1, 2, …, m} |
08: | M^{r-1} ← SecRec(M_{*,1}^{r-1}, ⋯, M_{*,m}^{r-1}) |
09: | end-if |
10: | local training M_i^r ← L(M^{r-1}, D_i) |
11: | clipping weights M_i^r ← M_i^r / max(1, ‖M_i^r‖ / C) |
12: | adding noise M_i^r ← M_i^r + noise(ε, δ, C, K) |
13: | t ← t′ + 1 |
14: | computing shares (M_{i,1}^r, ⋯, M_{i,m}^r) ← SecShr(M_i^r, m, t) |
15: | send M_{i,j}^r to S_j, j ∈ {1, 2, …, m} |
16: | end-for |
17: | for S_j ∈ S do |
18: | wait until shares from at least K clients have been collected |
19: | aggregate share M_{*,j}^r ← (M_{i_1,j}^r + ⋯ + M_{i_K,j}^r) / K |
20: | end-for |
21: | end-for |
22: | download the shares M_{*,j}^R, j ∈ {1, 2, …, m}, and recover M^R via SecRec |
23: | output M^R |
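Steps 11 and 12 (clipping and perturbing the local update) can be sketched as below. The excerpt does not specify the noise(ε, δ, C, K) mechanism, so this sketch assumes the standard Gaussian mechanism, and the assumption that averaging over K clients reduces the per-client sensitivity to C/K; the function name `clip_and_noise` is illustrative, not from the source.

```python
import math
import random

def clip_and_noise(weights, C, eps, delta, K):
    """Clip a weight vector to L2 norm C, then add Gaussian noise.

    Sketch of steps 11-12; the paper's actual noise calibration may differ.
    """
    norm = math.sqrt(sum(w * w for w in weights))
    scale = 1.0 / max(1.0, norm / C)           # step 11: w / max(1, ||w||/C)
    clipped = [w * scale for w in weights]
    # Gaussian mechanism: sigma calibrated to (eps, delta) with sensitivity
    # C/K (assumption about how K enters the bound after averaging).
    sigma = (C / K) * math.sqrt(2 * math.log(1.25 / delta)) / eps
    return [w + random.gauss(0, sigma) for w in clipped]
```

With a vector of norm 5 and C = 1, the clipped part has norm exactly 1 before noise is added; vectors already inside the norm ball pass through unscaled.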
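SecShr/SecRec (steps 14, 08, and 22) are not defined in this excerpt; a natural instantiation is (t, m) Shamir secret sharing over a prime field, sketched below. The modulus P and the encoding of weights as field elements are assumptions for illustration.

```python
import random

P = 2**61 - 1  # prime modulus (assumption; weights would be fixed-point encoded)

def sec_shr(secret, m, t):
    """Split `secret` into m shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for j in range(1, m + 1):          # evaluate the polynomial at x = 1..m
        y = 0
        for c in reversed(coeffs):     # Horner's rule
            y = (y * j + c) % P
        shares.append((j, y))
    return shares

def sec_rec(shares):
    """Lagrange interpolation at x = 0 recovers the secret from >= t shares."""
    secret = 0
    for j, yj in shares:
        num, den = 1, 1
        for k, _ in shares:
            if k != j:
                num = (num * -k) % P
                den = (den * (j - k)) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret
```

The scheme is additively homomorphic: each server S_j can sum (and, via a modular inverse of K, average) the shares it holds, as in step 19, and the reconstruction of the aggregated shares equals the aggregate of the plaintext updates, so no single server, nor any coalition of up to t′ = t − 1 servers, sees an individual client's weights.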