Asynchronous client updates for vertically federated Bayesian inference

Develop asynchronous client update algorithms for the augmented-variable and power-likelihood vertically federated Bayesian models trained with structured or mean-field variational inference, enabling clients in the proposed framework to submit updates without strict synchronization.

Background

The paper introduces the first Bayesian framework for vertical federated learning (VFL), using asymptotically exact data augmentation to create conditional independence across clients and adapting structured federated variational inference to enable distributed inference. The algorithms presented operate synchronously: the server coordinates gradient information and updates, which can become a bottleneck as the number of clients grows.

The authors explicitly identify asynchronous client updates as an open question in distributed Bayesian inference and highlight them as a promising avenue for enhancing the practical utility of their VFL framework. Asynchronous updates would allow clients to update local parameters without strict coordination, potentially improving scalability and robustness and reducing latency in real-world deployments.
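The synchronous-to-asynchronous shift described above can be illustrated with a toy parameter-server sketch, in which clients push parameter updates whenever they finish local work instead of waiting at a coordination barrier. This is not the paper's algorithm; `AsyncServer`, `client_loop`, and the simple gradient-like deltas are all illustrative assumptions standing in for the variational updates the task would need to design.

```python
import threading

class AsyncServer:
    """Toy parameter server for asynchronous client updates.

    Clients push parameter deltas whenever they finish local work;
    there is no synchronization barrier across clients. All names
    here are illustrative sketches, not the paper's method.
    """
    def __init__(self, dim, step_size=0.1):
        self.params = [0.0] * dim     # shared global parameters
        self.step_size = step_size
        self.lock = threading.Lock()  # protects the shared state only
        self.num_updates = 0

    def push(self, delta):
        # Apply a client's (possibly stale) update as soon as it arrives.
        with self.lock:
            self.params = [p + self.step_size * d
                           for p, d in zip(self.params, delta)]
            self.num_updates += 1

    def pull(self):
        # Snapshot of the current global parameters.
        with self.lock:
            return list(self.params)

def client_loop(server, target, n_steps):
    # Stand-in for a client's local update: pull the current global
    # state, compute a gradient-like delta toward a client-specific
    # optimum, and push it back without waiting for any other client.
    for _ in range(n_steps):
        params = server.pull()
        delta = [t - p for t, p in zip(target, params)]
        server.push(delta)

server = AsyncServer(dim=2)
# Two clients with different local optima; their updates interleave
# freely, so the server sees them in no fixed order.
threads = [threading.Thread(target=client_loop, args=(server, t, 50))
           for t in ([1.0, 1.0], [3.0, 3.0])]
for th in threads:
    th.start()
for th in threads:
    th.join()
```

The lock guards only the brief read-modify-write on the shared state; clients never wait for one another between rounds, which is the property the task asks to bring to the variational updates, where staleness of pulled parameters would then need to be analyzed.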

References

Asynchronous client updates, an open question in most distributed Bayesian inference settings \parencite{winter2024emerging}, represent another avenue to elevate the framework's practical utility.

Scalable Vertical Federated Learning via Data Augmentation and Amortized Inference  (2405.04043 - Hassan et al., 2024) in Section 6 (Discussion)