I. Introduction
In recent years, intelligent services based on artificial intelligence (AI) in space have developed rapidly, such as real-time disaster navigation and global pandemic spread detection [1]. Machine learning (ML) is an essential tool for realizing these applications. The traditional approach to ML is to gather all data in a central location and then solve the learning problem. However, applying ML in space faces several obstacles. For privately owned small satellite stations, sharing data may be prohibited by privacy or data-ownership concerns. For large satellite stations, the vast amounts of data required to train deep neural networks inevitably incur high communication overhead.

As a distributed training paradigm, federated learning (FL) [2] can resolve this dilemma. A low earth orbit (LEO) satellite station can act as a server and collaboratively train an AI model with multiple ground devices (i.e., clients) distributed across different areas. The ground devices do not upload their local data to the satellite station for centralized training; instead, they train the model locally and upload only model parameters or gradients to the satellite station for aggregation. Because raw data never leave the devices, FL greatly reduces communication overhead while providing a degree of privacy protection.
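As a concrete illustration of the aggregation step, one common choice is a federated-averaging style rule: if ground device $k \in \{1,\dots,K\}$ holds $n_k$ local samples and returns updated parameters $w_k^{t+1}$ in communication round $t$ (notation introduced here only for illustration), the satellite station forms the new global model as
\[
w^{t+1} \;=\; \sum_{k=1}^{K} \frac{n_k}{\sum_{j=1}^{K} n_j}\, w_k^{t+1},
\]
so that only parameter vectors, rather than raw training data, traverse the ground-to-satellite link in each round.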