Jin Xie, San-Yang Liu, Jia-Xi Chen. A Framework for Distributed Semi-supervised Learning Using Single-layer Feedforward Networks[J]. Machine Intelligence Research, 2022, 19(1): 63-74. DOI: 10.1007/s11633-022-1315-6

A Framework for Distributed Semi-supervised Learning Using Single-layer Feedforward Networks

This paper proposes a framework for manifold regularization (MR) based distributed semi-supervised learning (DSSL) using single-layer feedforward neural networks (SLFNNs). The proposed framework, denoted DSSL-SLFNN, combines the SLFNN, the MR framework, and a distributed optimization strategy, and a series of algorithms are derived from it to solve DSSL problems. In DSSL problems, data consisting of labeled and unlabeled samples are distributed over a communication network, where each node has access only to its own data and can communicate only with its neighbors; in some scenarios, such problems cannot be solved by centralized algorithms. Under the DSSL-SLFNN framework, the nodes first exchange the initial parameters of SLFNNs that share the same basis functions for semi-supervised learning (SSL). All nodes then compute the globally optimal coefficients of the SLFNN using their distributed datasets and local updates. During the learning process, each node exchanges only local coefficients with its neighbors, never raw data, so DSSL-SLFNN based algorithms work in a fully distributed fashion and preserve privacy. Finally, several simulations demonstrate the efficiency of the proposed framework and the derived algorithms.
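The communication pattern the abstract describes, nodes holding SLFNNs with shared basis functions and exchanging only coefficient vectors with their neighbors, can be illustrated with a minimal sketch. The code below is a hypothetical toy, not the paper's DSSL-SLFNN algorithm: each node fits the output weights of a shared-basis SLFNN on its own local data, then the nodes run plain average consensus with Metropolis weights over a ring graph until they agree on a common coefficient vector. The network size, basis, target function, and regularization constant are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's exact algorithm): all nodes share
# the same random hidden-layer basis, so only the output coefficients
# need to be communicated -- never the raw training samples.
n_nodes, n_hidden = 4, 20
W = rng.normal(size=n_hidden)  # shared input weights (scalar inputs)
b = rng.normal(size=n_hidden)  # shared hidden biases

def hidden(x):
    """Shared hidden-layer features tanh(w_k * x + b_k) for inputs x."""
    return np.tanh(np.outer(x, W) + b)

# Ring communication graph: node i can talk only to i-1 and i+1.
neighbors = {i: [(i - 1) % n_nodes, (i + 1) % n_nodes]
             for i in range(n_nodes)}

# Each node sees a different slice of the target y = sin(x) and solves a
# local ridge-regularized least-squares problem for its output weights.
beta = []
for i in range(n_nodes):
    x = rng.uniform(i, i + 1, size=30)
    H = hidden(x)
    beta.append(np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden),
                                H.T @ np.sin(x)))
beta = np.array(beta)

# Average consensus with Metropolis weights: each node repeatedly moves
# its coefficient vector toward those of its neighbors.
for _ in range(200):
    new = beta.copy()
    for i in range(n_nodes):
        for j in neighbors[i]:
            w = 1.0 / (1 + max(len(neighbors[i]), len(neighbors[j])))
            new[i] += w * (beta[j] - beta[i])
    beta = new

# After consensus, all nodes hold (numerically) the same coefficients.
spread = np.max(np.abs(beta - beta.mean(axis=0)))
print(f"max disagreement across nodes: {spread:.2e}")
```

Note that averaging locally fitted coefficients is only a stand-in for the distributed optimization strategy the paper derives; it reproduces the privacy-preserving message pattern (coefficients between neighbors, no raw data) rather than the MR-based objective itself.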
