MURAL - Maynooth University Research Archive Library



    On Analog Distributed Approximate Newton with Determinantal Averaging


    Sharma, Ganesh and Dey, Subhrakanti (2022) On Analog Distributed Approximate Newton with Determinantal Averaging. 2022 IEEE 33rd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC). pp. 1-7.

    Abstract

    This paper considers the problem of communication- and computation-efficient distributed learning over a wireless fading Multiple Access Channel (MAC). The distributed learning task is performed over a large network of nodes holding local data, with the help of an edge server coordinating between the nodes. The information from each distributed node is transmitted as an analog signal through a noisy fading wireless MAC, using a common shaping waveform. The edge server receives a superposition of the analog signals, computes a new parameter estimate, and communicates it back to the nodes, a process that continues until an appropriate convergence criterion is met. Unlike typical federated learning approaches based on communicating local gradients and averaging at the edge server, in this paper we investigate a scenario where the local nodes implement a second-order optimization technique known as Determinantal Averaging. The per-node communication complexity of this method at each iteration is the same as that of any gradient-based method, i.e., O(d), where d is the number of parameters. To reduce the computational load at each node, we also employ an approximate Newton method to compute the local Hessians. Under the usual assumptions of convexity and twice differentiability of the local objective functions, we propose an algorithm titled Distributed Approximate Newton with Determinantal Averaging (DANDA). State-of-the-art first- and second-order distributed optimization algorithms are numerically compared with DANDA on a standard dataset with least-squares local objective functions (linear regression). Simulation results illustrate that DANDA not only displays faster convergence than gradient-based methods, but also compares favourably with exact distributed Newton methods such as LocalNewton.
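    The abstract only outlines the DANDA iteration; the paper carries the details. As a rough orientation, the NumPy sketch below shows one plausible reading of a single round: each node computes a sub-sampled (approximate) local Hessian and a local gradient, transmits a determinant-weighted local Newton direction plus the determinant itself, and the edge server receives the noisy superposition of these analog signals. All names, the sub-sampling size, the use of per-node gradients, and the AWGN stand-in for the fading MAC are illustrative assumptions, not the paper's exact protocol.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative setup: linear regression data split across nodes
    # (matching the least-squares local objectives used in the paper's
    # experiments; all sizes here are arbitrary).
    n_nodes, n_per_node, d = 10, 200, 5
    w_true = rng.normal(size=d)
    X = [rng.normal(size=(n_per_node, d)) for _ in range(n_nodes)]
    y = [Xi @ w_true + 0.1 * rng.normal(size=n_per_node) for Xi in X]
    lam = 1e-2                      # ridge regularisation (assumed)

    def local_grad(w, Xi, yi):
        """Gradient of the local regularised least-squares objective."""
        return Xi.T @ (Xi @ w - yi) / len(yi) + lam * np.eye(d) @ w

    def local_hessian(Xi, s):
        """Sub-sampled local Hessian: an approximate Newton ingredient,
        built from s sampled rows instead of the full local dataset."""
        idx = rng.choice(len(Xi), size=s, replace=False)
        Xs = Xi[idx]
        return Xs.T @ Xs / s + lam * np.eye(d)

    w = np.zeros(d)
    noise_std = 1e-3                # assumed receiver noise level
    for it in range(20):
        msgs = []
        for Xi, yi in zip(X, y):
            H = local_hessian(Xi, s=50)
            g = local_grad(w, Xi, yi)
            det = np.linalg.det(H)  # determinantal weight
            # Each node transmits det(H) * H^{-1} g (d real values) and
            # det(H) (one scalar), so per-node communication stays O(d).
            msgs.append((det * np.linalg.solve(H, g), det))
        # Superposition over the MAC: the server only sees the sums,
        # corrupted here by additive Gaussian noise (a toy stand-in for
        # the fading/noise model analysed in the paper).
        num = sum(m[0] for m in msgs) + noise_std * rng.normal(size=d)
        den = sum(m[1] for m in msgs) + noise_std * rng.normal()
        w = w - num / den           # determinantal-averaged Newton step

    print("parameter error:", np.linalg.norm(w - w_true))

    The determinant weights are what distinguish this from plain averaging of local Newton directions: weighting each inverse-Hessian term by det(H_i) is the debiasing idea behind determinantal averaging, and it fits the MAC setting because both the numerator and denominator are plain sums that the channel superposition computes for free.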
    Item Type: Article
    Keywords: Distributed Learning; DL; Analog Transmission; Fading Multiple Access Channel; MAC; Approximate Newton methods
    Academic Unit: Faculty of Science and Engineering > Electronic Engineering
    Item ID: 20587
    Identification Number: 10.1109/PIMRC54779.2022.9977466
    Depositing User: Subhrakanti Dey
    Date Deposited: 22 Sep 2025 11:05
    Journal or Publication Title: 2022 IEEE 33rd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC)
    Publisher: IEEE
    Refereed: Yes
    URI: https://mural.maynoothuniversity.ie/id/eprint/20587
    Use Licence: This item is available under a Creative Commons Attribution-NonCommercial-ShareAlike licence (CC BY-NC-SA).
