Sparse artificial neural networks using a novel smoothed LASSO penalization
Published in: IEEE Transactions on Circuits and Systems II: Express Briefs
Year: 2019
Volume: 66
Issue: 5
Pages: 848–852
Abstract
Artificial neural networks (ANNs) are increasingly being used for a variety of machine learning problems. However, the increased density of interconnections in ANNs leads to high computational and power requirements. One way to reduce the power is to reduce the number of interconnections, which can be achieved using least absolute shrinkage and selection operator (LASSO) techniques. In this brief, we propose an alternative smoothing function for LASSO regularization and an incremental pruning algorithm for feedforward ANNs, with the aim of achieving maximally sparse networks with minimal performance degradation. We compare the results obtained using the proposed smoothing function with those of existing smoothing functions. Further, we also evaluate the performance of the proposed incremental pruning algorithm.
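The paper's specific smoothing function and pruning schedule are not given in this record. As a rough illustration of the general idea, the sketch below uses a common smooth surrogate for the L1 (LASSO) penalty, sqrt(w^2 + eps), added to the training loss, followed by simple magnitude-based zeroing of near-zero weights; all names, thresholds, and the model architecture are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: smoothed-L1 regularization plus magnitude pruning.
# The surrogate sqrt(w^2 + eps) and the threshold below are assumptions, not
# the smoothing function or incremental schedule proposed in the paper.
import torch
import torch.nn as nn

def smoothed_l1_penalty(model: nn.Module, eps: float = 1e-4) -> torch.Tensor:
    """Differentiable surrogate for the L1 (LASSO) penalty on all weights."""
    penalty = torch.zeros(())
    for p in model.parameters():
        penalty = penalty + torch.sqrt(p ** 2 + eps).sum()
    return penalty

def prune_small_weights(model: nn.Module, threshold: float = 1e-3) -> None:
    """Zero out interconnections whose magnitude has been driven near zero."""
    with torch.no_grad():
        for p in model.parameters():
            p.mul_((p.abs() > threshold).float())

# One training step on a small feedforward network (lambda trades accuracy
# for sparsity); a real run would repeat this and prune incrementally.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = criterion(model(x), y) + 1e-4 * smoothed_l1_penalty(model)
optimizer.zero_grad()
loss.backward()
optimizer.step()
prune_small_weights(model)  # remove connections pushed toward zero by the penalty
```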
About the journal
Journal: IEEE Transactions on Circuits and Systems II: Express Briefs
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISSN: 1549-7747
Open Access: No
Concepts (14)
  •  Electric power system interconnection
  •  Functions
  •  Learning algorithms
  •  Neural networks
  •  Support vector machines
  •  Incremental pruning algorithms
  •  LASSO regularization
  •  Least absolute shrinkage and selection operators
  •  Machine learning problem
  •  Performance degradation
  •  Power requirement
  •  Smoothing function
  •  Sparsity
  •  Regression analysis