Artificial neural networks (ANNs) are increasingly being used for a variety of machine learning problems. However, the increasing density of interconnections in ANNs leads to high computational and power requirements. One way to reduce power consumption is to reduce the number of interconnections, which can be achieved using least absolute shrinkage and selection operator (LASSO) techniques. In this brief, we propose an alternative smoothing function for LASSO regularization and an incremental pruning algorithm for feedforward ANNs, with the aim of achieving maximally sparse networks with minimal performance degradation. We compare the results obtained using the proposed smoothing function with those obtained using existing smoothing functions, and we evaluate the performance of the proposed incremental pruning algorithm.
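For context, the sketch below shows the standard LASSO-regularized training objective and a generic smoothed surrogate for the non-differentiable $|w|$ term. The specific smoothing function proposed in this brief is not stated in the abstract, so the surrogate shown here (with smoothing parameter $\beta$) is only an illustrative, commonly used form, not the proposed one.

```latex
% Illustrative only: LASSO-regularized training loss and a generic
% smooth surrogate for |w|. The smoothing function proposed in the
% brief itself is not specified in this abstract.
\begin{align}
  E(\mathbf{w}) &= L(\mathbf{w}) + \lambda \sum_{i} |w_i|
    && \text{(LASSO / $\ell_1$ penalty)} \\
  |w_i| &\approx \sqrt{w_i^2 + \beta}, \qquad \beta > 0
    && \text{(a common smooth approximation)}
\end{align}
```

Smoothing of this kind makes the penalty differentiable everywhere, so standard gradient-based training applies, while small $\beta$ keeps the surrogate close to the sparsity-inducing $\ell_1$ penalty.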