We propose a logic gate leakage model based on transistor stacks that incorporates local (per-transistor) and global process-variation parameters together with supply voltage and temperature. The stack models capture both subthreshold and gate leakage and account for the input vector state. We examine cells from an industrial standard cell library and find that most cells can be modeled with simple stacks, i.e., linear chains of transistors. However, gates such as XOR, majority, and multiplexers require complex stacks, and we show how these can be modeled. Our experiments show that only 18 distinct stack models are needed to predict the leakage of all gates in this industrial library. We also demonstrate re-use of the same models for pass-transistor logic circuits and multi-finger transistors. Voltage and temperature are included explicitly in the models to support joint estimation of power supply IR drops and leakage currents, and to enable analysis for dynamic voltage scaling applications. We use artificial neural networks to create unified models that span global and local process variations, supply voltages from VDD/2 to VDD, and temperatures from 0 to 100 °C. These models are well suited to statistical leakage analysis of large circuits. Results on the ISCAS'85 benchmark circuits show that the neural-network-based stack models accurately predict the PDF of leakage current of large circuits across supply voltage and temperature, with an average error of less than 2% in the mean and less than 7% in the standard deviation relative to SPICE. Gate-level validation has also been performed for both an industrial 130 nm process and 45 nm PTM model files.