Adaptive transmit beamforming based on channel state information (CSI) is a key feature of next-generation wireless cellular systems. However, the CSI available for adaptation is imperfect due to feedback delay and estimation errors. In this work, we analyze the outage performance of maximum eigenmode beamforming with imperfect CSI. First, we characterize the outage probability in terms of the correlation coefficient ρ between the CSI available at the transmitter (CSIT) and the CSI available at the receiver (CSIR). The analysis shows that feedback delay leads to significant degradation at medium and high signal-to-noise ratios (SNR), and that the effect of delay can be overcome only if ρ tends to one with increasing SNR. We then study whether linear minimum mean squared error (MMSE) prediction can achieve the required behavior of ρ. The required length of the prediction filter is evaluated numerically and shown to increase with SNR. Finally, the asymptotic diversity order is analyzed as a function of the rate at which 1 − ρ approaches 0 as SNR → ∞. The results show that when 1 − ρ is proportional to SNR⁻¹, the asymptotic diversity order remains unaltered. © 2009 IEEE.
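As an illustrative sketch (not taken from the paper itself), the correlation coefficient ρ achieved by a length-L linear MMSE predictor of a delayed, noisily observed fading coefficient can be computed in closed form as ρ² = r_d^H R⁻¹ r_d, where R is the autocovariance of the noisy observations and r_d the cross-covariance with the channel one delay ahead. The snippet below uses a Gauss-Markov autocorrelation r[k] = a^|k| as a stand-in assumption for the Jakes-type model typically used in such analyses; the function name `mmse_rho` and its parameters are hypothetical.

```python
import numpy as np

def mmse_rho(snr_db, L, delay=1, a=0.99):
    """Correlation coefficient rho between the true channel h[n+delay]
    and its linear MMSE prediction from L noisy past samples
    y[n-k] = h[n-k] + w[n-k], k = 0..L-1.
    Assumes Gauss-Markov fading autocorrelation r[k] = a**|k|
    (a stand-in for a Jakes-type model)."""
    sigma2 = 10.0 ** (-snr_db / 10.0)  # estimation-noise variance ~ 1/SNR
    k = np.arange(L)
    # Autocovariance of the noisy observations (Toeplitz + noise term)
    R = a ** np.abs(k[:, None] - k[None, :]) + sigma2 * np.eye(L)
    # Cross-covariance with the channel 'delay' samples ahead
    r_d = a ** (delay + k)
    # rho^2 = r_d^H R^{-1} r_d / r[0], with r[0] = 1 here
    return float(np.sqrt(r_d @ np.linalg.solve(R, r_d)))

# Longer filters and higher SNR both push rho toward one:
for snr in (10, 20, 30):
    print(snr, [round(mmse_rho(snr, L), 4) for L in (1, 4, 16)])
```

Sweeping L at each SNR in this way reproduces the qualitative trend discussed in the abstract: the filter length needed to keep 1 − ρ decaying like SNR⁻¹ grows as the SNR increases.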