In the literature, many methods are available for estimating the noise variance in magnetic resonance (MR) images. A commonly used method, based on the maximum of the background mode of the image histogram, is revisited, and a new, robust, and easy-to-use method based on maximum likelihood (ML) estimation is presented. Both methods are evaluated in terms of accuracy and precision using simulated MR data. The newly proposed method is shown to outperform the commonly used method in terms of mean-squared error (MSE).

© 2006 SPIE, The International Society for Optical Engineering.
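As a hedged illustration of the ML idea (not the paper's exact estimator, which may work on a partial histogram): in a magnitude MR image, signal-free background pixels are well modeled as Rayleigh distributed with scale equal to the noise standard deviation, and the ML estimate of the noise variance from N background magnitudes m_i is (1/(2N)) Σ m_i². The variable names and simulation setup below are assumptions for the sketch.

```python
import numpy as np

# Sketch: background pixels of a magnitude MR image are assumed
# Rayleigh(sigma), where sigma is the noise standard deviation.
rng = np.random.default_rng(0)
sigma_true = 5.0
background = rng.rayleigh(scale=sigma_true, size=100_000)

# ML estimator of the noise variance sigma^2 from Rayleigh samples:
# sigma2_hat = (1 / (2N)) * sum(m_i^2)
sigma2_ml = np.sum(background**2) / (2 * background.size)

print(f"true variance: {sigma_true**2:.2f}, ML estimate: {sigma2_ml:.2f}")
```

With a large simulated background region the estimate is close to the true variance of 25; in practice the background pixels would first have to be segmented or identified from the image histogram.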