This paper addresses histogram burstiness, defined as the tendency of histograms to feature peaks out of proportion with their general distribution. After highlighting the impact of this growing issue on computer vision problems and the need to preserve the distribution information, we introduce a new normalization based on a Gaussian fit with a pre-defined variance for each datum, which suppresses bursts without adversely affecting the distribution. Experimental results on four public datasets show that our normalization scheme provides a staggering performance boost compared to other normalizations, even allowing Gaussian-normalized Bag-of-Words to perform similarly to intra-normalized Fisher vectors.