I think you have BatchNorm confused with Softmax.
To answer your questions in the comments, normalization does not change the shape of the distribution - it simply shifts it to mean 0 and scales it to unit variance.
For example, if the data was from a uniform distribution, it remains uniform after normalizing, albeit with different statistics.
For example, take the distribution below:

After normalizing, this is what the distribution looks like:

Notice that the shape of the overall distribution and the number of samples in each bucket are exactly the same - what has changed is the mean value (i.e., center) of the distribution. And though not visually obvious, one can check the new normalized values (X-axis of the plot) and see that the variance is approximately 1.
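This shape-preservation is easy to check numerically. Below is a minimal NumPy sketch (the sample sizes and ranges are arbitrary, not taken from the plots above): normalizing shifts and rescales the values, but each bucket keeps the same number of samples.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(5.0, 10.0, size=10_000)  # samples from a shifted uniform distribution

# Normalize: center at 0, scale to unit variance.
x_norm = (x - x.mean()) / x.std()

# Bucket both versions into 20 equal-width bins over their own ranges.
counts_before, _ = np.histogram(x, bins=20)
counts_after, _ = np.histogram(x_norm, bins=20)

print("bucket counts match:", np.array_equal(counts_before, counts_after))
print("normalized mean:", round(x_norm.mean(), 6), "variance:", round(x_norm.var(), 4))
```

Because normalization is an affine (order-preserving) transform, the relative position of every sample is unchanged, so the histogram shape is identical; only the axis labels move.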
This is precisely what BatchNorm does, where the samples on the X-axis are the values of a single feature across the examples in a batch. For other kinds of norms, only the dimension normalized over changes (e.g., from the batch dimension in BatchNorm to the feature dimension in LayerNorm), but the effect is essentially the same.
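The choice of dimension can be sketched in NumPy as below - a minimal, inference-style sketch that ignores the learned scale/shift parameters and the epsilon term real implementations add for numerical stability:

```python
import numpy as np

rng = np.random.default_rng(1)
batch = rng.normal(3.0, 2.0, size=(32, 4))  # 32 examples, 4 features

# BatchNorm-style: statistics per feature, computed across the batch dimension (axis 0).
bn = (batch - batch.mean(axis=0)) / batch.std(axis=0)

# LayerNorm-style: statistics per example, computed across the feature dimension (axis 1).
ln = (batch - batch.mean(axis=1, keepdims=True)) / batch.std(axis=1, keepdims=True)

print(bn.mean(axis=0).round(6))  # each feature has mean ~0 across the batch
print(ln.mean(axis=1).round(6))  # each example has mean ~0 across its features
```

Either way it is the same centering-and-scaling operation; only the axis the statistics are computed over differs.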
If you wanted probabilities instead, you could simply divide each bin's count by the total number of samples (scaling the Y-axis rather than the X-axis). That gives a graph of exactly the same shape, with the X-axis values unchanged from the original graph and the Y-axis rescaled to represent probabilities.
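That Y-axis rescaling is a one-liner (again a hypothetical sketch, with arbitrary sample data):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=5_000)

counts, edges = np.histogram(x, bins=20)
probs = counts / counts.sum()  # each bin's share of the total sample count

print(probs.sum())  # the bin probabilities sum to ~1
```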
Let's now see what Softmax does to the distribution. Applying softmax over the distribution gives the following graph:

As you can see, softmax actually creates a probability distribution over the points: it assigns each point a probability under the assumption that they are all sampled from a Gaussian distribution (the Gaussian part is important theoretically, since it is what gives rise to the e in the softmax expression).
In contrast, simply scaling the Y-axis by the number of samples makes no Gaussian assumption - it builds an empirical distribution from the given points. But since any point not among the samples gets probability 0, that empirical distribution is useless for generalization. Hence, softmax is used instead of simply turning sample counts into probabilities.
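The contrast is visible in a small sketch. Below is the standard softmax (with the usual max-subtraction trick for numerical stability), applied to a few example scores I made up for illustration: every value gets a strictly positive probability, unlike the empirical bin-count approach, which assigns 0 to anything unseen.

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability,
    # then normalize so the outputs sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

scores = np.array([1.0, 2.0, 3.0])  # arbitrary example values
p = softmax(scores)

print(p.round(4))         # every entry is strictly positive
print(round(p.sum(), 6))  # and the entries sum to 1
```

Note how the largest score gets the largest probability, but no score - however small - is ever assigned exactly zero.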