Power consumption of the receiver (W) = power output (W)/efficiency
So a receiver with an efficiency of 50% will require 500 watts for an audio output of 250 watts.
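To make the arithmetic easy to play with, here is that formula as a few lines of Python. The 250 W / 50% figures are just the example above; the second call uses the typical numbers discussed next:

```python
def mains_draw_watts(output_watts: float, efficiency: float) -> float:
    """Power pulled from the wall for a given audio output.

    efficiency is a fraction, e.g. 0.5 for 50%.
    """
    return output_watts / efficiency

# The example above: 250 W out at 50% efficiency -> 500 W from the mains.
print(mains_draw_watts(250, 0.50))   # 500.0
# A more typical case: ~100 W out at ~65% efficiency -> ~154 W.
print(mains_draw_watts(100, 0.65))
```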
Under reasonable loads, a receiver will be drawing about 100 watts for most of its life, and efficiency is typically in the 65-70% range.
The problem is that budget receivers use very small power supplies. A receiver specified at 100 watts x 5 will typically have a 300-350 watt power supply. Five channels at 100 watts is 500 watts of output, which at ~70% efficiency would need over 700 watts from the wall, so a 300 watt supply would make the receiver a generator. But that's another story.
Music is by nature dynamic, and movies are even more so: the difference between average and peak levels is very high, and peaks can sit 20 dB or more above the average. Even taking a more modest 13 dB, that is a factor of 20 in power. So for a steady output of 100 watts, you can have a peak power draw of 2000 watts for a few milliseconds at a time. If the amp can survive that (and some can!), the power supply should be able to supply it, and the mains should be able to survive it. But, as noted, budget receivers use very small power supplies.
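The dB arithmetic is worth making concrete; a small sketch, where the 13 dB and 100 W figures are just the example above (real material varies widely):

```python
def db_to_power_ratio(db: float) -> float:
    # A level difference in dB corresponds to a power ratio of 10^(dB/10).
    return 10 ** (db / 10)

def peak_draw_watts(avg_output_watts: float, headroom_db: float) -> float:
    # Instantaneous power needed if the amp actually reproduces the peak.
    return avg_output_watts * db_to_power_ratio(headroom_db)

print(db_to_power_ratio(13))          # ~20, i.e. 13 dB = 20x in power
print(peak_draw_watts(100, 13))       # ~2000 W peak from 100 W average
```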
There are tricks to hold the amplifier up through some of these peaks, and all amps have some dynamic headroom, but this is why amps sound nasty at high levels: poor power supplies. The best amps specify output power after accounting for headroom; the budget ones over-spec and under-deliver.
A transformer is a very good bet here, as long as it's wound with oversized wire on a slightly oversized core. A transformer only transfers energy, it does not create it, so its ratings are mostly about the thermal limits of the winding wire and insulation (lower current rating = thinner wire = more resistance = more heat) and magnetic saturation of the core. Very small transformers can sustain very high power for short durations, just not continuously.
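As a rough illustration of that burst-vs-continuous point, a back-of-the-envelope sketch. Every number in it (current, winding resistance, copper mass) is a made-up assumption, and it ignores cooling and core saturation entirely:

```python
# Winding heating is I^2 * R integrated over time; the copper's thermal
# mass absorbs brief spikes but not sustained load. Numbers are illustrative.
COPPER_SPECIFIC_HEAT = 385.0   # J/(kg*K)

def winding_temp_rise(current_a: float, resistance_ohm: float,
                      duration_s: float, copper_mass_kg: float) -> float:
    """Adiabatic (no-cooling) temperature rise of the winding for a burst."""
    energy_j = current_a ** 2 * resistance_ohm * duration_s
    return energy_j / (copper_mass_kg * COPPER_SPECIFIC_HEAT)

# A 20 A burst through a 0.5 ohm winding with 0.5 kg of copper:
print(winding_temp_rise(20, 0.5, 0.005, 0.5))  # ~0.005 K for a 5 ms peak
print(winding_temp_rise(20, 0.5, 600, 0.5))    # ~620 K if held for 10 minutes
```

Hence the asymmetry: a millisecond peak barely warms the wire, while the same current held continuously would cook it.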
Sorry for the length, but oversizing never hurts, not when it comes to a power supply. Or a post.
