Many modern amps can drive much more current than what is associated with their rated power into either an 8 ohm or 4 ohm speaker. Why are they designed this way? Is it cheap to over-design the current output of an amp?
By way of example I have an amp rated 150W+150W into an 8 ohm speaker. It is also rated 300W+300W into a 4 ohm speaker. It states it can drive 1200W+1200W into 1 ohm for short periods of time. It also states it can drive 2400W into 2 ohms in bridged mode for short periods of time.
So when driving an 8 ohm speaker at 150W, the amp delivers 4.33 amps of current at 34.64 volts. (To calculate this I used the following two equations: Power = Voltage x Current, and Voltage = Current x Resistance.) When driving 4 ohms the load is halved (4 ohms vs. 8 ohms), therefore the current doubles to 8.66 amps while the voltage stays the same at 34.64V (power = 34.64 V x 8.66 A = 300W). When driving 1 ohm the current is 8 times the current at 8 ohms, so it goes up to 34.64 amps (8 x 4.33 amps) for brief moments of time.
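The arithmetic above can be sketched in a few lines of Python (the function name and the loop are just illustration, not anything from the amp's spec sheet):

```python
import math

def drive(power_w, load_ohms):
    """Current and voltage needed to deliver a given power into a resistive load."""
    current = math.sqrt(power_w / load_ohms)  # from P = I^2 * R
    voltage = current * load_ohms             # from V = I * R
    return current, voltage

# The three rated operating points from the example amp
for power, load in [(150, 8), (300, 4), (1200, 1)]:
    current, voltage = drive(power, load)
    print(f"{power:5.0f} W into {load} ohm: {current:6.2f} A at {voltage:.2f} V")
```

Running this shows the voltage staying fixed at about 34.64 V at every operating point, with only the current scaling as the load impedance drops.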
So the amp has its internal structures built to be able to drive up to 34.64 amps, yet it probably never drives even half that current. For instance, say a speaker is nominally 4 ohms; it most likely never dips below 2 ohms in any circumstance (that’s a pretty big dip for a 4 ohm speaker), and therefore the amp never has an opportunity to put out more than 17.32 amps. And for an 8 ohm speaker the current likely never exceeds 8.66 amps even driving the most demanding of 8 ohm loads (so the amp is 4x over-designed for an 8 ohm speaker).
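To make the same point numerically, here is a small sketch assuming the amp holds its roughly 34.64 V output and each speaker's impedance bottoms out at half its nominal value (both of those are my assumptions, not manufacturer data):

```python
# Assumed rail-limited output voltage, from the 150 W / 8 ohm rating
VOLTAGE = 34.64

# (nominal impedance, assumed minimum impedance) pairs
speakers = [(4, 2), (8, 4)]

for nominal, minimum in speakers:
    peak_current = VOLTAGE / minimum  # from V = I * R
    print(f"{nominal}-ohm speaker dipping to {minimum} ohms: "
          f"at most {peak_current:.2f} A")
```

Under those assumptions the worst case is about 17.32 A into the 4 ohm speaker and 8.66 A into the 8 ohm speaker, both well below the 34.64 A the amp can supply.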
So why do designers build in so much current overhead? Surely it is expensive to do so?