Continuous power ratings are a staple of performance specifications for audio amplifiers and, sometimes, loudspeakers. They are often referred to as "RMS power," after root mean square (RMS), a method of measuring an AC voltage or current by its equivalent DC heating value. "RMS" is technically a misnomer when applied to power, despite its frequent colloquial use. In its 1974 Amplifier Rule, intended to combat the unrealistic power claims made by many hi-fi amplifier manufacturers, the FTC required that advertising and specifications for amplifiers sold in the US cite continuous power measured with sine-wave signals. Typically, an amplifier's power specification is calculated by measuring its RMS output voltage, driven by a continuous sine-wave signal at the onset of clipping—defined arbitrarily as a stated percentage of total harmonic distortion (THD)—into specified load resistances. Typical loads are 8 and 4 ohms per channel; many amplifiers used in professional audio are also specified at 2 ohms.
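The calculation described above can be sketched numerically. This is a minimal illustration, not a measurement procedure: the 40 V peak swing is a hypothetical figure, and real ratings are taken at a stated THD threshold with a signal analyzer, not computed from an assumed peak voltage.

```python
import math

def rms_of_sine(v_peak: float) -> float:
    # For a pure sine wave, the RMS voltage is the peak divided by sqrt(2).
    return v_peak / math.sqrt(2)

def continuous_power(v_rms: float, load_ohms: float) -> float:
    # Average power dissipated in a resistive load: P = Vrms^2 / R.
    return v_rms ** 2 / load_ohms

# Hypothetical amplifier swinging 40 V peak at the onset of clipping:
v_rms = rms_of_sine(40.0)                # ≈ 28.28 V RMS
print(continuous_power(v_rms, 8.0))      # 100 W into 8 ohms
print(continuous_power(v_rms, 4.0))      # 200 W into 4 ohms
```

Note that the 4-ohm figure doubles only in this idealized sketch; a real amplifier's power supply and output stage limit how much extra current it can deliver into lower impedances.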
Continuous power measurements do not describe the highly varied signals actually found in audio program material, but they are widely regarded as a reasonable way of characterizing an amplifier's maximum output capability. Most amplifiers can deliver more power if driven further into clipping, with a corresponding rise in harmonic distortion, so the continuous power rating cited for an amplifier should be understood as the maximum power available, at or below a particular acceptable amount of harmonic distortion, in the frequency band of interest. For audio equipment, this is nearly always the nominal range of human hearing, 20 Hz to 20 kHz; other electronic equipment is intended to handle other frequency bands.
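A quick numerical check makes the misnomer concrete: the quoted "RMS power" rating is actually the *average* power, while the literal root mean square of the instantaneous power waveform is a different, larger number that nobody quotes. The peak voltage and load below are hypothetical values chosen for round numbers.

```python
import math

N = 10_000                       # samples over exactly one sine period
v_peak, load = 40.0, 8.0         # hypothetical amplifier output and load

v = [v_peak * math.sin(2 * math.pi * k / N) for k in range(N)]
p = [vi * vi / load for vi in v]             # instantaneous power samples

avg_power = sum(p) / N                               # what ratings quote
rms_power = math.sqrt(sum(pi * pi for pi in p) / N)  # literal "RMS power"

print(avg_power)   # ≈ 100.0 W — the continuous ("RMS") rating
print(rms_power)   # ≈ 122.5 W — the true RMS of power, which no rating uses
```

The ratio between the two is sqrt(3/2) ≈ 1.22 for a sine wave, which is why "RMS power" survives only as shorthand for "average power measured from RMS voltage."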