Do "power factor controllers" save energy?

"Power factor controller" is a confusing term. Often called a Nola device, it isn't really meant to govern power factor as such. Rather, it electronically lowers motor voltage to suit motor load, using the motor's power factor as an indicator of that load. The device works, but it seldom pays off.

When a motor runs fully loaded, its internal power loss is mostly "copper loss" caused by current flowing through the windings. If motor voltage is lowered, current must rise to deliver the same shaft output; copper loss rises too, and efficiency tends to drop.

In contrast, a lightly loaded motor's internal loss is mostly magnetic loss in the core iron, and that loss drops rapidly as voltage is lowered. Because little shaft output is needed, the voltage reduction does no harm: net losses go down and efficiency rises. Total loss can be cut 10, 20, even 50 percent, but the actual watts involved are small compared with full-load operation.

Unless shaft output averages one-third of rated power or less, the Nola device normally isn't cost effective; repeated tests of three-phase motors of all sizes confirm just that. The loss picture is different, and the savings greater, for small single-phase motors. But such motors use so little power anyway that the total savings is also small.
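The trade-off described above can be sketched with a simplified loss model. This is only an illustration under assumed scaling rules, not data from any actual test: copper loss is taken to scale with current squared (and current, at constant shaft output, as load divided by voltage), core loss roughly with voltage squared, and the loss figures `pcu_fl` and `pfe` are invented round numbers for a hypothetical motor.

```python
def motor_losses(load, v, pcu_fl=400.0, pfe=300.0):
    """Approximate total internal loss (watts) of an induction motor.

    Assumed simplified model (illustrative only, not from test data):
      - copper loss scales with current squared; at constant shaft
        output, current ~ load / voltage, so it scales as (load/v)^2
      - core (iron) loss scales roughly with voltage squared

    load: shaft output as a fraction of rated power
    v:    terminal voltage as a fraction of rated voltage
    pcu_fl, pfe: assumed full-load copper loss and rated-voltage
                 core loss, in watts
    """
    copper = pcu_fl * (load / v) ** 2
    core = pfe * v ** 2
    return copper + core

# Lightly loaded motor (25% load): cutting voltage to 80% trims core
# loss far more than it raises copper loss, so total loss falls.
print(motor_losses(0.25, 1.0))  # ~325 W at full voltage
print(motor_losses(0.25, 0.8))  # ~231 W at reduced voltage

# Fully loaded motor: the same voltage cut forces current up, and the
# copper-loss increase outweighs the core-loss saving.
print(motor_losses(1.0, 1.0))   # ~700 W at full voltage
print(motor_losses(1.0, 0.8))   # ~817 W at reduced voltage
```

With these assumed numbers, the light-load case sheds roughly 29 percent of its losses, consistent with the 10-to-50 percent range cited above, while the full-load case gets worse, which is why the device only pays off on motors that average well below rated output.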