I have been reading articles about the ineffectiveness of voltage optimization when it comes to variable speed drives. Could somebody please help by explaining whether or not this is actually the case?
As an example, take a 20 HP variable speed drive drawing 10-12 amps with an input voltage of 415 V. The drive is underloaded, so the input and base voltage can be reduced to 390 V, taking the V/f ratio from 8.3 to 7.8 without affecting the associated torque. As a result, the motor draws less current and consumes less power.
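For reference, here is a minimal sketch of the arithmetic behind those V/f figures. It assumes a 50 Hz base frequency (which is what makes 415 V come out at roughly 8.3 V/Hz); the current value and the power comparison are purely illustrative, not measurements from the drive in question.

```python
# Minimal sketch of the V/f arithmetic above, assuming a 50 Hz base frequency.
# The power comparison is illustrative only (same current assumed at both voltages).

BASE_FREQUENCY_HZ = 50.0  # assumed supply/base frequency


def v_over_f(voltage_v: float, frequency_hz: float = BASE_FREQUENCY_HZ) -> float:
    """Return the V/f ratio in volts per hertz."""
    return voltage_v / frequency_hz


def apparent_power_kva(line_voltage_v: float, line_current_a: float) -> float:
    """Three-phase apparent power in kVA from line voltage and line current."""
    return (3 ** 0.5) * line_voltage_v * line_current_a / 1000.0


if __name__ == "__main__":
    print(f"V/f at 415 V: {v_over_f(415):.1f} V/Hz")  # ~8.3 V/Hz
    print(f"V/f at 390 V: {v_over_f(390):.1f} V/Hz")  # ~7.8 V/Hz

    # Hypothetical comparison: if the drawn current stayed at 11 A,
    # apparent power would scale in proportion to the voltage.
    for volts in (415, 390):
        print(f"{volts} V at 11 A -> {apparent_power_kva(volts, 11):.2f} kVA")
```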
However, in some circumstances this voltage optimization doesn't seem to work: the variable frequency drive either starts to draw more current or shows no change in power consumption at all. Can anyone explain why this is?