
Fri, Sep 7

Understanding Coordinated Voltage and Var Control in Distribution Systems: Is Power Factor = 1 Always a Good Thing?

A notion common among many power industry engineers is that improving the power factor on distribution feeders, and thereby lowering electrical losses, will also reduce the amount of energy that must be purchased from the energy supplier to serve a given customer load.

In many cases, if not most, this is not true. When a feeder capacitor is switched on, the voltage changes in addition to the current flow: it increases. The voltage increase raises the real and reactive load according to the load-to-voltage dependencies. If the load increase exceeds the loss reduction, the amount of energy that must be supplied will increase.

The relationship between the loss reduction and the load increase due to feeder capacitor switching is determined by the electrical distance (impedance) between the source and the capacitor, by the loading of the distribution and transmission circuits, by the real and reactive load-to-voltage dependencies, and by the behavior of other voltage-controlling devices.
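The load-to-voltage dependencies mentioned above are commonly approximated by an exponential load model. Below is a minimal sketch in Python; the function name and the default exponents kp = 1 and kq = 3 (matching the sensitivities assumed later for Figure 3) are illustrative assumptions, not part of the original analysis:

```python
def load_at_voltage(p0_mw, q0_mvar, v_pu, kp=1.0, kq=3.0):
    """Exponential load model: P = P0*(V/V0)**kp, Q = Q0*(V/V0)**kq.

    p0_mw, q0_mvar: nominal load at 1.0 per-unit voltage.
    kp, kq: illustrative load-to-voltage exponents; real feeders
    must be measured or modeled individually.
    Returns the (P, Q) drawn at per-unit voltage v_pu.
    """
    return p0_mw * v_pu**kp, q0_mvar * v_pu**kq

# Example: a 1 % voltage rise on a hypothetical 10 MW / 3 Mvar feeder.
p, q = load_at_voltage(10.0, 3.0, 1.01)
# Real load grows by ~1 %, reactive load by ~3 %.
```

With these exponents, even a modest voltage rise produces a noticeably larger reactive-load increase than real-load increase, which is why the reactive dependency matters in capacitor-switching studies.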

Let's consider some examples. Figure 1 illustrates a distribution feeder with two capacitors of 1 Mvar each. One capacitor (C1) is located close to the feeding substation bus, and the other (C2) is far from it. The substation transformer has an automatically controlled under-load tap changer (LTC). The distribution system is connected to the source of supply (G) through a transmission system, which serves the sample substation and other substations.

Figure 2 presents the change in losses in the distribution and transmission systems after the remote capacitor C2 is switched ON. The changes in losses are shown for different loadings of the distribution and transmission systems. As seen in the figure, the losses are reduced in almost all cases, the exception being very small loads in both systems; in that case the capacitor generates leading reactive power.

Figure 3 presents the changes in real and reactive loads due to the increase in voltage after capacitor C2 is switched ON. It is assumed in this example that a one-percent voltage increase raises the real load by one percent and the reactive load by three percent. As seen in the figure, the heavier the feeder loading, the greater the load increase. So, we see both loss reduction and load increase after the capacitor is switched ON.

The generation needed to supply the load equals the sum of the load and the corresponding losses. This sum is presented in Figure 4. As seen in the figure, in this case the load increase is greater than the loss reduction, and the needed generation increases.
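The bookkeeping here is simple enough to state as a one-line sketch. The numbers below are hypothetical, chosen only to illustrate how a loss reduction can be outweighed by the voltage-driven load increase:

```python
def net_generation_change(delta_load_mw, delta_loss_mw):
    """Change in required generation = change in load + change in losses.

    A positive result means more generation (energy purchase) is
    needed after the capacitor is switched on.
    """
    return delta_load_mw + delta_loss_mw

# Hypothetical example: capacitor switching raises the voltage, which
# adds 0.10 MW of load, while losses drop by 0.04 MW.
delta = net_generation_change(0.10, -0.04)
# delta > 0: the loss reduction is outweighed by the load increase,
# so the needed generation goes up despite the improved power factor.
```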

Another case is presented in Figure 5. Here, the capacitor close to the substation (C1) is switched ON. As seen in the figure, in about half of the cases the needed generation is reduced.


Similar analyses performed for underground feeders show that the generation increase is either much smaller than for the overhead feeder, or the generation is even slightly reduced.

It was assumed in the cases discussed above that the voltage at the distribution bus of the feeding substation is constant. This assumption is acceptable when the automatic voltage control follows a constant voltage setting (band-center) with a small bandwidth. In this case, the controller compensates for the voltage increase caused by the source impedance.

In many cases, however, the bandwidth is much larger than the change in bus voltage caused by switching a capacitor, and the probability that the voltage controller will adjust the voltage is small. In such cases, the results differ significantly from the constant-voltage cases, because the load increase is greater once the source impedance is involved.
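The deadband logic behind this argument can be sketched roughly as follows; all settings (set point, bandwidth) are hypothetical and not taken from the article:

```python
def ltc_tap_action(v_bus_pu, setpoint_pu=1.0, bandwidth_pu=0.0125):
    """Return the tap direction a simple LTC controller would command.

    -1: lower taps (reduce voltage), +1: raise taps, 0: no action.
    The controller acts only when the bus voltage leaves the deadband
    of width bandwidth_pu centered on setpoint_pu. Settings are
    illustrative, not from the article.
    """
    if v_bus_pu > setpoint_pu + bandwidth_pu / 2:
        return -1
    if v_bus_pu < setpoint_pu - bandwidth_pu / 2:
        return +1
    return 0

# A 0.4 % voltage rise from capacitor switching stays inside a
# 1.25 % deadband, so the controller does not compensate for it.
action = ltc_tap_action(1.004)
```

When the capacitor-driven voltage change is smaller than half the bandwidth, as here, the bus voltage is not pulled back, and the load increase described above persists.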

Another possible scenario is voltage control at the substation with strong line-drop compensation. In this case, the voltage controller can react to the reduction of reactive power flow through the substation transformer and reduce the voltage by one LTC step. The impact of the load increase is then lessened by the voltage adjustment of the LTC controller.

In addition to the situations discussed above, there are situations when adding a feeder capacitor significantly helps reduce the needed supply. For instance, when a capacitor is added to a voltage-critical feeder fed from a multi-feeder bus, it may create room for conservation voltage reduction affecting all other feeders. But this is another story about coordinated voltage and var control, which I may tell in another article.

Conclusions

1. The impacts of feeder capacitors in distribution on power system operational parameters differ depending on the location of the capacitors, on the loading of the distribution and transmission systems, on the way the voltage is controlled by other voltage-controlling means, and on the real and reactive load-to-voltage dependencies. Switching a capacitor ON can either increase or reduce the generation needed to supply the load in distribution.

2. In many cases, improving the power factor in distribution circuits does not reduce the supply needed to serve distribution loads.

3. All significant impacts of capacitor control on power system operations, in correlation with the objectives of voltage and var control, should be considered when deciding on the implementation of coordinated voltage and var control as a Distribution Automation function.