Conservation Voltage Reduction: A No-Brainer Smart Grid Benefit?

Fifteen years ago the electric utility industry, including the company I worked at, was in the middle of “deregulating”, “restructuring”, “delaminating”, or whatever the trendy name of the spasm du jour was. Technical staffs were happy to let the executive floor and the lawyers deal with that chaos – they had a tough enough time keeping the lights on with shrinking budgets. Predictably, life among the cubicles generated plenty of cynicism and sarcastic humor. For example, “Undead” was the name given to recurring inquiries by the state utility commission or other agencies that had regulatory influence. Usually these inquiries originated with a recently hired staffer reviewing our rate case application, or with an intervenor’s testimony before the commission. Whatever the source, the “Undead” were issues that had been asked, discussed and answered ad nauseam (ad boredom) by our utility’s technical staff, to the point where a junior engineer could usually just resurrect last year’s response from the files and send it to our law department for transmittal to the appropriate state agency.
 
Chief among the “Undead” were inquiries regarding Conservation Voltage Reduction (CVR). It’s been known for decades that slightly reducing the voltage to induction motors (as in air conditioners, appliances, industrial applications) can often save energy without affecting performance. The downside is that if you lower the voltage a little too much, or raise it a tad too high, the motor’s efficiency goes down and you use more energy than if you’d left it at its rated voltage.
 
The explanation of all this is something only an electrical engineer would love (or even tolerate) so you’ll have to trust me, search Google or send me an email.
 
Anyway, the “Undead” question is: why not provide customers a slightly lower voltage, say 114 Volts rather than 120 Volts, and save some energy? For years the resurrected, recycled answer was usually that distribution line voltages are controlled at the substation and get lower as you move further out along the distribution line. Lower the voltage near the substation and you get unacceptably low voltages at the tail end. Raise the tail-end voltages and you get unacceptably high voltages nearer the substation. Oh sure, you can use capacitors along the line, switching them in or out to flatten the voltage profile, but until recently the technology hasn’t been available to measure, let alone control, real-time customer voltages.
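
A back-of-the-envelope sketch in Python, using made-up feeder numbers, shows the squeeze: every service point has to stay inside the ANSI C84.1 Range A service band of roughly 114 to 126 Volts, so lowering the substation setpoint quickly pushes the tail end of the line out of band.

    # Minimal sketch of the "tail-end voltage" constraint. The feeder drops
    # below are illustrative, not real circuit data.
    ANSI_LOW, ANSI_HIGH = 114.0, 126.0   # ANSI C84.1 Range A, on a 120 V base

    # Cumulative voltage drop (volts, 120 V base) from the substation to each
    # monitored point along one hypothetical feeder.
    FEEDER_DROP = [0.0, 1.5, 3.0, 4.5, 6.0]

    def profile(substation_setpoint_v):
        """Service voltage at each monitored point along the feeder."""
        return [substation_setpoint_v - d for d in FEEDER_DROP]

    def within_band(voltages):
        return all(ANSI_LOW <= v <= ANSI_HIGH for v in voltages)

    for setpoint in (123.0, 120.0, 117.0):
        volts = profile(setpoint)
        print(f"setpoint {setpoint:5.1f} V -> tail end {volts[-1]:5.1f} V, "
              f"in band: {within_band(volts)}")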
 
Well, that limitation may be going away. With Smart Grid technologies we can monitor customer voltages and, conceivably, optimize distribution line voltage profiles for maximum CVR energy savings. How much savings? Maybe 3 percent. That’s a lot – it’s over twice the energy provided by solar power installations in the U.S. And since we’re talking demand reduction, the benefits don’t rely on weather or time of day, and you don’t have to figure out how to store it.
 
CVR is just one example of how efficiency and conservation strategies can be enabled by Smart Grid technologies and given new life. They may not be as sexy as renewable generation resources, but in the short term they may be faster and cheaper to deploy.

Comments

CVR - A Misunderstood Concept

I work for a utility company, and have been involved with CVR for quite some time. After reading comments from others, I would like to correct some misunderstandings about CVR.

First of all, there is a relationship between voltage and power consumption. We found the average change in power for residential circuits was about 0.7 times the change in voltage. So a 1.0% decrease in V resulted in a 0.7% decrease in P. Commercial circuits (with a higher concentration of motor loads) showed a factor of less than 0.7, in some cases almost no change in P even as V was varied. However, most circuits did exhibit some reduced consumption when voltage was lowered, because loads are a mixture of constant-impedance and constant-power elements.
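
A rough illustration of that arithmetic, in Python. The 0.7 residential factor is the measured value mentioned above; the circuit energy, the voltage reduction, and the commercial factor used below are made-up numbers for illustration only.

    # Delta-P (%) is approximately the CVR factor times Delta-V (%),
    # valid for small voltage changes.
    def percent_power_change(percent_voltage_change, cvr_factor):
        return cvr_factor * percent_voltage_change

    ANNUAL_KWH = 12_000          # hypothetical energy served by a circuit segment
    VOLTAGE_REDUCTION_PCT = 2.5  # hypothetical average voltage reduction

    for label, cvr_f in [("residential", 0.7), ("commercial (motor-heavy)", 0.3)]:
        dp = percent_power_change(VOLTAGE_REDUCTION_PCT, cvr_f)
        saved_kwh = ANNUAL_KWH * dp / 100.0
        print(f"{label}: about {dp:.2f}% less energy, roughly {saved_kwh:.0f} kWh saved")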

The second misconception is that the voltage can't be lowered while still maintaining adequate voltage at the end of the line. Not all circuits are voltage limited, especially those in urban areas. A study of the circuits can reveal which ones are good candidates for CVR because their distribution voltage drop is less than average.

A third CVR misconception is that it can't be applied on distribution circuits which are limited by voltage drop during peak conditions. You can still apply CVR and lower the voltage during off-peak conditions while maintaining the higher voltage at peak. This is done using the resistive compensation feature of the LTC controller. Adding R compensation will increase the voltage as a function of the MW loading on the transformer. Therefore, you can set up the LTC or regulator control to maintain a higher voltage during peak conditions and it will back off to a lower value during off-peak.
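
A short sketch of that control behavior in Python; the setpoint, compensation voltage, transformer rating, and load levels are illustrative assumptions, not real settings.

    # Linear resistive (line-drop) compensation: the regulation target rises
    # with transformer loading, so the feeder runs at a higher voltage at peak
    # and backs off to a lower, energy-saving voltage off-peak.
    BASE_SETPOINT_V = 118.0    # light-load target on a 120 V base
    R_COMP_V_AT_RATED = 4.0    # extra volts added at rated loading
    RATED_MW = 20.0            # hypothetical transformer rating

    def ltc_target_voltage(load_mw):
        return BASE_SETPOINT_V + R_COMP_V_AT_RATED * (load_mw / RATED_MW)

    for load_mw in (4.0, 12.0, 20.0):   # off-peak, shoulder, peak
        print(f"{load_mw:4.1f} MW -> regulate to {ltc_target_voltage(load_mw):5.1f} V")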

The regulators who ask a utility about CVR often don't understand that most of the potential CVR savings are off-peak, especially for voltage-limited circuits. Therefore, CVR should be thought of primarily as an energy-saving measure; it has much less impact as a peak-reduction strategy.

Research into Effectiveness

What research has been done into the benefit of reducing load voltages on the utility system? When I worked for a utility about 30 years ago, the research at that time showed that the effect of voltage reduction was masked by the normal load variation on the system. Perhaps the nature of loads has changed since then, which would change the result.

As to line voltage regulation for improving load efficiency, there are several issues that must be addressed. The line from a substation has a voltage drop as one goes from the substation to the end of the line. By including current compensation in the voltage regulation control, the voltage regulator in the substation will attempt to keep the voltage at the "load center" constant, which causes the voltage to be higher near the substation and lower farther from the station, as Jerry has described. With the ANSI voltage range being approximately +6% and -12%, the utility will have a difficult time convincing regulators that improving the regulation to substantially better than the requirements of the standard is a worthwhile expense that should be in the rate base.

Even if you could get the line regulation to within 1% (or less) of the setpoint, what is the voltage for maximum efficiency? Not all loads have the same voltage for peak efficiency, and what helps one customer may adversely affect the next customer's load. Does operating the loads at maximum efficiency also reduce the line losses, or will the line losses change and mask the load efficiency changes?

While the concept is good, we need some research to document how much this will help or hurt efficiency as measured at the substation.

Paul Mauldin

Conservation Voltage Reduction

Good comments all!

For those who want to dig more into this, go to the DOE report: http://www.pnl.gov/main/publications/external/technical_reports/PNNL-19596.pdf

As far as the regulators go, in California both the PUC and the Energy Commission tended to push the utilities to try this. It was the utilities that resisted, for the reasons mentioned.

Paul

jerrym

Conservation Voltage Reduction

If our power grid were an ideal delivery system, this idea MIGHT work. In the reality **I** live and work in, it isn't and it won't. The reason is simple. Most power companies, at least the ones I deal with here in the South, don't maintain a line voltage anywhere near the tolerances we're talking about here. We're talking about a 5% reduction, yet my power companies consider anything within +/- 10% to be "within tolerance". I have had fights with them over the voltage supplied to my points of use. When I tell them I need line voltage regulated to +/- 5%, they tell me it isn't going to happen. And I happen to be just downstream of some pole-mounted power line voltage regulators.
And frankly, I think the whole concept of trying to squeeze an extra couple of percentage points of efficiency out of the end-user's equipment is NOT the purpose of the power grid. The purpose of the power grid is to deliver clean, regulated power at INDUSTRY STANDARD voltages.
I'm willing to bet that not ALL motors in service will turn out to be more "efficient" at slightly lower voltages. For those that aren't, a drop in supply voltage means they draw more CURRENT to deliver the required power. That will cancel out the benefits of the motors that WILL be more efficient, because whenever you draw more current you automatically incur added "I squared R" losses in those motors and their circuits.
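A quick back-of-the-envelope comparison in Python, with made-up wiring resistance and load sizes, of how a constant-power load and a constant-impedance load respond to a 5% voltage reduction:

    WIRE_R_OHMS = 0.05   # assumed resistance of the branch-circuit wiring

    def constant_power_current(p_watts, v):
        return p_watts / v           # constant-power load: I rises as V falls

    def constant_impedance_current(z_ohms, v):
        return v / z_ohms            # constant-impedance load: I falls with V

    for v in (120.0, 114.0):         # nominal, then reduced 5%
        i_cp = constant_power_current(1200.0, v)     # 1.2 kW constant-power load
        i_cz = constant_impedance_current(12.0, v)   # 12-ohm resistive load
        print(f"V = {v:5.1f}: constant-P draws {i_cp:5.2f} A "
              f"(wiring loss {WIRE_R_OHMS * i_cp**2:4.2f} W), "
              f"constant-Z draws {i_cz:5.2f} A "
              f"(wiring loss {WIRE_R_OHMS * i_cz**2:4.2f} W)")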
If you REALLY want to increase motor efficiency, then do it on the EQUIPMENT MANUFACTURER'S floor. Don't try to accomplish that task with a tool not designed for the purpose. Keep the power grid at "industry standard" voltages, so we can avoid having to deal with voltage problems everywhere.
And finally, I understand fully your concept of the "undead" when dealing with the Government. You are not alone! My industry (broadcasting) has to deal with the SAME insanity from our elected officials and regulatory agencies. Politicians are now so delusional and full of themselves they think they can re-write the laws of Physics. Hence we have Rules for our Industry that are illogical, politically motivated, and ultimately unworkable.

Demand reduction

I love it!! I think of Occam's Razor. The same is true in my field of commercial electrical systems design. Lighting occupancy and daylight-harvesting sensors are getting cheaper and more reliable, and they are easy to operate.