Conservation voltage reduction (CVR) is based on the premise that lowering the voltage levels of the electrical system will reduce electrical energy consumption. In this work a comparison-based method has been applied for assessing CVR effects in a distribution grid. The reference voltage on the secondary side of the primary substation was manually altered according to a predefined schedule over a six-month period: CVR was applied to a substation and alternated with "normal" voltage at the same substation at two-week intervals. Measurements (1-second RMS values of voltage, active power and reactive power) were taken, and analyses made, at the 11 kV side of a 130/11 kV transformer of a primary substation, on outgoing feeders, at the 400 V side of 11/0.4 kV distribution transformers downstream of the primary substation, and at individual rural and industrial customers further downstream. Active power at the different reference voltage levels was plotted and analysed over 24-hour periods divided into 10-minute intervals, for recurring days of the week.
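As a minimal sketch of the binning step described above, the following Python snippet aggregates 1-second RMS active-power samples into 10-minute means grouped by weekday, time of day and voltage-schedule state. The DataFrame layout and the column names `p_active` and `schedule` are assumptions for illustration, not the authors' actual data format.

```python
import pandas as pd

def bin_power_profiles(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate 1-second active-power samples into 10-minute means,
    keyed by weekday, time of day and voltage-schedule state.

    Assumes a DatetimeIndex and hypothetical columns:
      'p_active'  - active power [kW]
      'schedule'  - 'cvr' or 'normal', per the two-week schedule
    """
    # Average the 1 s samples into 10-minute intervals
    binned = df[["p_active"]].resample("10min").mean()
    # Carry the schedule label into each interval
    binned["schedule"] = df["schedule"].resample("10min").first()
    # Tag weekday and time of day so recurring days can be compared
    binned["weekday"] = binned.index.day_name()
    binned["time_of_day"] = binned.index.time
    # Mean active power per weekday / time slot / voltage state
    return (binned
            .groupby(["weekday", "time_of_day", "schedule"])["p_active"]
            .mean()
            .unstack("schedule"))
```

Plotting the resulting `cvr` and `normal` columns for a given weekday yields curves of the kind compared in the analysis.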
This work confirms the difficulties in obtaining predictable loads over time and in accurately analysing the load composition at all grid levels. Both are needed in order to quantify the effects of CVR and, by extension, to optimise grid operations without violating power quality.
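For context, the metric commonly used in the CVR literature to quantify such effects (not stated explicitly here) is the CVR factor, which relates the relative change in active power to the relative change in voltage between the CVR and normal-voltage periods:

\[
\mathrm{CVR}_f = \frac{\Delta P / P}{\Delta V / V}
\]

where \(\Delta P\) and \(\Delta V\) denote the changes in active power and voltage, respectively.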