Challenges in the world of solar
The solar industry is growing at an exponential rate, and technological advances put us in a better position to make use of renewable energy. Phil Kreveld writes about what the new legislation aims to do.
Compared with the early days two decades ago, the world of solar is unrecognisable. One household in four has rooftop solar, and the combined output of solar systems in distribution networks will soon match coal-fired generation capacity.
There is no stopping the growth, either in numbers or in power ratings, but the influx of solar generation is not without associated problems.
Distribution networks are straining to accommodate distributed energy resources (DER), principally rooftop solar. Ben Barr, heading the Australian Energy Market Commission (AEMC), put the ‘cat amongst the pigeons’ last year when he suggested that solar owners might have to bear a charge for their system’s use of the network.
The suggestion of a network charge has been met with vigorous opposition. However, something has to give: networks, having to host ever-increasing DER, cannot avoid the associated capital expenditure. Voltage problems are well known, but others include neutral voltage buildup and congestion in feeders.
Reverse power flow and phase imbalance
The bulk of solar inverters are single-phase, and their distribution across the phases is not balanced, resulting in neutral current. They are mainly uncontrollable, pushing out current that, if not ‘sunk’ by local loads, flows back towards the nearest distribution transformer.
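As a rough illustration of how unbalanced single-phase export produces neutral current, the sketch below sums three phase currents as phasors; the per-phase export magnitudes are invented purely for illustration.

```python
import cmath, math

# Rough sketch: neutral current as the phasor sum of unbalanced phase currents.
# The export currents per phase are invented numbers, purely for illustration.
def neutral_current(i_a, i_b, i_c):
    """Return the neutral current magnitude for phase current magnitudes (amps),
    assuming the three phases are displaced by 120 degrees."""
    a = cmath.exp(1j * math.radians(120))  # 120-degree rotation operator
    total = i_a + i_b * a**2 + i_c * a     # phasor sum of the three line currents
    return abs(total)

# Balanced export: the phasor sum cancels, so no neutral current.
print(round(neutral_current(10, 10, 10), 2))   # ~0.0 A
# Unbalanced export (most rooftop inverters clustered on one phase): neutral current appears.
print(round(neutral_current(25, 10, 5), 2))    # ~18.0 A
```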
The popular view is that voltage problems are due to solar system installations. There is some truth in this, but an important factor is the growth of networks and therefore the need to push up voltage at substations.
This, in combination with reverse power flow, is raising voltage. It’s a ‘double whammy’: an irritation to solar system owners, whose inverters switch off as the voltage reaches about 254V, and a cost to networks, which have to spend money on voltage regulation.
New inverters have to conform to the latest edition of AS/NZS 4777.2, which took effect in December 2021. The standard, together with AS/NZS 4755, provides for control of real current as well as leading and lagging reactive current on network instruction, e.g. via 4G and other means. That still leaves the majority of installed inverters non-conforming.
Network problems will not disappear with the implementation of these standards. Voltage control will continue to trouble distribution networks, as will phase imbalance, neutral voltage buildup and congestion, as already mentioned.
Batteries are seen as partial saviours because they can absorb exported energy. But there is a larger problem we are running headlong into, and it is not a distribution network problem per se: it is reverse power flow at zone substations, i.e. reduced inflowing power from the transmission grid.
Transmission lines are combinations of series resistance and inductance, and parallel capacitance and conductance. As they become ‘unloaded’, the capacitance in combination with the inductance builds up the voltage, making it a problem for the Australian Energy Market Operator (AEMO) to control the voltage to the required value. The longer the transmission line, the worse the problem can become.
Batteries can diminish the energy exported by solar systems, reducing reverse power flow or avoiding it entirely, but the self-reliance of domestic, commercial and industrial consumers reduces the power inflow at zone substations just as well.
Therefore, the problem of voltage build-up in the connecting transmission line, the Ferranti effect, is not avoided. It’s not hard to imagine, given the solar installation increases encouraged by state governments, that distribution networks are heading for independence from the transmission grid, were it not for voltage and frequency stability considerations.
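For a feel for the numbers behind the Ferranti effect, here is a minimal sketch using the textbook no-load, lossless-line approximation; the per-kilometre inductance and capacitance are illustrative values only, not figures for any actual Australian transmission line.

```python
import math

# Minimal sketch of the Ferranti effect: the no-load (open-ended) receiving-end
# voltage rise of a transmission line, using the lossless-line approximation
# Vr = Vs / cos(beta * length). Line parameters are typical textbook values,
# not figures for any actual line.
F = 50.0                 # system frequency (Hz)
L_PER_KM = 1.0e-3        # series inductance (H/km), illustrative
C_PER_KM = 12.0e-9       # shunt capacitance (F/km), illustrative

def no_load_rise(length_km):
    """Return receiving-end voltage as a fraction of sending-end voltage at no load."""
    beta = 2 * math.pi * F * math.sqrt(L_PER_KM * C_PER_KM)  # phase constant (rad/km)
    return 1.0 / math.cos(beta * length_km)

for km in (100, 200, 300):
    print(f"{km} km line: receiving-end voltage ~{no_load_rise(km):.3f} x sending-end")
```

With these assumed parameters the rise is under 1% at 100 km but around 5 to 6% at 300 km, which is why the article notes that longer lines make the problem worse.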
Grid following inverters, frequency and phase jump stability
All inverters, big or small (except for two in South Australia), are grid-following. They therefore require stable voltage and frequency to operate. Because this point may be lost on some, here is an explanation: the switching of the inverter transistors to produce AC current is governed by an internal, voltage-controlled oscillator in combination with a high-frequency carrier signal. This provides the switching frequency for the transistors.
The voltage-controlled oscillator is phase-locked to the network voltage, ensuring that current export is in lockstep. It can be in phase, leading or lagging, but phase angle differences are tightly maintained. Some inverters can operate independently of the grid, for example some solar-battery systems; in the event of network failure, the voltage-controlled oscillator runs without the phase-locked loop.
It will be evident that without grid voltage and frequency stability, grid-following inverters either cannot function, because the voltage is above or below their limits, or frequency instability can cause individual inverter phase-locked loops (PLLs) to respond non-uniformly. That could set up voltage oscillations. Under tight frequency control, oscillations due to PLLs cannot happen. A phase jump, caused by a large load being connected or disconnected elsewhere in a network, can ‘confuse’ the PLLs and shut the inverters down.
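To make the grid-following idea concrete, below is a highly simplified sketch of a phase-locked loop tracking the grid voltage phase; the time step, gains and single-phase model are invented for illustration and bear no relation to any real inverter firmware.

```python
import math

# Minimal sketch of a grid-following phase-locked loop (PLL): the inverter's
# internal oscillator tracks the grid voltage phase with a PI controller.
# Gains and time step are illustrative, not taken from any standard or product.
DT = 1e-4             # simulation step (s)
KP, KI = 50.0, 500.0  # illustrative PI gains

grid_freq = 50.0      # Hz
grid_phase = 0.0
pll_freq = 50.0
pll_phase = 0.1       # start slightly out of step with the grid
integrator = 0.0

for _ in range(20000):  # simulate 2 seconds
    grid_phase = (grid_phase + 2 * math.pi * grid_freq * DT) % (2 * math.pi)
    # Phase detector: for small errors, sin(grid - pll) is roughly the phase error
    error = math.sin(grid_phase - pll_phase)
    integrator += KI * error * DT
    pll_freq = 50.0 + KP * error + integrator   # loop filter output sets the oscillator frequency
    pll_phase = (pll_phase + 2 * math.pi * pll_freq * DT) % (2 * math.pi)

print(f"tracked frequency: {pll_freq:.3f} Hz")  # settles near 50 Hz, locked in phase
```

A sudden phase jump in the grid voltage shows up as a large error in this loop; a real inverter seeing that may disconnect, which is the behaviour described above.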
The new standard aims to enlist rooftop solar for the grid
A change of mindset is urgently needed, and the new edition of AS/NZS 4777.2 makes this clear. Inverters are increasingly treated as participants in the national grid, rather than merely a household or commercial/industrial investment for containing energy costs.
Therefore, power export, total power production and lagging/leading var production, as well as soft and hard anti-islanding, are all part of the revised Australian Standard. Treating total inverter output (i.e. export plus consumption) as a controllable item is a development brought about by the requirement for voltage and frequency stability.
As zone substations approach minimum inbound power, automatic generation control (AGC) of coal and gas-fired generation becomes more difficult. But there are other aspects, as explained below.
Electrical power systems maintain stability by matching generated power with consumed power, instant by instant. Whenever there is a serious mismatch, voltage collapse or high instantaneous currents can cause the system to go ‘black’. This applies to all generation technologies.
For synchronous generation, an enormous amount of experience has been accumulated but not so for inverter-based resources (IBR). The latter do not have the capacity to feed faults like synchronous generators and do not respond to voltage and frequency variations in the same way. All this was of little import when IBR were a small proportion of generation capacity.
Now that large-scale variable renewable energy (VRE) from wind and solar, together with DER in distribution networks, is matching synchronous generation capacity, and will soon overtake it, there is renewed attention on distribution networks to ‘manage’ power flow. The new standard reflects this focus.
A clash of interests—hosting more solar will cost users or networks
A clash of interests is in the making. State governments are encouraging the installation of solar and batteries without regard to the effects on networks. As already mentioned, voltage control is becoming increasingly difficult, and the more ‘hosting’ of solar inverters networks are required to do, the more onerous the tasks of voltage regulation, phase balance and limiting neutral voltage buildup become.
To add to this, the discussion papers put out by the Australian Energy Regulator (AER), ‘Towards Consumer Centric Networks Proposals’ and the ‘Export tariff guidelines for distribution network export tariffs’, point to the conclusion that network expenditure to accommodate more solar will be met with stiff ‘regulatory resistance’.
The AER is also clearly on the side of consumers and does not want them slugged with increased network charges, arguing rather ‘subtly’ that networks can somehow benefit by hosting more solar. In many ways, distribution networks are the ‘meat in the sandwich’ and can be expected to become increasingly demanding about new network connections. Expect this first for commercial and industrial installations; in due course it will also happen for domestic installations, where the rating of the solar system often exceeds the native demand of the installation.
The bulk of installed inverters are basically uncontrollable, and it remains to be seen to what extent newly imported inverters will comply with the new Australian Standard. The other unknown is how demand response enabling devices (DRED) will receive and process instructions. It is clear how DREDs communicate with inverters, but that is only the last link in the chain of control. A number of options are on the table: WiFi, Modbus, IEEE 2030.5, advanced metering infrastructure (AMI) and other communication channels, e.g. 4G.
Whatever methods are chosen, expect a large degree of non-uniformity between distribution networks. Solar PV installers should count on increasing resistance to larger single-phase systems, as these will cause voltage instability and power quality problems in low-voltage circuits.
The influence of solar inverters on network voltage
Voltage regulation in a network is mainly determined by source impedance. The incomer at a zone substation has a low source impedance (another way of putting this is that it has a very high prospective short-circuit current). The incoming network is considered a ‘stiff’ network, with voltage varying very little as power varies.
As one ‘travels’ down from the zone substation towards the ‘edge’ of the network (the low-voltage reticulation), impedances increase and become mainly resistive. Voltage at zone substations has been on the increase for two reasons: to serve networks of ever greater lineal distance and, equally important, to service larger power flows without using more copper, since transmittable power increases as the square of voltage.
Solar inverters are essentially ‘controlled current’ devices. Their power output is determined by whatever the voltage is at their terminals multiplied by the current they push out. The source of voltage for the average suburban solar inverter is its nearest 11kV/400/230V transformer. The source impedance of this transformer is much lower than the ‘aggregate source impedance’ of the collection of solar inverters connected to it; therefore, the voltage rise at the transformer is very small, if any. However, power can flow through the transformer in the reverse direction (from the low-voltage to the high-voltage side). If this sounds odd, remember that in a transformer the primary and secondary amp-turns balance: an increase in value, or a change in flow direction, in one winding is matched by the other.
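A back-of-envelope sketch of the point about amp-turn balance and the small voltage rise at the transformer is below; the transformer impedance, turns ratio and export figures are invented for the example, not ratings of any particular unit.

```python
# Back-of-envelope sketch: reverse power flow through a distribution transformer
# and the (small) voltage rise it causes at the low-voltage bus. All figures are
# invented for illustration; they are not ratings of any particular transformer.
TX_IMPEDANCE = 0.02            # effective per-phase source impedance of the transformer (ohm), assumed
TURNS_RATIO = 6350.0 / 230.0   # roughly 11 kV line (6.35 kV phase) to 230 V, approximate

export_current_lv = 150.0      # aggregate export current per phase from rooftop solar (A), assumed

# Amp-turn balance: the secondary (LV) current is mirrored on the primary side,
# scaled by the turns ratio; reversing the LV current direction reverses the HV current too.
primary_current = export_current_lv / TURNS_RATIO
print(f"primary-side current: {primary_current:.1f} A flowing back towards the 11 kV feeder")

# The LV bus voltage rise across the transformer's low source impedance is small.
voltage_rise = export_current_lv * TX_IMPEDANCE
print(f"voltage rise at the transformer LV terminals: ~{voltage_rise:.1f} V")
```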
| Mode | Requirement |
| --- | --- |
| DRM0 | Operate the disconnection device |
| DRM1 | Do not consume power |
| DRM2 | Do not consume more than 50% of rated power |
| DRM3 | Do not consume more than 75% of rated power AND supply reactive power if capable |
| DRM4 | Increase power consumption |
| DRM5 | Do not generate power |
| DRM6 | Do not generate more than 50% of rated power |
| DRM7 | Do not generate more than 75% of rated power AND absorb reactive power if capable |
| DRM8 | Increase power generation subject to constraints from other active DRMs |
At the inverter terminals, particularly when connected via a long service line, the voltage can rise and fall markedly because of the relatively high resistance of the line. Contractors will be aware of nuisance tripping on the high-voltage (anti-islanding) limit, with the inverter reconnecting once the voltage at its terminals has dropped, and then the process repeating, to the considerable annoyance of the solar system owner.
As to the primary side of the transformer, it can be on a load bus serving other three-phase or single-phase transformers that are meeting the load current requirements of consumers without solar systems. However, this is an increasingly unlikely situation, and feeders are therefore experiencing reduced current, and even reverse current, raising voltage at load buses during periods of high insolation. Under reverse current conditions a similar process to that described for the low-voltage side of the transformer takes place, i.e. the voltage at the zone substation barely changes, reflecting its very low source impedance.
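A minimal sketch of why long, resistive service lines aggravate this nuisance tripping: it simply adds the I·R rise along the service line to the transformer terminal voltage. The resistance, current and trip figures are assumptions for illustration.

```python
# Minimal sketch: voltage at the inverter terminals when exporting over a long,
# mainly resistive service line. Resistance, voltage and current values are assumptions.
V_TRANSFORMER = 245.0   # voltage at the distribution transformer LV terminals (V), assumed
SERVICE_LINE_R = 0.4    # resistance of the service line and LV mains back to the transformer (ohm), assumed
TRIP_LIMIT = 258.0      # sustained over-voltage level at which the inverter disconnects (V), illustrative

def terminal_voltage(export_current):
    """Voltage at the inverter terminals while exporting (volts)."""
    return V_TRANSFORMER + export_current * SERVICE_LINE_R

for amps in (10, 20, 30, 40):
    v = terminal_voltage(amps)
    status = "inverter trips, export stops, voltage falls back" if v >= TRIP_LIMIT else "ok"
    print(f"{amps:>2} A export -> {v:.1f} V at the terminals ({status})")
```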
Review of AS/NZS 4777.2:2020
There are major revisions in the new standard (see inset 1), in particular in regard to the operation of inverters with DREDs (demand response enabling devices). The only one currently mandatory is DRM0. However, as can be seen from inset 2, the various DRM modes are also part of AS/NZS 4755. That standard relates to consumption devices and apparatus, some capable of generating power, e.g. electric vehicle charging/discharging to the grid. These requirements come into force in various years, and the complexity for grids of providing controls within their networks is therefore very likely to increase. Other changes to the old standard include:
- Revision of sustained frequency response
- Revisions to meet network requirements (in particular voltage)
- Provision of demand response and power quality
- Withstand capabilities including voltage, rate of change of frequency (RoCoF) and phase shift
- Stand-alone inverters and revised testing procedures
Voltage response of inverters is a very important feature of the new standard, including volt-var and volt-watt response. The volt-watt response operates between 253 and 260 volts, line to neutral, requiring a linear drop-off from 100% of maximum inverter power at 253 volts to 20% at 260 volts.
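As a sketch of the volt-watt requirement just described (100% of rated power at 253 V falling linearly to 20% at 260 V), the function below interpolates the power limit from the measured voltage; treat it as an illustration of the curve’s shape, not as a compliance implementation.

```python
# Sketch of the volt-watt response curve: 100% of rated power at 253 V,
# falling linearly to 20% at 260 V. Illustration only, not a compliance implementation.
V_START, V_END = 253.0, 260.0
P_START, P_END = 1.00, 0.20    # fractions of rated power

def volt_watt_limit(voltage):
    """Return the allowed output power as a fraction of rated power."""
    if voltage <= V_START:
        return P_START
    if voltage >= V_END:
        return P_END
    slope = (P_END - P_START) / (V_END - V_START)
    return P_START + slope * (voltage - V_START)

for v in (250, 253, 256, 258, 260, 262):
    print(f"{v} V -> {volt_watt_limit(v) * 100:.0f}% of rated power")
```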
Volt-var response operates from a low line voltage of 207 volts to a high limit of 258 volts. Inverters have to be capable of both lagging (absorbing) and leading (supplying) vars. The terminology is confusing, so a brief explanation follows. Mostly, networks supply power to loads at lagging power factor, i.e. the current lags the voltage by some angle; the component of the current at 90° to the voltage, multiplied by the voltage, is the ‘absorbed’ vars. An inverter required to absorb vars draws a 90° lagging current component while delivering real power to its load and possibly exporting some power. Thus, when an inverter is absorbing vars, it will supply any vars required by the load and draw the excess from the network.
When an inverter has to supply vars, it supplies a current with a leading reactive component. Supplying or absorbing vars affects the feeders of a network because of their inductance; were it not for that, the volt-var response would be of no value. In feeders, the voltage drop is the result of a resistance drop and an inductance drop. If the voltage is high at a load point, drawing lagging (absorbing) vars reduces it. If the voltage is low, in part because of the drop across the inductive part of the feeder, counteracting that drop by supplying leading vars will raise it.
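The value of var exchange comes from the feeder impedance, as the paragraph above explains. A rough sketch using the common approximation that the voltage change is about (R·P + X·Q)/V shows how absorbing vars pulls the voltage down and supplying vars lifts it; the feeder impedance and power figures are assumptions.

```python
# Rough sketch of why var exchange moves voltage: the approximate voltage change
# at a connection point is dV ~ (R*P + X*Q) / V, where P and Q are the real and
# reactive power injected there. Feeder impedance and power values are assumptions.
R_FEEDER = 0.5   # feeder resistance back to the source (ohm), assumed
X_FEEDER = 0.3   # feeder reactance back to the source (ohm), assumed
V_NOM = 230.0    # nominal voltage (V)

def voltage_change(p_watts, q_vars):
    """Approximate voltage rise (+) or drop (-) caused by injecting P and Q."""
    return (R_FEEDER * p_watts + X_FEEDER * q_vars) / V_NOM

print(f"5 kW export, no vars:          {voltage_change(5000, 0):+.1f} V")
print(f"5 kW export, absorbing 3 kvar: {voltage_change(5000, -3000):+.1f} V")
print(f"no export, supplying 3 kvar:   {voltage_change(0, 3000):+.1f} V")
```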
The volt-watt response is easy to understand intuitively. It is actually the in-phase component of the current delivered by the inverter that determines watts, as inverters have no control over line voltage. It is doubtful that inverters incorporate watt meters; a smart meter with dual elements could do the job. However, reducing current when voltage is high is an obvious way of getting the voltage down. The volt-watt response operates between 253 and 260 volts.
A word of caution regarding voltage in distribution networks: as customers go for bigger solar systems and export increases, networks are running out of voltage control measures. The usual method of voltage control is an on-load tap-changing (OLTC) transformer supplying a group of feeders. An automatic voltage regulator (AVR) senses the voltage rising above or falling below the desired reference value and adjusts taps on the primary winding to restore it; this takes account of the voltage drop in feeders at load points. On some AVRs a line drop compensator adjusts the reference voltage to allow for load current variation. However, with increasing reverse current flow, the line drop compensator senses reduced feeder current and adjusts the AVR reference voltage to a lower value. Initially this might result in customers on non-solar feeders ‘enjoying’ a lower voltage than usual, but eventually the transformer runs out of taps when the bottom tap is reached.
Harmonic emissions
Harmonic emissions are specified as voltages, but as inverters supply current, a reference impedance of 0.24 Ω + j0.15 Ω has to be used (the ‘j’ denotes the inductive, i.e. reactive, part). Permissible harmonic voltages are shown below.
| Harmonic order number | Limit (% of fundamental) |
| --- | --- |
| 3 | 0.9% |
| 5 | 0.4% |
| 7 | 0.3% |
| 9 | 0.2% |
| Even harmonics 2 to 10 | 0.2% each |
| 11 to 50 | 0.1% |
| THDV to the 50th | 5% |
Note: THDV (total harmonic distortion for voltage) is the all-important number. Up to now, harmonic emissions haven’t been a problem for networks. However, with a possible increase in power line communication, for example to send signals to DER, harmonic distortion will be taken more seriously, in particular emissions between the 11th and 50th orders.
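To show how current emissions translate into the voltage limits of the table via the reference impedance of 0.24 Ω + j0.15 Ω, here is a small sketch; it assumes the inductive part of the impedance scales with harmonic order and that the percentages are taken against a 230 V fundamental, and the injected harmonic currents are invented numbers.

```python
import math

# Sketch: converting harmonic current emissions into harmonic voltages using the
# reference impedance 0.24 + j0.15 ohm, assuming the j0.15 part is inductive and
# therefore scales with harmonic order. The injected currents are invented numbers.
R_REF, X_REF_FUND = 0.24, 0.15
V_FUND = 230.0   # assumed fundamental voltage for the percentage calculation

harmonic_currents = {3: 2.0, 5: 0.8, 7: 0.5, 9: 0.3}   # amps per harmonic, assumed

harmonic_voltages = {}
for order, amps in harmonic_currents.items():
    z = complex(R_REF, X_REF_FUND * order)   # reference impedance at that harmonic order
    harmonic_voltages[order] = amps * abs(z)

for order, volts in harmonic_voltages.items():
    print(f"harmonic {order}: {volts:.2f} V = {volts / V_FUND * 100:.2f}% of fundamental")

# Total harmonic distortion contributed by these orders alone (root-sum-square).
thd = math.sqrt(sum(v ** 2 for v in harmonic_voltages.values())) / V_FUND * 100
print(f"THDV from these orders alone: {thd:.2f}%")
```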
Voltage phase and RoCoF withstand
This will turn out to be a tough test for some inverters. The inverter must stay connected through an instantaneous phase shift of 60° and a rate of change of frequency (RoCoF) of ±4.0 Hz/s sustained for 0.25 seconds (which amounts to a frequency excursion of 1 Hz).
Conclusion
Not so much to put the ‘frighteners’ on the solar industry as to make it aware of the problems developing with distributed energy resources in networks: the time is now to take into account increased network opposition to solar installations. This can take the form of pesky performance tests, hitherto not applied because they were too cumbersome, but likely to be revived to slow down connections. The fact is that connection of rooftop solar has become a big problem for AEMO, one it doesn’t comment on publicly. Problems with voltage in distribution networks are only part of the difficulties, and there are no easy fixes. Unpopular measures already foreshadowed include making solar owners pay for being connected to the grid, and curtailment. The latter is likely to occur within the next few years in order to preserve voltage and frequency stability in high-voltage networks.
Other applications of the standard
- Electric storage water heaters (resistive heating), AS/NZS 4755.3.3:2014 or AS/NZS 4755.2 (when published): compliance with demand response mode 1 (DRM1) to be required for electric storage water heaters of 50 to 710 litres registered after 1 July 2023.
- Devices controlling swimming pool pump units, AS/NZS 4755.3.2:2014 or AS/NZS 4755.2 (when published): compliance with DRM1 to be required for pool pump controllers supplied or offered for supply from 1 July 2024; compliance with DRM1, DRM2 and DRM3 to be required for controllers supplied or offered for supply from 1 July 2026.
- Electric vehicle charge/discharge controllers capable of managing the charging and/or discharging to the grid of EVs intended for residential applications and capable of charging at SAE Level 2 or IEC Mode 3: compliance with AS/NZS 4755 DRMs 0, 1, 2, 3, 4, 5 and 8 to be required (6 and 7 optional).