The Monte Carlo method is a class of computational algorithms that rely on repeated random sampling to compute their results. In uncertainty analysis, the relationship between the dependent and independent variables can be described as Z = h(X), where X = [x1, x2, ..., xm] is the vector of independent variables and Z = [z1, z2, ..., zn] is the vector of dependent variables. h = [h1(X), h2(X), ..., hn(X)] represents the functional relationship between the input and output variables. In general, if h is very complex, it is hard to solve for the probability distribution of Z analytically. In this situation, Monte Carlo methods are employed to compute a discrete frequency distribution that approximately simulates the probability distribution. The essence of uncertainty analysis is to estimate the statistical properties of Z based on the statistical properties of X and the function h (Fishman, 1996). The most important statistical property in uncertainty analysis is the probability distribution, which is usually described by the probability density function (PDF). The PDF describes the probability density of a variable at a given value (Fishman, 1996). Therefore, the main objective of uncertainty analysis is to estimate the PDF of the dependent variable from the PDF of the independent variables and their relationship function. Monte Carlo simulation is a repetitive procedure: (1) a random sample of the independent variable X is generated from its PDF; (2) the vector Z is calculated from the relationship function h; (3) steps (1) and (2) are repeated, and when the sample size (the number of repetitions) is large enough, the PDF of the dependent variable Z can be estimated. The justification of Monte Carlo simulation comes from two basic theorems of statistics: (i) the Weak Law of Large Numbers and (ii) the Central Limit Theorem.
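The three-step procedure above can be sketched in a few lines of Python. The relationship function h below is a hypothetical illustration (the text leaves h abstract), and the independent variables are assumed to be standard normal; the discrete PDF estimate of Z is obtained from a normalized histogram.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical relationship function h mapping X = [x1, x2] to a scalar Z.
def h(x):
    return x[0] ** 2 + np.sin(x[1])

N = 100_000  # sample size (number of repetitions)

# Step (1): generate random samples of X from its assumed PDF.
X = rng.normal(size=(N, 2))

# Step (2): calculate Z through the relationship function h.
Z = np.array([h(x) for x in X])

# Step (3): after N repetitions, estimate the PDF of Z from the
# normalized histogram (the discrete frequency distribution).
density, edges = np.histogram(Z, bins=50, density=True)

# A valid PDF estimate integrates to 1 over its support.
print(np.sum(density * np.diff(edges)))
```

A kernel density estimator could replace the histogram in step (3); the histogram is used here only because it matches the "discrete frequency distribution" described in the text.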
Based on these two theorems, it can be proved that, as the sample size increases, the PDF of the dependent variable obtained by Monte Carlo simulation approaches that of the population.
7.2 Probabilistic small signal stability incorporating wind farm
The flow chart of the Monte Carlo simulation technique for power system small signal stability analysis with consideration of wind generation intermittence is given in Fig. 6. The uncertainty of wind generation stems from the uncertainty of wind speed, so we begin with the probability distribution of the wind speed. Fig. 7 shows a Weibull distribution of wind speed with shape k = 2 and scale c = 10. When a random wind speed is generated, the mechanical power extracted from the wind is calculated via a wind turbine model, usually given by a function approximation. If the wind speed Vm is less than the cut-in speed Vcut-in or greater than the cut-off speed Vcut-off, the wind farm is tripped. If the current wind speed lies within the range from cut-in to cut-off, the wind farm is kept connected to the grid in the power flow calculation and small signal stability analysis. The process is repeated until the pre-set sample size N is reached. Finally, probabilistic-statistical analysis can be conducted on the results from the different wind speed conditions above to reveal the impact of wind generation intermittence on power system small signal stability.
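The sampling loop of Fig. 6 can be sketched as follows. The Weibull parameters (k = 2, c = 10) come from the text; the cut-in, rated, and cut-off speeds, the rated power, and the cubic power-curve approximation are hypothetical placeholders for the turbine model, which in practice comes from the manufacturer's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Weibull wind-speed parameters from the text: shape k = 2, scale c = 10.
K, C = 2.0, 10.0
# Hypothetical turbine parameters (illustrative values only).
V_CUT_IN, V_RATED, V_CUT_OFF = 3.0, 12.0, 25.0
P_RATED = 2.0  # rated mechanical power, MW

def mechanical_power(v):
    """Piecewise power-curve approximation of the wind turbine model."""
    if v < V_CUT_IN or v > V_CUT_OFF:
        return None  # wind farm tripped: excluded from the grid model
    if v >= V_RATED:
        return P_RATED
    # Cubic interpolation between cut-in and rated speed (an assumption).
    return P_RATED * (v ** 3 - V_CUT_IN ** 3) / (V_RATED ** 3 - V_CUT_IN ** 3)

N = 50_000  # pre-set Monte Carlo sample size

# Draw N random wind speeds from the Weibull(k, c) distribution.
speeds = C * rng.weibull(K, size=N)
powers = [mechanical_power(v) for v in speeds]
connected = [p for p in powers if p is not None]

# Statistics over the connected cases would feed the power flow and
# small signal stability calculations at each repetition.
print(f"connected fraction: {len(connected) / N:.3f}")
print(f"mean power: {np.mean(connected):.3f} MW")
```

In the actual procedure, each connected sample triggers a full power flow and eigenvalue analysis rather than a simple power-curve evaluation; the sketch only reproduces the sampling and trip logic.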