White Gaussian noise. White noise. Physical sources of white noise.

A) White noise.

A stationary random process with a constant power spectral density at all frequencies is called white noise.

According to the Wiener-Khinchin theorem, the correlation function of white noise is

$$R(\tau) = \frac{N_0}{2}\,\delta(\tau),$$

which is zero everywhere except for the point $\tau = 0$. The average power (variance) of white noise is infinitely large.

White noise is a delta-correlated process. The uncorrelatedness of the instantaneous values of such a random signal implies an infinitely high rate of change over time: no matter how small the interval $\tau$, the signal during this time can change by any predetermined value.

White noise is an abstract mathematical model and the physical process corresponding to it, of course, does not exist in nature. However, this does not prevent us from approximately replacing real sufficiently wide-band random processes with white noise in cases where the bandwidth of the circuit affected by the random signal turns out to be significantly narrower than the effective width of the noise spectrum.

B) Gaussian (normal) distribution.

In the theory of random signals, the Gaussian probability density is of fundamental importance:

$$p(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(x-m)^2}{2\sigma^2}\right). \qquad (7.2)$$

The variable substitution $z = (x - m)/\sigma$ gives the distribution function

$$F(x) = \Phi\!\left(\frac{x-m}{\sigma}\right). \qquad (7.3)$$

Here $\Phi$ is the probability integral

$$\Phi(z) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{z} e^{-t^2/2}\,dt.$$

The graph of the function $F(x)$ has the form of a monotonic curve that increases from 0 to 1.
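In numerical work the probability integral $\Phi$ is conveniently expressed through the standard error function, since $\Phi(z) = \tfrac{1}{2}[1 + \operatorname{erf}(z/\sqrt{2})]$. A minimal Python sketch (the helper names `phi` and `normal_cdf` are ours, not from the text):

```python
import math

def phi(z: float) -> float:
    """Probability integral Phi(z) = (1/sqrt(2*pi)) * Int_{-inf}^{z} exp(-t^2/2) dt,
    expressed through the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_cdf(x: float, m: float = 0.0, sigma: float = 1.0) -> float:
    """F(x) for a Gaussian with mean m and rms sigma, via the substitution
    z = (x - m) / sigma used in (7.3)."""
    return phi((x - m) / sigma)

# F(x) is a monotonic curve running from 0 to 1:
for x in (-3.0, -1.0, 0.0, 1.0, 3.0):
    print(f"F({x:+.1f}) = {normal_cdf(x):.4f}")
```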

16. Narrow-band random process. Rayleigh distribution. Rayleigh-Rice law.

We study the properties of narrow-band random signals, in which the power spectral density has a pronounced maximum near a certain frequency $\omega_0$ different from zero. Let us define the correlation function of a narrow-band random process.

Consider a stationary random process $x(t)$ whose one-sided power spectrum $F(\omega)$ is concentrated in the vicinity of a certain frequency $\omega_0 > 0$. According to the Wiener-Khinchin theorem, the correlation function of this process is

$$R(\tau) = \int_0^{\infty} F(\omega)\cos\omega\tau\,d\omega. \qquad (7.4)$$

The substitution $\omega = \omega_0 + \Omega$ shifts the spectrum of the process from the vicinity of the frequency $\omega_0$ to the vicinity of zero frequency:

$$R(\tau) = \int_{-\omega_0}^{\infty} F(\omega_0 + \Omega)\cos[(\omega_0 + \Omega)\tau]\,d\Omega. \qquad (7.5)$$

Carrying out averaging using the Rayleigh probability density (7.22),

$$p(U) = \frac{U}{\sigma^2}\exp\left(-\frac{U^2}{2\sigma^2}\right), \quad U \ge 0,$$

we find the average value of the envelope and its variance:

$$\overline{U} = \sqrt{\pi/2}\,\sigma \approx 1.25\,\sigma, \qquad (7.23)$$

$$\sigma_U^2 = \left(2 - \frac{\pi}{2}\right)\sigma^2 \approx 0.43\,\sigma^2. \qquad (7.24)$$

Having the one-dimensional probability density of the envelope, it is possible to solve a number of problems in the theory of narrow-band random processes, in particular, to find the probability that the envelope exceeds some given level.
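For the Rayleigh density (7.22) this probability has a closed form: integrating (7.22) from a threshold $U_0$ to infinity gives $P(U > U_0) = \exp(-U_0^2/2\sigma^2)$. A one-line check in Python (the helper name `p_exceed` is ours):

```python
import math

def p_exceed(u0: float, sigma: float) -> float:
    """P(U > u0) for a Rayleigh-distributed envelope:
    the integral of (7.22) from u0 to infinity."""
    return math.exp(-u0 ** 2 / (2.0 * sigma ** 2))

print(p_exceed(3.0, 1.0))   # ~0.0111: the envelope rarely exceeds 3*sigma
```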

Random variables distributed according to Rayleigh's law describe, in particular, the envelope of narrow-band Gaussian noise.

The simplest task is to find the one-dimensional probability density of the envelope of the total oscillation. Assuming that the useful signal is the harmonic oscillation $A\cos\omega_0 t$ while the noise $n(t)$ is narrow-band and Gaussian, we write the expression for the realization of the total process $X(t)$:

$$X(t) = A\cos\omega_0 t + n(t).$$

This random process is narrow-band, so its realization can be expressed in terms of the slowly changing envelope $U(t)$ and the initial phase $\varphi(t)$:

$$X(t) = U(t)\cos[\omega_0 t + \varphi(t)].$$

In the new variables we have the joint probability density

$$p(U, \varphi) = \frac{U}{2\pi\sigma^2}\exp\left(-\frac{U^2 - 2UA\cos\varphi + A^2}{2\sigma^2}\right). \qquad (7.26)$$

Now, in order to obtain the one-dimensional probability density of the envelope, one should integrate the right-hand side of formula (7.26) over the angular coordinate, as a result of which we find

$$p(U) = \frac{U}{\sigma^2}\exp\left(-\frac{U^2 + A^2}{2\sigma^2}\right) I_0\!\left(\frac{UA}{\sigma^2}\right), \qquad (7.27)$$

where $I_0$ is the modified Bessel function of zero order. This formula expresses a law called Rice's law. Note that when $A = 0$, i.e. in the absence of a deterministic signal, Rice's law becomes Rayleigh's law.

For a large signal-to-noise ratio $A/\sigma \gg 1$ the Bessel function can be replaced by its asymptotic form $I_0(z) \approx e^z/\sqrt{2\pi z}$. Substituting this expression into (7.27), we have

$$p(U) \approx \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(U-A)^2}{2\sigma^2}\right), \qquad (7.28)$$

i.e. the envelope of the resulting signal is distributed in this case approximately normally, with variance $\sigma^2$ and mathematical expectation $A$. In practice it is considered that already at $A/\sigma \gtrsim 3$ the envelope of the resulting signal is normally distributed.
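Both limiting cases are easy to verify numerically. The sketch below (our own check, with Rice's law (7.27) coded directly via the modified Bessel function) confirms that at A = 0 it coincides with Rayleigh's law and that at A/σ = 3 it is already close to the normal approximation (7.28):

```python
import numpy as np
from scipy import special, stats

sigma = 1.0
U = np.linspace(0.1, 7.0, 200)          # envelope values

def rice_pdf(u, a, sigma):
    """Rice's law (7.27): (U/sigma^2) exp(-(U^2+A^2)/(2 sigma^2)) I0(U A / sigma^2)."""
    return (u / sigma**2) * np.exp(-(u**2 + a**2) / (2 * sigma**2)) \
           * special.i0(u * a / sigma**2)

# A = 0: Rice's law reduces to Rayleigh's law
print(np.allclose(rice_pdf(U, 0.0, sigma),
                  stats.rayleigh(scale=sigma).pdf(U)))          # True

# A/sigma = 3: Rice's law is close to N(A, sigma^2), cf. (7.28)
gauss = stats.norm(loc=3.0, scale=sigma).pdf(U)
print(np.max(np.abs(rice_pdf(U, 3.0, sigma) - gauss)))          # small discrepancy
```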

Additive white Gaussian noise (AWGN) is a type of interfering influence in an information transmission channel. It is characterized by a uniform spectral density, a normally distributed amplitude, and an additive way of influencing the signal; it is the most common type of noise used to analyze and model radio communication systems. The term "additive" means that this type of noise is added to the useful signal. In contrast to additive noise, one can speak of multiplicative noise, which multiplies the signal rather than adding to it.
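In simulation, an AWGN channel is usually modeled by scaling independent Gaussian samples to a target signal-to-noise ratio and adding them to the signal. A minimal NumPy sketch (the helper `awgn` and the SNR-based scaling are our illustrative choices, not a prescribed API):

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn(signal: np.ndarray, snr_db: float) -> np.ndarray:
    """Add white Gaussian noise to a real-valued signal so that the
    resulting signal-to-noise ratio equals snr_db decibels."""
    p_signal = np.mean(signal ** 2)              # average signal power
    p_noise = p_signal / 10 ** (snr_db / 10)     # required noise power
    noise = rng.normal(0.0, np.sqrt(p_noise), signal.shape)
    return signal + noise                        # "additive": the noise adds

t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 5 * t)       # useful signal
y = awgn(x, snr_db=10.0)            # observation after the AWGN channel
```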


9. White noise

  • 9.1. Definition of white noise.
  • 9.2. Gaussian white noise.
  • 9.3. Physical sources of white noise.
  • 9.4. Correlation of processes.

9.1. Definition of white noise

  • A random process, stationary in the narrow sense, whose power spectral density function equals a positive constant is called white noise.
  • The name comes from optics: white light is obtained by mixing waves of different frequencies in the visible range.
  • The mathematical expectation of white noise is usually taken to be zero, m = 0.
  • Since white noise is a stationary process in the narrow sense, its autocorrelation function depends on a single argument τ;
  • KXX(τ) is even.


  • The spectral density function KXX(ω) is obtained from the autocorrelation function KXX(τ) by the Fourier transform, and since KXX(τ) is even, the cosine transform can be used.
  • Let KXX(ω) = c > 0. The inverse Fourier transform (or inverse cosine transform) of a constant function is the δ-function with coefficient c: KXX(τ) = c δ(τ).


  • Therefore, white noise is an uncorrelated process: the random variables X(t1) and X(t2) have zero correlation (they are linearly unrelated) for any t1 ≠ t2. The distribution of the random variable X(t0) is not specified in the definition of white noise; it can be anything.
  • The signal energy is proportional to the integral of the spectral density, ∫ c dω, which diverges.
  • It follows that white noise does not physically exist.

9.2. Gaussian white noise

  • Consider a stationary uncorrelated Gaussian process.
  • Let the mathematical expectation of the process be a = 0 and its root-mean-square value be σ; in view of the zero mathematical expectation, the variance is then KXX(0) = σ².
  • If σ tends to infinity, such a Gaussian process tends to white noise. But in a real application one has to settle on a specific value of the root-mean-square σ. We set σ = 10 and find the spectral density of such a process.


  • The Fourier transform of the function KXX(τ) of the Gaussian process can be found by passing to the limit (as ε tends to 0) of the Fourier transform of the rectangular pulse R(σ², ε, t) (see 3.8, Examples of Fourier transforms).

On the right-hand side, a function is obtained that tends to the spectral density function KXX(ω) of white noise as ε → 0.


  • Graphs of the approximation to the spectral density obtained from the Gaussian process at σ = 10, for ε = 1, 0.5, 0.1.


  • The function does tend to a constant, but this constant is zero. Nevertheless, on a limited frequency interval the function can approximately be treated as a nonzero constant.
  • Thus, a stationary uncorrelated Gaussian process can be regarded as an approximation to white noise, and it is indeed used this way in practical problems.


  • Using the ergodicity property of the Gaussian process, we estimate the autocorrelation and spectral density functions from a single realization of n = 1000 measurements.
  • Graph of a realization of the uncorrelated Gaussian process at a = 0, σ = 10.


  • Autocorrelation function estimation plot (statistical autocorrelation function) at n=1000 , a = 0, σ = 10.


  • Graph of the statistical (sample) spectral density function at n = 1000, a = 0, σ = 10 (the integral was calculated by the method of rectangles; the red horizontal line is the mean value of the function). A sketch reproducing these estimates follows below.
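The three plots described on these slides can be reproduced numerically. A NumPy sketch under the slides' parameters (the estimator choices — a sample autocorrelation and a raw periodogram — are ours):

```python
import numpy as np

rng = np.random.default_rng(1)
n, a, sigma = 1000, 0.0, 10.0
x = rng.normal(a, sigma, n)   # one realization of the uncorrelated Gaussian process

def acf(x: np.ndarray, max_lag: int) -> np.ndarray:
    """Statistical (sample) autocorrelation function K(tau)."""
    xc = x - x.mean()
    return np.array([np.mean(xc[: len(xc) - k] * xc[k:]) for k in range(max_lag + 1)])

K = acf(x, 20)
print("K(0), expected ~ sigma^2 = 100:", K[0])
print("K(tau) for tau = 1..3, expected ~ 0:", K[1:4])

# Spectral density estimate (periodogram): roughly flat around sigma^2
S = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n
print("mean spectral level, expected ~ 100:", S.mean())
```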


  • Any uncorrelated stationary process (stationarity in the narrow sense suffices) can be chosen as an approximation to white noise. For example, we can take a discrete process D(t) with two equiprobable states +1 and -1; at the moments t = 0, 1, 2, … the process takes one of these states. (One caveat: if we calculate the correlation from the joint distribution of two such values, it turns out to be nonzero.)
  • Exercise. Find the joint-distribution correlation and the characteristics of the process D(t): mathematical expectation, variance, autocorrelation function, spectral density function. A simulation sketch for checking the answers follows below.
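As a numerical companion to the exercise (a simulation sketch, not the requested analytic answer), one can estimate the characteristics of D(t) by sampling:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
d = rng.choice([-1.0, 1.0], size=n)   # D(t): equiprobable states +1 and -1

print("mean,     expected 0:", d.mean())
print("variance, expected 1:", d.var())
for k in (0, 1, 2):                   # sample autocorrelation at small lags
    print(f"K({k}) =", np.mean(d[: n - k] * d[k:]))   # ~1 at k=0, ~0 otherwise
```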

9.3. Physical sources of white noise

  • White noise, like the δ-function, exists only as a mathematical abstraction. Both of these concepts arose from natural phenomena of which they are idealized abstractions.

The normal distribution, also called the Gaussian distribution or the Gauss-Laplace distribution, is a probability distribution which in the one-dimensional case is given by a probability density function coinciding with the Gaussian function:

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\; e^{-\frac{(x-\mu)^2}{2\sigma^2}},$$

where the parameter μ is the mean (also the median and the mode) of the distribution, and the parameter σ is the standard deviation (σ² is the variance) of the distribution.

Thus, the one-dimensional normal distribution is a two-parameter family of distributions. The multivariate case is described in the article "Multivariate normal distribution".

The standard normal distribution is the normal distribution with mean μ = 0 and standard deviation σ = 1.

Meaning

If a certain quantity is formed as the sum of many random, weakly interdependent quantities, each of which makes a small contribution relative to the total, then the centered and normalized distribution of such a quantity tends to the normal distribution.

Properties


If random variables $X_1$ and $X_2$ are independent and have normal distributions with mathematical expectations $\mu_1$ and $\mu_2$ and variances $\sigma_1^2$ and $\sigma_2^2$ respectively, then $X_1 + X_2$ also has a normal distribution, with expectation $\mu_1 + \mu_2$ and variance $\sigma_1^2 + \sigma_2^2$. This implies that a normal random variable can be represented as the sum of an arbitrary number of independent normal random variables.
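A quick Monte Carlo check of this property (parameters and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
mu1, s1, mu2, s2 = 1.0, 2.0, -0.5, 3.0
x = rng.normal(mu1, s1, 1_000_000) + rng.normal(mu2, s2, 1_000_000)

print("mean:    ", x.mean(), " expected:", mu1 + mu2)
print("variance:", x.var(),  " expected:", s1**2 + s2**2)
```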

Maximum entropy

The normal distribution has the maximum differential entropy among all continuous distributions whose variance does not exceed a given value.

Three-sigma rule

Three-sigma rule ($3\sigma$): almost all values of a normally distributed random variable lie in the interval $(\bar{x} - 3\sigma,\ \bar{x} + 3\sigma)$. More precisely, with probability approximately 0.9973 the value of a normally distributed random variable lies in this interval (provided that the value $\bar{x}$ is the true mean and not a sample estimate).
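The constant is easy to reproduce, since $P(|X - \mu| < 3\sigma) = \Phi(3) - \Phi(-3) = \operatorname{erf}(3/\sqrt{2})$:

```python
import math

# Probability mass of a normal distribution within +/- 3 sigma of the mean
print(f"{math.erf(3.0 / math.sqrt(2.0)):.4f}")   # 0.9973
```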

Modeling Normal Pseudo-Random Variables

The simplest approximate modeling methods are based on the central limit theorem: if we add several independent, identically distributed quantities with finite variance, the sum will be distributed approximately normally. For example, if you add 100 independent uniformly distributed random variables, the distribution of the sum will be approximately normal.
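A sketch of this recipe with 100 uniform summands; the centering and scaling constants follow from the mean 1/2 and variance 1/12 of a U(0, 1) variable:

```python
import numpy as np

rng = np.random.default_rng(4)
# 100_000 sums, each of 100 independent U(0,1) variables
s = rng.random((100_000, 100)).sum(axis=1)
z = (s - 50.0) / np.sqrt(100.0 / 12.0)   # centered and normalized sum

print("mean ~ 0:", z.mean(), "  variance ~ 1:", z.var())
print("P(|z| < 3) ~ 0.9973:", np.mean(np.abs(z) < 3))
```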

For programmatic generation of normally distributed pseudo-random variables it is preferable to use the Box-Muller transform, which generates a pair of independent normally distributed values from a pair of independent uniformly distributed ones.
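A minimal implementation of the basic Box-Muller transform (the `1.0 - random()` guard against log(0) is our own defensive detail):

```python
import math
import random

def box_muller() -> tuple[float, float]:
    """Turn two independent U(0,1) values into two independent N(0,1) values."""
    u1 = 1.0 - random.random()           # in (0, 1], so log(u1) is defined
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

z1, z2 = box_muller()   # a pair of independent standard normal values
```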

Relationship with other distributions

  • The normal distribution is a type XI Pearson distribution.
  • The ratio of a pair of independent standard normally distributed random variables has a Cauchy distribution: if the random variable $X$ is the ratio $X = Y/Z$, where $Y$ and $Z$ are independent standard normal random variables, then $X$ has a Cauchy distribution (checked numerically below).
  • If $z_1, \ldots, z_k$ are jointly independent standard normal random variables, i.e. $z_i \sim N(0, 1)$, then the random variable $x = z_1^2 + \ldots + z_k^2$ has a chi-square distribution with $k$ degrees of freedom (checked numerically below).
  • If the random variable $X$ has a lognormal distribution, then its natural logarithm has a normal distribution: if $X \sim \mathrm{LogN}(\mu, \sigma^2)$, then $Y = \ln X \sim N(\mu, \sigma^2)$; and conversely, if $Y \sim N(\mu, \sigma^2)$, then $X = \exp Y \sim \mathrm{LogN}(\mu, \sigma^2)$.
  • The ratio of the squares of two independent standard normal random variables has a Fisher distribution with $(1, 1)$ degrees of freedom.
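Two of these relations are easy to confirm by sampling (a sketch using scipy.stats; the Kolmogorov-Smirnov p-value threshold 0.01 is an arbitrary choice, so each check should typically, though not always, print True):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
z = rng.standard_normal((100_000, 3))

# Sum of k = 3 squared standard normals vs chi-square with 3 degrees of freedom
chi2_sample = (z ** 2).sum(axis=1)
print(stats.kstest(chi2_sample, stats.chi2(df=3).cdf).pvalue > 0.01)

# Ratio of two independent standard normals vs the Cauchy distribution
cauchy_sample = z[:, 0] / z[:, 1]
print(stats.kstest(cauchy_sample, stats.cauchy.cdf).pvalue > 0.01)
```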

History

For the first time, the normal distribution appeared as the limit of the binomial distribution with $p = \tfrac{1}{2}$ in 1738, in the second edition of Abraham de Moivre's work The Doctrine of Chances.

When considering a Gaussian process, it is often convenient to represent it as the sum of its mean function and a noise process with zero mean:

$$x(t) = m(t) + n(t),$$

where $n(t)$ is a Gaussian process with zero mean.

In the most interesting applied problems, for example in the case of shot noise, the mean function is a known (non-random) signal, and $n(t)$ is a Gaussian noise process that is stationary in the narrow sense. Moreover, since the mean is zero, the covariance function is equal to the correlation function.

Thus, the Fourier transform of the correlation function, i.e. the power spectral density, completely defines the zero-mean process.

In many applications of communication theory one has to deal with sources of physical noise in which the power spectral density of the Gaussian noise superimposed on the useful signal remains practically constant up to frequencies much higher than the frequencies that are fundamental in the signal itself. In such cases it follows from equalities (3.115) and (3.116) that the rms value of the noise interference can be reduced, without an undesirable effect on the useful signal, by passing the sum of signal and noise through a filter: the signal leaves the filter without significant change, while the noise is largely suppressed (Fig. 3.27). Since we are only interested in the power spectral density of the noise at the output of the filter, it matters little what the input noise spectrum looks like outside the filter's passband, where the filter's response approaches zero. In accordance with this, it is often assumed that the input noise spectrum is constant at all frequencies, and the concept of white Gaussian noise is introduced, defined as a stationary Gaussian process with zero mean

Fig. 3.27. Broadband Gaussian noise at the input of a narrowband filter. At the output of the filter, exactly the same process appears as if white noise were applied to the input.

and with power spectral density

$$S_n(f) = \frac{N_0}{2}, \qquad -\infty < f < \infty.$$

In reality, white noise can only be a fiction, since its total average power would have to equal

$$\int_{-\infty}^{\infty}\frac{N_0}{2}\,df = \infty,$$

which is meaningless. The usefulness of the concept of white noise follows from the fact that such noise, when passed through a linear filter for which

$$\int_{-\infty}^{\infty}|H(f)|^2\,df < \infty,$$

turns at the filter output into a stationary Gaussian process with zero mean, which is by no means meaningless. From equalities (3.114) and (3.132) we obtain the power spectral density of the output process,

$$S_y(f) = \frac{N_0}{2}\,|H(f)|^2,$$

whence it follows that the average output power is

$$\sigma_y^2 = \frac{N_0}{2}\int_{-\infty}^{\infty}|H(f)|^2\,df.$$

This quantity is finite by assumption (3.133b). In accordance with equalities (3.120) and (3.134a), the correlation function of the output process is

$$R_y(\tau) = \frac{N_0}{2}\int_{-\infty}^{\infty}|H(f)|^2\,e^{j2\pi f\tau}\,df. \qquad (3.135)$$

Another derivation of equality (3.135) is obtained directly from the expression for the correlation function of white noise. Notice that, in accordance with equality (3.111), the white noise process has the correlation function

$$R_n(\tau) = \frac{N_0}{2}\,\delta(\tau), \qquad (3.136b)$$

which is also useful in calculations, although it has no physical meaning. It follows from equality (3.136b) that any two sample values of white Gaussian noise are statistically independent, no matter how close to each other the moments of their observation are chosen. In a sense, white Gaussian noise describes the ultimate "randomness". Substituting expression (3.136b) into relation (3.110a), we obtain

$$R_y(\tau) = \frac{N_0}{2}\int_{-\infty}^{\infty} h(t)\,h(t+\tau)\,dt. \qquad (3.137)$$

Fig. 3.28. Passing white noise through an ideal low-pass filter.

Representing the functions $h$ as inverse Fourier transforms and changing the order of integration, we again arrive at equality (3.135). The integral on the right-hand side of equality (3.137) is often called the "correlation function" of the (deterministic) function $h(t)$.

As an example of the application of these results, consider the ideal low-pass filter shown in Fig. 3.28, whose transfer function is given as

$$H(f) = \begin{cases} 1, & |f| \le W, \\ 0, & |f| > W. \end{cases}$$

If white Gaussian noise is fed to the input of this filter, then the correlation function of the process at the output is determined by the equality

$$R_y(\tau) = N_0 W\,\frac{\sin 2\pi W\tau}{2\pi W\tau}.$$
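A discrete-time sketch of this result (all numerical values are arbitrary choices): i.i.d. Gaussian samples taken at rate fs stand in for white noise with two-sided density N₀/2 = σ²/fs, the ideal low-pass filter is applied in the frequency domain, and the measured output power is compared with R_y(0) = N₀W:

```python
import numpy as np

rng = np.random.default_rng(6)
fs, n, W = 1000.0, 2 ** 18, 100.0     # sampling rate (Hz), samples, cutoff (Hz)

sigma = 1.0
x = rng.normal(0.0, sigma, n)         # discrete stand-in for white noise
N0_half = sigma ** 2 / fs             # its two-sided spectral density N0/2

# Ideal low-pass filter H(f) = 1 for |f| <= W, applied in the frequency domain
f = np.fft.rfftfreq(n, d=1.0 / fs)
y = np.fft.irfft(np.where(f <= W, np.fft.rfft(x), 0.0), n)

print("measured output power:", y.var())
print("predicted N0 * W     :", 2 * N0_half * W)   # = R_y(0)
```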

 

