Signal Theory - personal.rdg.ac.uk
CY2G2/SE2A2 Information Theory and Signals
Aims: To discuss further concepts in information theory and to introduce
signal theory.
Outcomes: An ability to quantify information transfer in noisy channels,
continuous sources and communication systems. An understanding of coding
to minimise errors and of fundamental concepts in signal theory.
Book: M.J. Usher & C.G. Guy, "Information and Communication for
Engineers", Macmillan (strongly recommended)
Information Theory:
Review of fundamental information theory, Matching source to channel,
Information in noisy channels, Coding in noisy channels, Shannon's second
theorem, Coding methods, Information in continuous sources, Ideal
communication theorem, Implications and applications.
Signals ---- 1
1
Signal Theory:
Random noise and its properties, Introduction to signal theory, time domain
properties, frequency domain representations, Autocorrelation, Cross
correlation, convolution and their properties, Fourier series, application to
simple waveforms, complex form and application to pulse train, deductions
and implications. Theory and properties of Fourier Transforms, and their
applications including autocorrelation, power spectrum, convolution and linear
systems, and sampling theory.
Course work: Laboratory practicals
400 Electrical noise
418 Correlations and Convolution
417 Binary codes
419 Fourier Series
Examination:
CY2G2: Four questions in one section; answer at least three.
Please see the SE2A2 module description for the division of exam questions.
• Noise
Random noise occurs in any practical information transmission system. Its
value at any instant is unpredictable.
Received signal = Actual signal + Noise
Random noise is an unwanted signal: it produces errors or changes in the
amplitude of the actual (wanted) signal, and it reduces the overall
information transfer.
Most signals have some kind of pattern or regularity, or their plot is
smooth, e.g. a sine wave. Noise is unpredictable in value: at the next time
step it could take any value in its range, and its plot is not smooth.

Received signal = Actual signal + Noise

This results in a signal in a mixed form. Its predictability depends on the
S/N ratio, the ratio Var(S)/Var(N).

Random noise is an unwanted signal, producing errors or changes in the
amplitude of the actual (wanted) signal, and reducing the overall
information transfer.

The noise level is represented by its variance or standard deviation. Given
a signal, noise with a higher variance corrupts the received signal to a
greater extent than noise with a lower variance.
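The role of the Var(S)/Var(N) ratio can be checked numerically. A minimal sketch in Python/NumPy (the course homework uses Matlab; the signal frequency, noise level and sample count here are illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(10_000) / 1000.0              # 10 s of samples at 1 kHz (illustrative)
signal = np.sin(2 * np.pi * 5 * t)          # wanted signal: a 5 Hz sine wave
noise = 0.5 * rng.standard_normal(t.size)   # Gaussian noise, standard deviation 0.5
received = signal + noise                   # Received signal = Actual signal + Noise

# S/N ratio as Var(S)/Var(N): a sine of amplitude A has variance A^2/2,
# so here Var(S) = 0.5 and Var(N) is about 0.25, giving a ratio near 2.
snr = np.var(signal) / np.var(noise)
print(snr)
```

A noise of larger standard deviation drives the ratio down and makes the wanted signal harder to recover from `received`.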
[Figure: two amplitude distributions P(x) plotted against x]
• Properties of noise
(i) Time domain
The essential feature of electrical noise is that it is unpredictable in the
time domain. Its amplitude follows a Gaussian distribution:
p(v) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{v^2}{2\sigma^2} \right)

mean: \bar{v} = 0, \qquad mean square: \overline{v^2} = \sigma^2
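These properties can be verified by sampling. A sketch in Python/NumPy (the value of sigma and the sample count are illustrative):

```python
import numpy as np

# Sample Gaussian noise and check the stated properties numerically.
sigma = 2.0
rng = np.random.default_rng(1)
v = sigma * rng.standard_normal(100_000)

mean = np.mean(v)               # should be close to the stated mean, 0
mean_square = np.mean(v ** 2)   # should be close to sigma**2 = 4

# Peak of the density at v = 0, from p(0) = 1 / (sigma * sqrt(2*pi))
p_peak = 1 / (sigma * np.sqrt(2 * np.pi))

# Empirical density near v = 0: fraction of samples with |v| < 0.1,
# divided by the band width 0.2
density_at_zero = np.mean(np.abs(v) < 0.1) / 0.2

print(mean, mean_square, p_peak, density_at_zero)
```

The empirical density near zero matches the formula's peak value, confirming the Gaussian amplitude distribution.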
The frequency domain is a commonly used setting for signal processing. The
frequency response describes a system's characteristics through its response
to sinusoidal signals. If a sine wave is fed into a system (the input), the
output is also a sine wave, but with a different amplitude and usually a
phase shift. By sweeping the frequency of the input signal, we can record how
the output amplitude and phase change with frequency, and this defines the
system.

Frequency response: a plot of power (corresponding to amplitude) versus
frequency is used. For instance, a sinusoidal signal of fixed frequency is
plotted as a peak at that frequency. A sinusoidal signal is totally
predictable.
[Figure: spectrum of V(t) = 5 sin(60π t): a single peak of amplitude 5 volts at a frequency of 30 Hz]
(ii) Frequency domain
Unpredictability in the time domain corresponds to flatness in the frequency
domain. (No peak means no periodicity, which means unpredictable.)
Noise can be compared to white light, which has a flat frequency spectrum.
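The contrast between a sinusoid's peaked spectrum and noise's flat one is easy to demonstrate. A sketch in Python/NumPy (sample rate and lengths are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, fs = 4096, 1024                        # samples and sampling rate in Hz
t = np.arange(n) / fs

sine = 5 * np.sin(2 * np.pi * 30 * t)     # predictable: all power at 30 Hz
noise = rng.standard_normal(n)            # unpredictable: power spread evenly

freqs = np.fft.rfftfreq(n, 1 / fs)
sine_spec = np.abs(np.fft.rfft(sine)) / n
noise_spec = np.abs(np.fft.rfft(noise)) / n

# The sine spectrum is a single peak at 30 Hz ...
print(freqs[np.argmax(sine_spec)])        # 30.0
# ... while the noise spectrum is flat: no frequency bin dominates
print(noise_spec.max() / noise_spec.mean())
```

The flatness ratio for the noise stays small (no bin stands far above the average), whereas the sine concentrates everything in one bin.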
(iii) Representation
Divide the flat noise spectrum into narrow slices of width \delta f_i and
represent each slice by a sinusoidal oscillator with frequency f_i equal to
that of the centre of the slice:

a_i \cos(\omega_i t + \phi_i), \quad where \ \tfrac{1}{2} a_i^2 = P_0 \,\delta f_i \ is the mean power in the slice and \phi_i is a random phase.

We have an infinite number of sine-wave generators with the same amplitude
a_i and random phases \phi_i, so the noise waveform is given by

v(t) = \sum_i a_i \cos(\omega_i t + \phi_i) = \sum_i (2 P_0 \,\delta f_i)^{1/2} \cos(\omega_i t + \phi_i)
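This construction can be simulated directly. A sketch in Python/NumPy, using an illustrative band of 200 one-hertz slices and an assumed spectral density P0 (both values are for the demo only):

```python
import numpy as np

rng = np.random.default_rng(3)
P0 = 2.0                                  # power spectral density per Hz (assumed)
df = 1.0                                  # slice width delta-f in Hz
f = np.arange(1, 201) * df                # 200 slices covering a 200 Hz band
a = np.sqrt(2 * P0 * df)                  # same amplitude for every slice
phi = rng.uniform(0, 2 * np.pi, f.size)   # independent random phases

# v(t) = sum_i a cos(2*pi*f_i*t + phi_i), sampled over one second
t = np.linspace(0, 1, 20_000, endpoint=False)
v = np.sum(a * np.cos(2 * np.pi * f[:, None] * t + phi[:, None]), axis=0)

# Mean square of v(t) equals the total power P0 * bandwidth = 2 * 200
print(np.mean(v ** 2))                    # close to 400
```

Each slice contributes mean power a²/2 = P0·δf, so the mean squares of the slices add up to P0 times the bandwidth; with enough slices the waveform also looks Gaussian, as the central limit theorem predicts.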
(iv) Addition of random generators
Instantaneous sum:

v(t) = v_1(t) + v_2(t)

Mean square value:

\overline{v^2} = \overline{v_1^2} + \overline{v_2^2} + 2\,\overline{v_1 v_2}

Since the two waveforms are independent, 2\,\overline{v_1 v_2} = 0, so

\overline{v^2} = \overline{v_1^2} + \overline{v_2^2}

So the mean squares of the two noise waveforms add to give the mean square
of v(t).
Note that for signals of the same frequency and phase, as shown below,

s(t) = s_1(t) + s_2(t)

\overline{s^2} = \overline{s_1^2} + \overline{s_2^2} + 2\,\overline{s_1 s_2}, \quad with \ 2\,\overline{s_1 s_2} \neq 0

(\overline{s^2})^{1/2} = (\overline{s_1^2})^{1/2} + (\overline{s_2^2})^{1/2}

The root mean squares add to give the root mean square of s(t).
Example:

Noise: v_1(t) = 2 V rms, v_2(t) = 3 V rms

v(t) = \sqrt{2^2 + 3^2} = \sqrt{13} \approx 3.6 V rms

Signal: s_1(t) = 2 V rms, s_2(t) = 3 V rms

s(t) = 2 + 3 = 5 V rms
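Both results of the example can be reproduced numerically. A sketch in Python/NumPy (sample count and the 50 Hz test frequency are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
t = np.arange(n) / n

# Independent noise sources: rms values add as the root of the sum of squares
v1 = 2 * rng.standard_normal(n)                   # 2 V rms noise
v2 = 3 * rng.standard_normal(n)                   # 3 V rms noise
v_rms = np.sqrt(np.mean((v1 + v2) ** 2))
print(v_rms)                                      # ~ sqrt(2^2 + 3^2) = 3.6

# In-phase signals of the same frequency: rms values add directly
s1 = 2 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)  # 2 V rms sine
s2 = 3 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)  # 3 V rms, same phase
s_rms = np.sqrt(np.mean((s1 + s2) ** 2))
print(s_rms)                                      # ~ 2 + 3 = 5
```

The same comparison is the point of the Matlab homework below, there with noise of 3 and 4 V rms.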
Homework:
In Matlab:

>> a=3*randn(1000,1);
% generate a random Gaussian series of 1000 points, with standard deviation 3
>> b=4*randn(1000,1);
% generate a random Gaussian series of 1000 points, with standard deviation 4
>> c=a+b;
% generate a random Gaussian series of 1000 points as the sum of the above 2 series
>> sigma_a=std(a);
% find the standard deviation of a
>> sigma_b=std(b);
% find the standard deviation of b
>> sigma_c=std(c);
% find the standard deviation of c (the sum of a and b)
>> plot(a);
% show the random series as a figure
>> sigma_a
% shows the standard deviation of a; should be approximately 3
>> sigma_b
% shows the standard deviation of b; should be approximately 4
>> sigma_c
% shows the standard deviation of c; note that it should be approximately
% 5 = \sqrt{3^2 + 4^2}, not 7