|
What is the difference between computer digital
signals and T.V. analog signals?
|
Question Date: 1998-01-23 | | Answer 1:
Suppose you had a page from a book. An analog
signal is like sending a photograph of the page
while the digital signal is like sending the
letters which form the sentences in the book.
Which do you think will be more accurate if there
is noise? Does that tell you why most of the
computers we use are digital? (There are computers
which use analog signals, and now there are TVs
which use digital signals!)
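To make the noise point concrete, here is a small Python sketch (an added illustration with made-up numbers, not part of the original answer). The same random "static" is added to an analog level and to a digital 0-or-1 level; the digital value can be recovered exactly, while the analog value stays corrupted:

    import random

    random.seed(0)

    analog_level = 0.63    # some arbitrary brightness value (made up)
    digital_bit = 1        # a single bit of a digital signal

    noise = random.uniform(-0.2, 0.2)   # the "static" picked up along the way

    # The analog receiver has no way to tell the noise from the signal:
    received_analog = analog_level + noise
    print("analog sent", analog_level, "received", round(received_analog, 3))

    # The digital receiver only has to decide "closer to 0 or to 1?",
    # so moderate noise is removed completely:
    received_bit = digital_bit + noise
    recovered_bit = 1 if received_bit > 0.5 else 0
    print("digital sent", digital_bit, "recovered", recovered_bit)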
| | Answer 2:
I am not sure which digital signals you are
referring to, so I will make a guess.

A computer outputs a signal to drive its monitor
which is different from the video signal used to
transmit television pictures. In a sense, these
two signals are not too different, since both the
television and monitor scan their screens and
display the information (recorded in the signal as
changes in signal strength) as changes in the
brightness of the scanning dot on the screen. (You
can see the scanning lines on a TV by looking at
it through a repetitive shutter such as a spinning
disk with holes cut in it). However, a computer
monitor must make a display that is much sharper
than a TV set, so that lettering can be easily
read. So it separates the Red, Green, and Blue
signals into separate wires, and thus controls the
screen directly. (All of the colors you see are
mixed from these three components on the screen).
Green is used instead of Yellow to increase the
brightness of the blue component of the screen.
The high resolution (large number of lines
possible on the screen) can cause these signals to
contain very high frequencies (>100 MHz).
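As a rough sketch of where frequencies that high come from (the resolution, refresh rate, and overhead factor below are example values I picked, not figures from the answer), the pixel rate is roughly pixels per line times lines times refresh rate:

    # Rough estimate of the pixel rate for a high-resolution monitor.
    width, height = 1280, 1024     # visible pixels (example values)
    refresh_hz = 85                # screen redraws per second (example value)
    blanking_overhead = 1.3        # extra time between lines/frames (approximate)

    pixel_rate_hz = width * height * refresh_hz * blanking_overhead
    print("approximate pixel rate: %.0f MHz" % (pixel_rate_hz / 1e6))
    # roughly 145 MHz, which is why such signals can exceed 100 MHz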
In a conventional video TV set, all of the colors
are
mixed into a Black and White signal (so that the
video is compatible with B&W TV's) and a color
difference signal which is offset by 3.579545 MHz.
Since the resolution of TV is low, there is not
much energy in this band anyway. (However, if
several narrow lines are placed together
vertically, the scanning dot going by can make
frequencies this high... leading to the false
color bands seen in some television test
patterns). Essentially, the BW signal tells how
bright to make the display, and the color
difference tells what color it should be. This
trick allows only 1 wire to be used, making it
possible to broadcast the signal as a conventional
T.V. signal. The tricky color technique and the
use of interlacing (two successive fields are
shifted and contain different data) allow the
total video signal to be packed into 4.5 MHz,
including the color parts. This number was
required to be compatible with earlier TV sets
which used simpler circuits. (Remember that TV was
invented long before transistors became common,
let alone "chips".)
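Here is a small Python sketch of the "mix into a Black and White signal plus a color difference" idea. The 0.299/0.587/0.114 weights are the standard NTSC luminance weights; the rest (the function name, the unscaled difference signals) is my own simplified illustration, not the full standard:

    def encode_ntsc_style(r, g, b):
        # r, g, b are each between 0.0 and 1.0
        y = 0.299 * r + 0.587 * g + 0.114 * b   # the "Black and White" signal
        color_diff_1 = b - y                    # how much bluer than gray
        color_diff_2 = r - y                    # how much redder than gray
        return y, color_diff_1, color_diff_2

    # A pure red pixel: fairly dark in B&W, strongly "redder than gray".
    print(encode_ntsc_style(1.0, 0.0, 0.0))

In the real standard, the two difference signals are scaled and modulated onto the 3.579545 MHz subcarrier mentioned above, and an old black-and-white set simply ignores that part and displays y.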
If you are interested
in more information on this, please check for NTSC
(the video standard) and/or RS-170 (one of the
computer standards) at your library or on the web.
Another good source is books such as the TV/Video
experimenter. However, if you experiment with a
T.V. or monitor, please be advised: a magnet
produces interesting effects but will permanently
ruin your set! Also, never open a set, as the
anode voltages in a T.V. commonly run in the
18,000 to 28,000 volt range and can remain even
when the set is off! | | Answer 3:
I think that there are two ways to answer your
question, "Not much" and "Just about everything".
It all depends on which question you are really
asking. The explanation that goes with your first
answer goes like this:
Both digital and analog signals serve the same
purpose: they take some information (like a
picture on the TV or
computer screen), encode it (I'll explain what
that means in a second), send it somewhere else
(like from the tv station to your house or from
your computer to your friend in France), and
decode it (the opposite of encoding).
Now what do I mean by encode? Let me give you an
example. Let's pretend that I am lost in my car
and call you on my cell phone. All I can tell you
is that I want to get to Burger King and that I am
right now parked at the corner of Main St. and
First. You have a map that will show me how to get
there, but I can't see your map because I'm in my
car all the way across town! What we need to do is
to transmit the information on the map from you to
me. The most complete way to do that is for you to
come find me and give me the map, but that is very
time consuming and you might have something else
you want to do with your time (like go watch
"Titanic" or "Amistad").
Another solution is for you to "encode" the
information on the map
into words, send those words to me through my cell
phone, and let me decode your words into the
information I need; i.e., take Main north for three
blocks, turn East onto Fourth and go six more
blocks, Burger King is on the left. It doesn't
even matter what language you "encode" the
information into, as long as we both speak
it.
Now back to your question. Let's say the
information is a trailer for the re-release of
Star Wars. If I wanted to watch it on my computer
it would go like this: some helpful guy who works
for George Lucas would encode the trailer
digitally and put it on the Star Wars web site; I
would find that web site and transfer the encoded
(digital) information to my home computer; then my
computer would decode the signal and play the
trailer for me. For TV it would work like this:
some helpful programming executive would decide to
run the trailer tonight at eight; when eight
o'clock comes around, he would encode the trailer
in an analog signal and broadcast it to my TV (in
my case through a cable, but it works by antenna
too); then my TV would decode the signal and show
me the trailer.
So you can see, that is a really long way of
saying "Not much". Both digital and analog are
simply different ways
of encoding information to be transmitted. And
that brings me to the explanation for your second
answer:
Taken another way, the answer to
your question is quite different, because digital
and analog are very different in the details.
Using my example from above about encoding, if we
were both bilingual you could have "encoded" the
information on the map into either English or
Hindi (a language spoken in India and the
surrounding region). Now, in theory the two
languages serve an identical purpose, transferring
the information about the map from you to me; but
when we get right down to it spoken English and
spoken Hindi don't sound a whole lot alike! In the
same sense digital and analog can be thought of as
two different languages, with a different
vocabulary and grammar.
In digital encoding, all of the information is
broken up into separate chunks; that is where it
gets its name. Digital is another way of saying
chunky (not
chunky like Roseanne Barr, chunky like beef stew);
another word commonly used to describe chunkiness
is "discrete" (again, that's discrete like chunky,
not discrete like someone who can keep a secret).
A good example of something that is discrete is a
flight of stairs: you are either on one step or
the next, but you can't stand anywhere in between
(if you only had one leg anyway). These discrete
chunks can be represented by 1's and 0's, and
every piece of information has its own special
"translation" in 1's and 0's. For instance, the
number 13 would be 1101 (if you are interested in
learning more about how I "translated" that number
ask your teacher for a good book about changing
bases between number systems). Now when two
computers talk to each other digitally they can
transmit huge amounts of information, but they say
it all in 1's and 0's, never 2/3, or 143.976
(that's where the discreteness comes in).
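If you would like to see the 13 = 1101 "translation" worked out, here is a short Python sketch (added here as an illustration) of the repeated-division-by-2 method:

    def to_binary(n):
        # Translate a whole number into 1's and 0's by repeatedly
        # dividing by 2 and keeping the remainders.
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            digits.append(str(n % 2))   # remainder gives the next binary digit
            n //= 2
        return "".join(reversed(digits))

    print(to_binary(13))   # prints 1101, as in the example above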
Now, analog encoding is what's called
continuous. That means that a slice of analog
information can have any value it wants, 5/7,
345.34, the square root of 2. A slide is a good
example of something that is continuous: you can
be halfway down the slide, 2/3 down the slide,
0.324 down the slide, whatever. Analog encoding
was invented long before digital; that's the way
radio waves work, and we've had radios since the
early 1900s. The problem is, it's | | Answer 4:
A digital signal is a series of 0s and 1s. It
looks like this:
... 01010001011111100010101 ...
You can encode
information by using a particular series of 0's
and 1's.
For example, I could assign each
letter in the alphabet a code, so
that

    00000 = 0, 00001 = a, 00010 = b, 00011 = c, etc.

There
are 26 letters in the alphabet. How many digits
do you need to use per letter to be sure each
letter gets a unique digital code?
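If you want to check your answer, here is a short Python sketch (an addition of mine, not part of the original answer) that counts how many distinct codes each number of digits gives you:

    # Each extra digit doubles the number of distinct codes available.
    for digits in range(1, 7):
        codes = 2 ** digits
        enough = "yes" if codes >= 26 else "no"
        print(digits, "digits ->", codes, "codes; enough for 26 letters?", enough)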
Now, how
does this code get transmitted from one computer
to another?
We can connect the two with a
wire, and apply 1 Volt to the wire to represent 1,
and 0 Volts to the wire to represent 0. Then at
any particular time the voltage on the wire would
be either 1 Volt or 0 Volts, but nothing in
between.
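Here is a small Python sketch of that idea; the letter_to_code helper is made up for illustration, following the a = 00001, b = 00010 pattern above:

    def letter_to_code(letter):
        # Made-up helper following the table above: a = 00001, b = 00010, ...
        return format(ord(letter) - ord('a') + 1, '05b')

    def code_to_voltages(code):
        # 1 -> 1 Volt on the wire, 0 -> 0 Volts, as described above
        return [1.0 if bit == '1' else 0.0 for bit in code]

    word = "cab"
    for letter in word:
        code = letter_to_code(letter)
        print(letter, code, code_to_voltages(code))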
An analog signal can take on any
value, not just 0s and 1s. So if I take my cable TV
signal and measure the voltage on the cable, then
I could measure any voltage between say, 0 Volts
and 5 Volts. (I don't know the exact voltages,
but you get the idea.) I mean, any voltage, like
3.29 Volts.
The way a TV works is that
there is an electron beam behind the screen. Now
the screen has a coating of phosphor that glows
when the electron beam hits it. The beam sweeps
across the screen back and forth, like someone
mowing a lawn, "drawing" one line at a time.
The analog TV signal (the voltage on the
cable) affects how intense the electron beam is at
any particular time. This determines how bright
that particular spot is on the TV screen.
Could you think of how we could use a
digital signal to tell the TV how bright to make
the electron beam?
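One possible scheme (a sketch I am adding, not "the" answer, though digital video does something similar) is to chop the brightness range into a fixed number of steps and send the step number as a binary code:

    def brightness_to_bits(brightness):
        # brightness is a fraction between 0.0 (dark) and 1.0 (fully bright)
        step = round(brightness * 255)   # pick the nearest of 256 steps
        return format(step, '08b')       # send that step as 8 binary digits

    print(brightness_to_bits(0.0))   # 00000000 -> beam off
    print(brightness_to_bits(0.5))   # 10000000 -> medium brightness
    print(brightness_to_bits(1.0))   # 11111111 -> full brightness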