DATA AND FILE FORMATS
Data and file format standardization is crucial for sharing data among multiple applications and for exchanging information between applications.
The personal computer (PC) industry has generated many different standards. Text-based file and data formats have been replaced by multifunction formats which can handle graphics, audio, video and color images.
A few of the commonly used data and file formats are:
1. Rich Text Format (RTF)
Early text editors could not carry formatting information through when transmitting files. This limited data interchange because when text was moved from one application to another, all the formatting information was lost and had to be re-entered.
The RTF format extended this range of information. RTF can handle binary files, audio files and video files to a certain extent.
2. Tagged Image File Format (TIFF)
Tagged Image File Format has been around for a long
time. In this format, tags are used to keep all the attribute
information in a standard manner.
• A TIFF file provides tags that store information about the resolution of the image, fonts, color format, compression scheme, date and time of capture, decompression, etc.
• A search through the file is quick, since the tags can be located easily.
• If the file has to be extended, this is done through pointers and links, and by creating extra blocks.
• This approach is the industry standard for representing raster image data generated by scanners, frame grabbers and paint/photo-retouching applications.
• It can represent color images in different types of representations, and the standard handles color images very well.
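To make the tag idea concrete, here is a minimal Python sketch (my own illustration, not an example from the text) that reads the first Image File Directory (IFD) of a TIFF file and prints its raw tag entries; the function name and the sample file name are made up.

import struct

def read_tiff_tags(path):
    # Minimal sketch: print the raw tag entries in the first IFD of a TIFF file.
    with open(path, "rb") as f:
        header = f.read(8)
        # Byte order: b"II" = little-endian (Intel), b"MM" = big-endian (Motorola)
        endian = "<" if header[:2] == b"II" else ">"
        magic, ifd_offset = struct.unpack(endian + "HI", header[2:8])
        assert magic == 42, "not a TIFF file"
        f.seek(ifd_offset)
        (num_entries,) = struct.unpack(endian + "H", f.read(2))
        for _ in range(num_entries):
            # Each IFD entry is 12 bytes: tag id, field type, count, value or offset
            tag, ftype, count, value = struct.unpack(endian + "HHII", f.read(12))
            print(f"tag {tag}: type {ftype}, count {count}, value/offset {value}")

# Example call (hypothetical file): read_tiff_tags("scan.tif")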
3. Resource Interchange File Format (RIFF)
• RIFF is not really a new file format. Rather, it provides a framework for multimedia file formats for Microsoft Windows-based applications.
• It can be used to convert a custom file format into RIFF format and transmit the file. For example, a MIDI file is converted to RIFF by adding the RIFF structure around it. Information is stored in blocks called chunks.
• Like TIFF, RIFF is a tagged file format and uses tags in the header to store information about the file.
• RIFF can handle MIDI, DIB, PAL and AVI files.
• MIDI: Musical Instrument Digital Interface format.
• DIB: Device-Independent Bitmap file.
• AVI: Audio-Video Interleaved file; it can synchronize audio and video in movies.
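As a rough illustration of the chunk idea (my own sketch, not from the text), the following Python function walks the top-level chunks of any RIFF file, such as a .wav or .avi; the function name and file name are illustrative.

import struct

def list_riff_chunks(path):
    # Minimal sketch: list the top-level chunks of a RIFF file (e.g., .wav or .avi).
    with open(path, "rb") as f:
        riff, size, form = struct.unpack("<4sI4s", f.read(12))
        assert riff == b"RIFF", "not a RIFF file"
        print("form type:", form.decode())        # e.g. "WAVE" or "AVI "
        end = 8 + size                            # size counts everything after the size field
        while f.tell() < end:
            fourcc, chunk_size = struct.unpack("<4sI", f.read(8))
            print(fourcc.decode(), chunk_size, "bytes")
            f.seek(chunk_size + (chunk_size & 1), 1)   # chunk data is padded to an even length

# Example call (hypothetical file): list_riff_chunks("clip.avi")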
Other commonly used formats are:
4. Joint Photographic Experts Group (JPEG) - DIB Format for Motion Images
Microsoft has extended the DIB file format for both JPEG still and motion images. This can be used with the RIFF and AVI file formats.
5. Moving Picture Experts Group (MPEG) Format
MPEG-1 and MPEG-2 are existing standards.
6. Audio-Video Interleaved file format (AVI)
7. TWAIN format (for multimedia applications).
Multimedia Input/Output Technologies
• Multimedia can mean different things. It can be an encyclopedia on a CD-ROM or a hypermedia message composed by a user, consisting of text, images and full-motion video.
• Hypermedia links allow tracking of a subject matter
through a variety of topics.
• It takes specialized equipment to capture and store
multimedia objects.
The keyboard has been the traditional input device for entering data into a computer system. Over the years, it has changed from a simple numeric device to an alphanumeric and multifunction device.
With the advent of GUIs, pointing devices such as a mouse or a pen have become essential for selecting or moving graphical objects. Window-based GUI applications require a mouse or pen for selecting various objects, push buttons, data entry boxes, and so on.
In addition to traditional alphanumeric data entry, multimedia technology requires a variety of other types of data inputs, including voice or audio, full-motion video, still photos and images. These inputs require special devices such as digital pens, audio equipment, video cameras, and image scanners.
In the case of text, there was no measure of quality; text was simply stored in ASCII/EBCDIC formats. Now, with higher-quality multi-font printers, text quality is measured in terms of print matrix resolution, text color, font types, etc. The text capture device does not determine the end quality of the text.
Multimedia objects such as images, audio and video, however, depend on the input device and storage for their quality. The capture device determines the outer bound of the quality: the display device, at best, can match the resolution of the capture device.
Digital versus Analog Inputs
Another important distinction with multimedia objects
is the need to convert data from analog to digital form.
For example, a scanner scans an object into scan lines and pixels and then converts the analog amplitude of each pixel into a digital measure:
• 0 or 1 for black and white;
• 0 - 255 for grayscale pixel points;
• HSI or RGB values for color objects.
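A minimal sketch of this quantization step (illustrative values and function names of my own, not from the text) could look like this in Python:

def quantize_gray(amplitude, max_amplitude):
    # Map an analog amplitude in the range 0..max_amplitude to an 8-bit gray level 0-255.
    level = round(255 * amplitude / max_amplitude)
    return max(0, min(255, level))                 # clamp to the valid range

def quantize_bw(amplitude, threshold):
    # 1-bit black-and-white quantization: 0 for black, 1 for white.
    return 1 if amplitude >= threshold else 0

# Illustrative values only
print(quantize_gray(0.42, 1.0))    # -> 107
print(quantize_bw(0.42, 0.5))      # -> 0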
A TV signal (NTSC) is also analog; it needs to be converted into digital form for use in a computer system.
The processes for converting analog to digital and digital to analog are called coding and decoding.
The hardware devices and software programs that implement them are called codecs.
Codecs usually include compression and decompression algorithms.
Different codecs are required for each type of multimedia input.
Display and Encoding Technologies
Since multimedia systems include a variety of object types,
a number of different technologies are required for
compression, decompression and display of multimedia
objects.
Almost all multimedia objects are based on a graphical user
interface (GUI). Most graphical user interfaces are based on VGA (640 x 480 pixels), SVGA (800 x 600 pixels) or even XVGA (1280 x 1024 pixels). Some imaging applications may require 150 - 200 pixels per inch or better resolution.
Voice mail systems store analog sound and are usually based on adaptive differential PCM technology. Codecs are required for converting analog sound to digital formats such as WAVE or AVI.
Video cameras provide input in analog formats such as the NTSC (National Television System Committee) standard, PAL (Phase Alternating Line) standard or SECAM (used in France). Input from any of these sources must be encoded to digital format and decoded back to analog for playback.
Encoded compressed digital signals are based on JPEG or
MPEG standards. Other formats include AVI and RIFF.
Resolution and Bandwidth Issues
Each object type has some resolution. Image resolution is measured in pixels per inch (ppi). The higher the resolution, the better the object quality.
For document imaging systems, a screen resolution of 100 pixels per inch (ppi) is required.
• Quality at 200 ppi is very good.
• Laser printers and office copiers can provide a quality of 300 - 600 ppi.
• Published material (professional-quality books) has a resolution of 1200 - 1800 ppi.
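To get a feel for these numbers, here is a small calculation of my own (assuming an 8.5 x 11 inch page and one byte per pixel; not an example from the text):

def page_pixels(width_in, height_in, ppi):
    # Pixels needed to hold a width_in x height_in page scanned at ppi (pixels per inch).
    return round(width_in * ppi) * round(height_in * ppi)

for ppi in (100, 200, 300, 600):
    px = page_pixels(8.5, 11, ppi)
    # At 8 bits (one byte) per pixel, the pixel count equals the byte count.
    print(f"{ppi} ppi: {px:,} pixels (about {px / 1e6:.1f} Mbytes at 8 bits/pixel)")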
Sound quality is measured in terms of the sampling rate and the number of bits used to represent the magnitude of each sample.
• A higher sampling rate allows higher-frequency details to be captured.
• A higher number of bits allows amplitude changes to be captured more accurately.
• Both factors contribute to the tone quality.
• A sampling rate of 4 kHz at 8 bits is considered the minimum acceptable for voice-grade sound.
• A sampling rate of 8 kHz at 16 bits is required for music quality.
• For CD-quality stereo sound, a sampling rate of 44.1 kHz at 16 bits is required.
• Multi-channel stereophonic sound requires even higher resolution.
VCR quality is considered the minimum for video display. It is defined as:
• 300 lines visible on the screen; the minimum acceptable resolution is 320 x 240 pixels.
• HDTV quality is in the 1280 x 1024 pixel range.
Another measure of video quality is the number of bits used for color definition. A 16-bit palette is common. Higher color resolution is required for HDTV quality; this means 24-bit color (full color).
A third measure is the number of frames per second. Broadcast TV operates at 30 frames per second (60 interlaced fields per second).
Multimedia Object Quality and Transmission Bandwidth: see the table on page 192 (not reproduced here).
Multimedia Input and Output Devices
Electric Pen
When an electric pen is used to write or draw, the digitizer
encodes the x and y coordinates of the pen, and the pen
status, which includes whether the pen is touching the
digitizer surface ( usually the screen) or not, pen pressure,
pen angle, rotation, etc.
Most electric pens contain a micro-switch at the tip that behaves like the left button on a mouse. Some pens are capable of measuring pen pressure accurately, while others can measure proximity.
Pen computing requires generating x-y coordinates at least 120 times per second with 200 dpi resolution. This minimum sampling rate generates sufficient data to track pen movement.
Most pen digitizers produce an accuracy of 0.005 to 0.02 inch. Resolution is defined as the number of points the digitizer is able to digitize in one inch.
Video and Image Display Systems
TV is video technology. Live pictures bring reality into our environment; they educate us.
The introduction of video games took the younger generation by storm.
Virtual reality will be the next major advance in game technology, in military technology, and in training environments.
VGA, SVGA, XVGA and 8514/A are some of the existing video technologies.
Display Performance Issues
There are three main factors that affect the performance:
1. Network bandwidth: playback becomes choppy and incoherent if the bandwidth is insufficient to support the minimum data rate. The JPEG and MPEG standards define this parameter.
2. Decompression or decoding: poor decompression performance causes irritating delays. In the case of full-motion video, poor decompression has the same effect as poor network bandwidth.
3. Performance of display technology: if the technology is not appropriate, the device may not display full motion or graphics properly.
Tables of the various video standards are not reproduced here.
Typically, 14-inch monitors have an active display area of 9.875” x 7.125” and a diagonal of 12.25”.
If we assume a resolution of 1024 x 768, the dot pitch (the distance between two pixels) can be computed as:
dp(h) = (9.875 / 1024) x 25.4 mm ≈ 0.24 mm
dp(v) = (7.125 / 768) x 25.4 mm ≈ 0.24 mm
This means that to display a resolution of 1024 x 768
very clearly, we need a 0.24 mm dot pitch monitor.
Most 14-inch monitors come in 0.28 - 0.30 mm dp.
When these monitors display 1024 x 768 pixels, the accuracy and crispness are compromised.
Now let us repeat the same calculations for a 17-inch monitor, which has a 12.9” x 9.675” active area with a diagonal of 16.125 inches. The dp for 1024 x 768 pixels will be:
dp(h) = (12.90 / 1024) x 25.4 mm ≈ 0.32 mm
dp(v) = (9.675 / 768) x 25.4 mm ≈ 0.32 mm
This means that to display a resolution of 1024 x 768 on a 17-inch monitor, you would require a dp of 0.32 mm or better. Most 17-inch monitors have a dot pitch of 0.28 - 0.30 mm. This is sufficient even to display a resolution of 1280 x 1024.
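The two calculations above can be reproduced with a small helper function; the monitor dimensions are the ones used in the text, and the helper name is my own.

def dot_pitch_mm(active_size_in, pixels):
    # Dot pitch (mm) needed to resolve `pixels` across an active dimension of `active_size_in` inches.
    return active_size_in / pixels * 25.4

# 14-inch monitor: 9.875" x 7.125" active area, 1024 x 768 resolution
print(round(dot_pitch_mm(9.875, 1024), 2), round(dot_pitch_mm(7.125, 768), 2))   # 0.24 0.24

# 17-inch monitor: 12.9" x 9.675" active area, 1024 x 768 resolution
print(round(dot_pitch_mm(12.9, 1024), 2), round(dot_pitch_mm(9.675, 768), 2))    # 0.32 0.32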
A smaller dot pitch gives the perception of a clearer picture.
Horizontal Refresh Rate: a measure of the rate at which scan lines are painted. It is measured in kHz; a standard VGA monitor has a horizontal refresh rate of 31.5 kHz.
Vertical Refresh Rate: closely tied to the horizontal refresh rate. It is the rate at which the whole screen is painted (counting all scan lines) before the beam returns to the top of the screen. It is measured in Hertz; typically it is 50 - 72 Hz.
The human eye is sensitive to lower vertical refresh rates and is more likely to perceive flicker at a lower rate. Flicker can be annoying and tiring to the eyes.
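One way to relate the two rates (my own note, not from the text): dividing the horizontal rate by the vertical rate gives the total number of scan lines painted per frame, including blanking lines.

def scan_lines_per_frame(h_refresh_khz, v_refresh_hz):
    # Total scan lines per frame (visible + blanking) implied by the two refresh rates.
    return h_refresh_khz * 1000 / v_refresh_hz

# Standard VGA: 31.5 kHz horizontal, 60 Hz vertical -> about 525 lines per frame,
# of which 480 are visible.
print(round(scan_lines_per_frame(31.5, 60)))   # 525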
Print Output Technologies
• Laser print technology has continued to evolve, and print quality at 600 dpi is starting to make this technology useful for high-speed printing.
• Offset printer resolution is around 1200 - 1800 dpi; 600 dpi is sufficient for common print applications.
Comparison of Print Technologies (comparison table not reproduced here).
Digital Voice and Audio
• Multimedia technology is multidimensional, and audio is one of the dimensions that adds voice, music and sound capabilities.
• Until 1990, PC applications were visual only. Game applications added the voice/music dimension.
• Today, some applications utilize sound boards whereby
audio inputs may be through keyboards, microphones, etc.
Digital Audio
• When voice or music is captured by a microphone, it generates an electrical signal.
• The signal consists of a fundamental sine wave with a certain frequency and amplitude.
• The fundamental sine wave is accompanied by harmonics.
• Adding the fundamental to its harmonics forms a composite sinusoidal signal that represents the original sound.
• Analog sinusoidal waveforms are converted to digital format by feeding the analog signal to an A/D converter (ADC), where the analog signal goes through the sampling process.
Sampling Process:
• The analog signal is sampled over time at regular intervals to obtain the amplitude of the signal at each sampling instant.
• The rate at which these samples are taken is called the sampling rate; the regular interval between two samples is the sampling period T.
(Figure: a sampled voltage-versus-time waveform.)
Sampling rate = 1 / T, where T = sample time = time interval between two samples.
• The sample amplitude obtained at each sampling instant is represented by an 8-bit (one-byte) or 16-bit (two-byte) value.
• Larger values can also be used for higher-resolution systems (high-fidelity sound).
• A composite signal of 11.025 kHz sampled 4 times every cycle will yield a 44.1 kHz sampling rate. If you sample at a higher rate, you need to store more samples.
• For CD-quality music at a 44.1 kHz sampling rate and 16-bit resolution, a one-minute recording will require 44.1 x 1000 x 16 x 60 / 8 = 5.292 Mbytes.
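The CD-quality figure above can be checked with a small helper of my own (note that the 5.292 Mbytes value is for a single channel; stereo doubles it):

def audio_megabytes(sample_rate_hz, bits_per_sample, seconds, channels=1):
    # Uncompressed audio size in megabytes (10^6 bytes).
    return sample_rate_hz * bits_per_sample * seconds * channels / 8 / 1e6

print(audio_megabytes(44100, 16, 60))              # 5.292   one minute, mono, CD rate
print(audio_megabytes(44100, 16, 60, channels=2))  # 10.584  one minute, stereo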
Audio objects generate a large volume of data. This poses two problems:
First, it requires a large amount of disk space to store, and
Second, it takes longer to transmit this data.
To solve these problems, the data is compressed. Compression shrinks the volume of data, so less disk space is required. It also helps to reduce network transmission time.
The audio industry uses 11.025 kHz, 22.05 kHz and 44.1 kHz as standard sampling frequencies. These frequencies are supported by most sound cards.
Digital Camera
Digital cameras are being used increasingly for multimedia applications due to the inherent advantages they provide in applications where very high resolution is not required. The advantages are:
• Digital images can be viewed immediately for
proofing
• Digital images can be printed immediately and any
number of times for duplications
• Digital images can be integrated with word-processor documents
• Can be embedded in emails or faxed by computers.
• Can be enhanced/altered for effective presentations
• Can be archived - minimizing the risk of loss or
damage to the image.
• Can take images of 3-D objects and store as 3-D
images
• Digital cameras are portable and can be used in environments where film cameras cannot be used.
Possible uses: fingerprint analysis, drivers’ licenses, insurance companies, bank customers’ signatures, security installations, etc.
Full Motion Video
Although video image processing is not very common for full-motion video, there is no reason why it will not be done in the future.
Video capture circuit cards capture input from NTSC/PAL/SECAM signals from video cameras or VCRs, or even S-video inputs (RS-170 inputs).
Video capture boards can handle audio signals as well; they can convert analog signals to digital (ADC) and digital signals back to analog (DAC).
Normally, a video capture board is used to capture real-time video, and the digitized raw data is then compressed in real time.
The compressed data is subsequently moved to the CPU over the ISA or local bus. The CPU then builds the AVI file format for the compressed data and stores the file.
During playback, the file is read in blocks by the CPU, and the data is decompressed as blocks of audio and video.
The data can be decompressed in either software or hardware and sent to the VGA card for display.
To understand performance issues, let us take an
example: calculate the bandwidth required to display a
real-time video at 640 x 480 resolution at 30 Hz frame
rate in true 24-bit color.
Bus Bandwidth
The bandwidth required for the display of full-motion video is: resolution x frames per second x bits per pixel for color, divided by 8 to convert bits to bytes.
640 x 480 x 30 x 24 / 8 = 27.648 Mbytes/sec
This bandwidth is required to display real-time video in true-color, 24-bit-per-pixel mode.
If you want to reduce from full color mode to 256
colors only ( 8-bit color), the bandwidth requirement
will reduce to 9.216 Mbytes/sec
The ISA bus operates at 8 MHz and has a bandwidth of 2 Mbytes/sec, which is not sufficient for the above application. This gives you the following options:
• Display a video window of 300 x 200 at 30 frames per second with 256 colors.
300 x 200 x 30 x 8 / 8 = 1.8 Mbytes/sec
• Display the video window at full VGA resolution of 640 x 480 at 6 frames per second with 256 colors.
640 x 480 x 6 x 8 / 8 = 1.84 Mbytes/sec
It is clear that the ISA bus is a big bottleneck. However, the bus bandwidth problem can be solved by using other bus architectures, such as the VESA local bus (VL bus) or the PCI bus. Both the VL and PCI buses have bandwidth in excess of 100 Mbytes per second. In theory, a local bus operates at CPU speed.
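All of the bandwidth figures above follow from the same formula; a small helper of my own makes the comparison easy to repeat:

def video_mbytes_per_sec(width, height, fps, bits_per_pixel):
    # Uncompressed display bandwidth in megabytes per second.
    return width * height * fps * bits_per_pixel / 8 / 1e6

print(video_mbytes_per_sec(640, 480, 30, 24))   # 27.648  full VGA, true color
print(video_mbytes_per_sec(640, 480, 30, 8))    # 9.216   full VGA, 256 colors
print(video_mbytes_per_sec(300, 200, 30, 8))    # 1.8     small window, 256 colors
print(video_mbytes_per_sec(640, 480, 6, 8))     # 1.8432  full VGA at 6 frames/sec, 256 colors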
To achieve good performance, every link in the chain for
capture and playback must be examined carefully to
ensure that the required bandwidth can be carried by
that link, be it network, video server, compression or
decompression hardware or even display system.
Animation:
It is an illusion of movement created by sequentially
playing still image frames at a rate of 15-20 frames per
second ( close to full-motion video range).
An animation film contains a series of frames with incremental movement of objects in each frame. It may not be necessary to move objects to create the illusion of movement: the color and background can be changed from frame to frame so that there is a perception of a moving object.
MEMORY SYSTEMS
Memory systems for computers have been changing to
meet the needs of high resolution graphic displays. The
demand on memory systems will even be greater with the
increased use of multimedia applications.
Memory Types
Different types of memories are used for different
purposes due to retention factors, performance parameters,
and cost trade-offs.
Memory types that may be used in multimedia systems
include the following:
1. ROM (read-only memory): Instructions and/or data are burned into the memory permanently, and the contents are non-volatile. ROM is used for firmware, that is, for operating systems and software programs that have to reside permanently inside the computer.
2. PROM (Programmable ROM): is a semiconductor memory that contains an array of fuses. These fuses are blown according to the word to be programmed. To program it, a specialized PROM programmer (PROM burner) is used; this burner blows the fuses.
The contents of a PROM are non-volatile. Access to a PROM is random. The typical data path is 16 bits wide.
3. RAM (Random Access Memory): is also a semiconductor type of memory that allows random access to its contents. That is, a word can be accessed by directly addressing it.
It is organized in array form so that it can be read and written efficiently. All words are addressable. There are several types of RAM:
SRAM (static RAM): is a semiconductor memory consisting of transistors which can remember the information.
These transistors do not require periodic charging
to maintain the information. It is read/write type.
The organization is array type to facilitate read and
write operations. SRAM access speed ranges from
a few nanoseconds to 30 nanoseconds.
SRAM is volatile and loses the information when
power is switched off.
4. DRAM (Dynamic RAM): is a semiconductor memory where information is stored in a capacitor. The term dynamic is used because the capacitors require periodic charging to maintain the information. This process is known as periodic refreshing.
Capacitors are used as memory cells and can achieve a high cell density.
The trade-off for high density is the periodic refresh. DRAM is mainly used as the main memory of the computer. The access speed ranges from 50 to 80 ns. It is volatile, and the information is lost if there is no power or no refresh.
5. VRAM (Video RAM): It is like DRAM. The only difference is that it is dual-ported. The CPU port (processor port) is a standard port, similar to DRAM, containing a data path and an address path. In addition, there is a video port. The video port contains a buffer to hold a complete row of data.
This buffer is organized in such a way that it can hold the complete data for a horizontal line. Each horizontal line represents one row of the screen data.
The advantage of VRAM is that a whole horizontal line of video screen information is loaded into the buffer in one operation.
The buffer’s output is then converted from parallel to serial and output as a video stream.
With dual porting, screen updates can be done in almost half the time.
With VRAM, the port to the CPU is available 90% of the time to do the updates.