Wireless Networking
NETWORKING: TODAY AND THE FUTURE
Wireless Networking
• Computers connect to a ‘wireless network’ through wireless base stations, each of
which can serve dozens of computers at a time without the need to lay expensive
physical infrastructure
• Using the IEEE 802.11 wireless protocol, the base stations have a range of up
to 150 meters and will connect even when there are walls in the way
• The cost of wireless networking technology is expected to keep falling as bandwidth
expands, doubling from the existing 11 Mbps to 22 Mbps by mid-2001
• Security is much improved with encryption applied on a per-user, per-session basis,
as opposed to a shared-key encryption that is common to all users in a session,
e.g. by using Lucent’s new AS2000 technology (see the sketch below)
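The per-user, per-session point above can be illustrated with a minimal Python sketch. This is a hypothetical hash-based key derivation for illustration only, not Lucent’s AS2000 mechanism or any 802.11 standard; the point is simply that each session gets its own key instead of one key shared by everyone.

```python
import hashlib
import os

def derive_session_key(user_id: str, shared_secret: bytes) -> bytes:
    """Derive a unique key for one user's session (hypothetical illustration only)."""
    nonce = os.urandom(16)                      # fresh randomness per session
    material = user_id.encode() + nonce + shared_secret
    return hashlib.sha256(material).digest()    # 256-bit per-session key

# Shared-key style: every user encrypts with the same network-wide secret.
shared_key = b"network-wide-secret"

# Per-user, per-session style: each login gets its own key, so compromising
# one session does not expose the others.
alice_key = derive_session_key("alice", shared_key)
bob_key = derive_session_key("bob", shared_key)
print(alice_key.hex() != bob_key.hex())  # True: keys differ per user and session
```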
Open Sesame Concept & Protocol
• The concept of ‘open wireless systems’ refers to standards-based interface
protocols between radio base stations, mobile switches and other network-infrastructure
components
• Wireless network systems are becoming very common because they handle high volumes
of data transfer that would be time-consuming over land-lines
• The optimal network system is attained through an open, plug-and-play system
approach
• There is a wide variation depending on the markets served. Some regions require
sophisticated, feature-rich applications with high bandwidth and high mobility for
both voice and data services. Other regions need only a fixed-wireless or WLL
application that provides basic POTS-like service
• Standards such as ANSI-41 are defined and widely followed in order to implement
and integrate intersystem solutions from multiple vendors in a single network
• Like ANSI-41, the ‘A interface’ gives providers more flexibility to choose
between radio-base-station and mobile-switch infrastructure vendors
• Standards are being forged and open-infrastructure systems are becoming a
reality. The ANSI-634 standard is in place today for AMPS, TDMA and CDMA
• Even with an approved standard, it will be difficult to provide interoperability of
multiple-infrastructure-vendor systems. Service providers are going to drive the
evolution to open interfaces, allowing them to deploy multivendor base stations in
their networks
Satellite-based Networks
• By 2002 broadband Internet access via satellite will be price and speed competitive
with the land-based competition
• In order to communicate over a satellite link, the TCP/IP protocol will have to be
customized to cope with the much longer round-trip delay (a rough throughput
calculation is sketched after this list)
• Transferring information packets over TCP/IP via satellite takes approximately
600 milliseconds, about 200 milliseconds more than its land-based competition
• LEOS (low-earth-orbit satellites) orbit the Earth at a height of just 500 to 1,000
miles. This makes them capable of providing smaller, more energy-efficient spot beams,
and delivers latency potentially equal to (or better than) transcontinental fiber-optic cable
• There are physical constraints in deploying LEOS, such as data-latency issues
and communication between the ‘birds’
• High Altitude Long Operation (HALO) technology from Angel Technologies in
LA uses small planes that act like LEOS but with fewer latency and
maintenance problems
• Analysts say that satellite technology will not be widely available soon. The real
problem is economics. Though satellite vendors tout their ability to bring
high-speed networking anywhere in the world, they won’t be able to make a
living serving only developing nations. They need to gain revenue serving
wealthy, bandwidth-hungry places such as Silicon Valley. Unfortunately, those
areas already have plenty of terrestrial broadband options, and customers
won’t necessarily flock to towers in the sky.
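The TCP/IP customization point above can be made concrete with a rough back-of-the-envelope calculation (not from the original deck): TCP throughput is bounded by window size divided by round-trip time, so the roughly 600 ms satellite delay quoted above starves a stock 64 KB window and calls for much larger windows.

```python
# Rough sketch of why stock TCP needs tuning over satellite links:
# throughput <= window_size / round_trip_time (the bandwidth-delay product).

def max_tcp_throughput_bps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on TCP throughput for a given window and round-trip time."""
    return window_bytes * 8 / rtt_seconds

DEFAULT_WINDOW = 64 * 1024  # classic 64 KB TCP receive window

for label, rtt in [("land line (~400 ms)", 0.400), ("satellite (~600 ms)", 0.600)]:
    bps = max_tcp_throughput_bps(DEFAULT_WINDOW, rtt)
    print(f"{label}: at most {bps / 1e6:.2f} Mbit/s with a 64 KB window")

# To fill an 11 Mbit/s link at 600 ms RTT, the window would need to be roughly:
needed_window = 11e6 / 8 * 0.600  # bandwidth-delay product in bytes
print(f"window needed for 11 Mbit/s over satellite: ~{needed_window / 1024:.0f} KB")
```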
Neural Networking Technology
• The technology studies the target environment, becomes familiar with its daily
behavior, identifies when the system is going out of its normal operating state,
and notifies network administrators (a minimal sketch of this idea follows this list)
• A typical prediction issued by Neugents (CA’s Windows NT server-based network
agents) may look like: "At 11:00, Neugents predict a 95% probability
that server AB232 will run out of virtual memory in approximately 45 minutes."
• Analysts believe that Neugents are susceptible to failure because:
• They fail to estimate the unpredictable conditions faced by a ‘normal’ server
• Neugents’ initial focus is on systems, primarily NT, with no support for the
network/application infrastructure
• Neugents do not take into account the interdependencies of multi-level protocol
stacks involved in every networked application transaction
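As a rough illustration of the "learn normal behavior, then flag deviations" idea described above, here is a minimal, hypothetical Python sketch. It is not CA’s Neugents implementation (which uses neural-network agents); it simply shows the baseline-then-alert pattern with invented sample numbers.

```python
from statistics import mean, stdev

def learn_baseline(samples: list[float]) -> tuple[float, float]:
    """Summarize 'normal' behavior as a mean and standard deviation."""
    return mean(samples), stdev(samples)

def is_anomalous(value: float, baseline: tuple[float, float], threshold: float = 3.0) -> bool:
    """Flag readings more than `threshold` standard deviations from normal."""
    mu, sigma = baseline
    return abs(value - mu) > threshold * sigma

# Hypothetical hourly free-virtual-memory readings (MB) for server AB232.
normal_readings = [512, 498, 505, 520, 515, 500, 508, 511, 495, 503]
baseline = learn_baseline(normal_readings)

latest = 120  # sudden drop in free virtual memory
if is_anomalous(latest, baseline):
    print("Neugent-style alert: server AB232 is leaving its normal operating state")
```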
Thank you for being such a good audience!
Please let us know if you have any
questions.