
HTTP Streaming
Goal and Scope Discussion
draft-wu-http-streaming-optimization-ps-03
IETF 79 – Beijing
November, 2010
Qin Wu, [email protected]
Purpose of this presentation
• Food for thought
– Note that there are many open questions, and the slides have many gaps we hope to fill in as a result of the discussion at the ad hoc and later on the mailing list.
Outline
• Goal of Ad-hoc
• Introduction to HTTP streaming
• Why is HTTP Streaming a popular topic?
• Existing HTTP Streaming Work and Model
• Problems for discussion
• Next Step

Note that there are many open questions, and the slides have many gaps we hope to fill in as a result of the discussion at the ad hoc and later on the mailing list.
Goal of Ad-Hoc
• Goal
– Talk about HTTP streaming.
• Discuss possible directions forward:
– Define a protocol or extensions for server, client and/or smart cache capabilities to: 1) satisfy subscriber QoE requirements for real-time applications; 2) support interoperability with existing streaming technologies; 3) provide efficient delivery mechanisms and schemes
What is HTTP streaming?
• Streaming is a method of transmitting data over the network as a steady and continuous stream, allowing playback to proceed while subsequent data is still being received.
• HTTP streaming refers to a streaming service in which the HTTP protocol is used as the basic transport for the media data
– A streaming service enables streaming content to be received and rendered simultaneously
– To reduce the impact of large packet dropouts over TCP, the streaming media may be segmented into many chunks
– HTTP-based progressive download is a special case of HTTP streaming (see the sketch below)
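As a purely illustrative sketch (not from the original slides), the Python snippet below shows the progressive-download special case: it fetches only the first byte range of a hypothetical media URL, so playback could in principle begin before the rest of the file has arrived.

# Minimal sketch of progressive download over HTTP; the URL is hypothetical.
import urllib.request

MEDIA_URL = "http://example.com/video/movie.mp4"  # hypothetical media file

req = urllib.request.Request(MEDIA_URL, headers={"Range": "bytes=0-1048575"})
with urllib.request.urlopen(req) as resp:
    first_chunk = resp.read()              # first ~1 MB of the media file
    print(resp.status, len(first_chunk))   # 206 Partial Content expected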
Why is HTTP Streaming a popular topic?
• The main motivations for "why HTTP" are web-based streaming and multi-screen video transport.
• With multi-screen video support, a common user experience can be provided across PCs, TVs, smart-phones, tablets and cars.
• Since almost all clients have browser support, HTTP streaming is an obvious choice for multi-screen video delivery.
• There is some existing work in 3GPP, MPEG and elsewhere – we will discuss this further in the gap slides.
A trend seen by everyone
• From the most popular video websites, HTTP streaming + CDN is the way to go
– Youtube for video sharing
– Hulu for free high-quality online video viewing
• Statistics reports show the same general picture
– According to the ATLAS Internet Observatory 2009 annual report, streaming, CDN and direct download are growing, replacing P2P as the dominant mechanism for sharing/distributing video.
– According to the Global Mobile Broadband Traffic Report from Allot Communications, HTTP streaming is the fastest-growing application, with a rise of over 50%
Existing Work or Components
• Media Fragments URI (W3C)
• HTML5 video playback elements (W3C)
• Server-Sent Events (W3C)
• WebSocket API (W3C)
• Media Presentation Description (3GPP)
– Client and Server Manifest in Microsoft
– M3U playlist in Apple
– F4F manifest in Adobe
• Streaming File Format (3GPP)
• WebSocket Protocol (IETF)
• More in the gap slides… we will save that discussion for the next presentation
Existing HTTP Streaming Model
(Client Based Pull)
• Media is split into a series of data chunks.
• If several bit rates are available, the client can choose between chunks of different sizes or bit rates.
• The client first acquires a manifest file containing a reference (e.g. a URI) to each media chunk from the streaming server, and then requests the media chunks by sending a sequence of HTTP request messages to the server (a client-side sketch follows below).

Properties of this model:
1. Relies on the client to handle buffering and playback during download.
2. Best-effort delivery.
3. Relies on existing web infrastructure.
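To make the pull model above concrete, here is a minimal client-side sketch; it assumes a hypothetical manifest that simply lists one chunk URL per line (real deployments use M3U playlists, MPD files, etc.).

# Sketch of the client-based pull model: manifest first, then chunk by chunk.
import urllib.request

MANIFEST_URL = "http://example.com/live/manifest.txt"  # hypothetical manifest

def fetch(url: str) -> bytes:
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# 1) acquire the manifest containing a reference to each media chunk
chunk_urls = fetch(MANIFEST_URL).decode().splitlines()

# 2) request the media chunks with a sequence of HTTP GETs
for url in chunk_urls:
    chunk = fetch(url)
    # hand the chunk to the decoder/player buffer here
    print(f"received {len(chunk)} bytes from {url}")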
Existing HTTP Streaming Model
(Message Flow)
[Figure: message flow between the Media Encoder, Media Segmenter, Web/Media Server, Web Browser and Media Decoder/Player: HTTP GET → Presentation Desc.; HTTP GET URL(frag1 req) → Fragment 1; … HTTP GET URL(frag i req) → Fragment i]

1) The media segmenter splits the input media (the audio/video input, after the media encoder) into a series of fragments or chunks.
2) The web browser polls for each new data chunk using HTTP requests. The presentation description conveys the index of each fragment and its associated metadata; the URL tells the server which fragment the client is requesting.
3) The web browser pulls media from the server fragment by fragment in accordance with the presentation description and hands it to the media decoder/player.
What problems do we need to look at?
• P1: Inefficient Streaming Content Delivery
– The streaming application is decoupled from the existing web infrastructure
– The web infrastructure may not satisfy real-time streaming media requirements
• Larger HTTP header size
• Relies solely on multiple connections for concurrency
• Transport degrades due to the server's slow response to transmission-rate changes
• Slow timing control for driving requests (see the sketch after this list)
– No chunk request is sent until the manifest has been received
– Some approaches do not send a request for a new media chunk until the media chunk answering the previous request has been received
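The timing-control point can be illustrated with a small sketch (assumptions: hypothetical chunk URLs, one helper thread): instead of waiting for chunk i before even requesting chunk i+1, the client issues the next request while the current chunk is being consumed.

# Sketch: overlap the request for the next chunk with playback of the current one.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

def fetch(url: str) -> bytes:
    with urllib.request.urlopen(url) as resp:
        return resp.read()

chunk_urls = [f"http://example.com/live/chunk{i}.ts" for i in range(5)]  # hypothetical

with ThreadPoolExecutor(max_workers=1) as pool:
    pending = pool.submit(fetch, chunk_urls[0])
    for next_url in chunk_urls[1:]:
        chunk = pending.result()                # current chunk has arrived
        pending = pool.submit(fetch, next_url)  # request the next chunk early
        print(f"playing {len(chunk)} bytes")    # decode/render in parallel
    print(f"playing {len(pending.result())} bytes")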
What problems do we need to look at?
• P2: No QoE Improvement Support
– Best-effort Internet
• The quality of Internet media streaming may degrade significantly due to rising usage and concurrent streaming delivery.
– No subscriber QoE feedback support (see the sketch after this list)
• Analyzing the system's overall performance is important for providing a high-quality service
– There is no streaming quality control mechanism like RTCP to report subscriber QoE metrics, which are important to the HTTP streaming system for congestion control or diagnostic purposes.
• Difficult to track in the case of client-based pull
– The client fails to give the server feedback about the experience the user actually had while watching a particular video.
– The server may have a video that continually fails to start, or content that continually rebuffers, while the content owner receives none of this information.
– Channel switching latency
• Switching between live stream channels, or switching from a VOD channel to a live stream channel.
• Additional round trips between the client and the server are needed for the manifest file update before the client can request each new chunk.
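To illustrate the missing RTCP-like feedback channel, the sketch below shows one possible shape of a subscriber QoE report: the client POSTs a few playback metrics to a hypothetical reporting endpoint. This is purely an illustration of the gap, not an existing or proposed protocol.

# Sketch of an RTCP-like QoE report over HTTP; endpoint and metric names are hypothetical.
import json
import urllib.request

REPORT_URL = "http://example.com/qoe-report"  # hypothetical endpoint

qoe_metrics = {
    "startup_delay_ms": 850,        # time until playback started
    "rebuffer_count": 3,            # number of stalls during playback
    "rebuffer_duration_ms": 1200,   # total time spent rebuffering
    "average_bitrate_kbps": 1100,
}

req = urllib.request.Request(
    REPORT_URL,
    data=json.dumps(qoe_metrics).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print("report accepted:", resp.status)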
What problems do we need to look at?
• P2: No QoE Improvement Support (cont.)
– Relies on the client to handle the playback buffer and to choose the content quality
• Smoothing the jitter caused by network bandwidth fluctuation may further increase the user's waiting time.
• Is it reasonable to convey the quality parameter in the URL, given that URLs have many variants?
– Web server overload in the case of live streaming
• The web server bottleneck is how many concurrent streams can be served
• The server may sacrifice/downgrade quality so that processing can keep pace with rendering the live content for viewing.
– Relies solely on multi-bit-rate (MBR) encoding
• Suffers various kinds of quality downgrading, due to switching from a high-bit-rate stream to a low-bit-rate stream, and rebuffering when the MBR functionality is poorly utilized
What problems do we need to look at?
• P3: No Service Differentiation Support
– No distinction between regular HTTP traffic and HTTP streaming traffic
• Disadvantage:
– Streaming media is transported in the same way as web pages
– Streaming media gets no priority to be delivered/processed first
• Open questions (see the sketch after this list):
– Rely on a DPI mechanism to differentiate traffic by parsing the streaming file header?
– Shall we distinguish different traffic by an HTTP header?
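As an illustration of the second open question only (not a proposed or standardized header), a client could mark its streaming requests with a non-standard header that an intermediary inspects instead of doing DPI; the header name and URL below are hypothetical.

# Sketch: tag a chunk request so intermediaries can differentiate it without DPI.
import urllib.request

CHUNK_URL = "http://example.com/live/chunk42.ts"  # hypothetical chunk

req = urllib.request.Request(
    CHUNK_URL,
    headers={"X-Traffic-Class": "streaming"},  # hypothetical marking header
)
with urllib.request.urlopen(req) as resp:
    chunk = resp.read()
print(f"received {len(chunk)} bytes of streaming-classified traffic")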
What problems do we need to look at?
• P4: No Streaming Distribution Component Support??
– Can chunks be cached or not?
• Can streaming media be cached in the same way as web pages?
• Encryption/authorization may be an issue.
– How to reduce the upstream bandwidth between the web server and the proxy (see the sketch after this list)
• Serve multiple incoming persistent connections with one upstream persistent connection
• Build a smart cache that receives the whole response from upstream before returning anything to the client?
• Can the smart cache retrieve the manifest as all receivers do, or can the smart cache be signaled to cache the media chunks?
• Suppose chunk hints can be sent from the server to tell the intermediary how to help reduce server overload; how can such chunk hints be perceived by the intermediary?
– How is HTTP live streaming distributed into a CDN in the case of server overload or live streaming serving?
• HTTP redirection to deal with server overload lacks efficiency
• Can an application running over HTTP on the smart cache perceive the server load and tackle such server overload efficiently?
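The "one upstream persistent connection" idea can be sketched as request collapsing in a smart cache: many concurrent client requests for the same chunk trigger at most one upstream fetch. The sketch below is a single-process illustration under that assumption, with hypothetical URLs.

# Sketch of a smart cache that collapses concurrent requests for the same chunk.
import threading
import urllib.request

_cache: dict[str, bytes] = {}
_inflight: dict[str, threading.Event] = {}
_lock = threading.Lock()

def get_chunk(url: str) -> bytes:
    """Return a chunk, fetching it from upstream at most once per URL."""
    with _lock:
        if url in _cache:
            return _cache[url]               # served locally, no upstream traffic
        event = _inflight.get(url)
        if event is None:                    # we are the first requester
            event = _inflight[url] = threading.Event()
            owner = True
        else:
            owner = False                    # another thread is already fetching it
    if owner:
        with urllib.request.urlopen(url) as resp:
            data = resp.read()               # the single upstream fetch
        with _lock:
            _cache[url] = data
            del _inflight[url]
        event.set()                          # wake all waiting client requests
        return data
    event.wait()                             # wait for the in-flight fetch
    with _lock:
        return _cache[url]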
Next Step
• Do we think there are real problems here
to be solved (even if the four we list aren’t
quite right)?
• Do we think the IETF is the right place to
work on this?
• How many people are interested in these
issues and would like to contribute?
References
• http://www.adobe.com/products/httpdynamicstreaming/
• http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/First-Look-Flash-MediaServer-4-69867.aspx
• http://tools.ietf.org/html/draft-pantos-http-live-streaming-04
• http://www.iis.net/download/smoothstreaming
• http://tools.ietf.org/html/draft-wu-http-streaming-optimization-ps-00
Additional Slides
HTTP Streaming Use Case (1-1)
Live Streaming Media Broadcast

[Figure: message flow between Web Browser1 / Media Decoder/Player1 and the Web Server (fed by the Media Encoder and Media Segmenter from the Audio/Video input): 1. HTTP GET; 2. Presentation Desc.; 3. HTTP GET URL(chunk x req); 4. Fragment x; 5. switching; 6. HTTP GET; 7. Presentation Desc.; 8. HTTP GET URL(chunk y req); 9. Fragment y]

a. The presentation description is used to convey the index of each chunk and its associated metadata.
b. The web browser pulls media from the server chunk by chunk in accordance with the presentation description.
c. The URL is used to tell the server which chunk the client is to request.

 Channel Switching Latency
 Live channel switch: switching between live channels, or switching from a VOD channel to a live channel.
 Additional round trips between the client and the server are needed for the manifest file update before the client can request each new chunk, which could put the real-time character of live streaming at risk.
 Additional round trips for HTTP connection setup also put the real-time character of live streaming at risk.
 Channel Startup Latency
 During startup of a live channel, the client does not know the current time point of the content. Retrieving the new manifest for the live channel results in a poor user experience with a longer waiting time.
HTTP Streaming Use Case (1-2)
Live Streaming Media Broadcast

[Figure: message flow between Web Browser2 / Media Decoder/Player2 and the Web Server (fed by the Media Encoder and Media Segmenter from the Audio/Video input): 1. HTTP GET; 2. Presentation Desc.; 3. HTTP GET URL(chunk x req); 4. Fragment x; 6. HTTP GET URL(chunk y req); 7. Fragment y]

 Latency in the middle of an ongoing live session
 If the client predicts the URL of the new chunk from the previous chunk's header, extra delay is introduced to deep-parse the previous chunk's file header in order to calculate the next chunk URL.
 If the client instead simply calculates the URL of the new chunk from the index of the previous chunk plus one, this latency can be decreased (a small sketch of this follows below).
 Joining a live session in the middle
 Server-initiated push can be used to deliver the media stream in the middle of a live session.
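The "index plus one" shortcut mentioned above can be sketched in a few lines; the URL template is hypothetical and no chunk-header parsing is required.

# Sketch: derive the next chunk URL from the previous chunk's index plus one.
def next_chunk_url(previous_url: str) -> str:
    """Increment the chunk index in a hypothetical .../chunk<N>.ts naming scheme."""
    prefix, _, tail = previous_url.rpartition("chunk")
    index = int(tail.split(".")[0])
    return f"{prefix}chunk{index + 1}.ts"

print(next_chunk_url("http://example.com/live/chunk41.ts"))
# -> http://example.com/live/chunk42.ts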
HTTP Streaming Use Case (2-1)
- Multi-Screen Service Delivery
 Subscriber QoE Feedback
 The client fails to give the server feedback about the experience the user actually had while watching a particular video.
 The server may be paying to stream content that is rarely or never watched.
 The server may have a video that continually fails to start, or content that continually rebuffers, while the content owner receives none of this information.
 The intermediaries in the middle have no capability to report the total bandwidth consumption of the large number of clients behind them.
 Each client in the same network segment cannot request the same content at a different bitrate.
 Suppose the server provides the same content at three bitrates, e.g., 1M, 10M; the client cannot request the content at a bitrate between 1M and 10M.
HTTP Streaming Use Case (3)
- Content publishing into CDN
[Figure: content publishing into a CDN. The Streaming Server (Audio/Video input → Media Encoder → Media Segmenter → Web Server) feeds Web Server1 and Web Server2 in the Content Delivery Network (CDN) over one shared upstream connection; each has a Web Cache and a Smart Cache and serves multiple incoming persistent connections from Web Client1–Web Client4.]

 Upstream Bandwidth Saving Issue
 The upstream bandwidth between the server and the proxy may be over-utilized.
 Server Overload due to Stream Concurrency
 Concurrent streams to different end devices flow from the same web server, which may saturate the web server with too much load.
Architecture Consideration (1)
• Enhanced HTTP Streaming Pull model
 Feedback on the quality of data delivery supported
 Reduced switching latency in the case of live streaming serving
 Allow deployment of smart caches that enable HTTP streaming traffic localization
 Allow the server to send chunk hints to the client
 Allow intermediate entities to parse chunk hints passing through
Architecture Consideration (2)
• Hybrid HTTP Streaming model
 Feedback on the quality of data delivery supported
 Reduced switching latency in the case of live streaming serving
 Allow bidirectional communication between the HTTP client, the HTTP server and intermediate entities (see the sketch below)
 Allow bidirectional communication between servers
 Reduced upstream bandwidth consumption
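A minimal sketch of the bidirectional idea, assuming the third-party Python "websockets" package and a hypothetical endpoint: the server pushes a notification for each new live chunk instead of the client polling for an updated manifest.

# Sketch: server-pushed chunk announcements over a WebSocket (hypothetical endpoint).
import asyncio
import websockets  # assumed third-party dependency (pip install websockets)

async def receive_chunk_notifications() -> None:
    async with websockets.connect("ws://example.com/live/notify") as ws:
        await ws.send("subscribe: live-channel-1")    # hypothetical subscribe message
        while True:
            chunk_url = await ws.recv()                # server-pushed chunk hint
            print("new chunk announced:", chunk_url)   # the client then GETs it over HTTP

asyncio.run(receive_chunk_notifications())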
Suggestions
• Two ways to go:
– Change the existing HTTP protocol to support HTTP streaming, if the challenges can be met by enhancing the HTTP pull model
– Extend the WebSocket protocol to support HTTP streaming, if the challenges cannot be met by enhancing the HTTP pull model
Traditional Streaming Solutions
Conventional streaming solutions:
- use RTP/UDP/IP for media data transport, with the media encapsulated as RTP packets
- RTSP for session control
- SDP for session description
- RTCP for QoS control
Server Overload
 The Nginx web server can serve up to 80 concurrent streams without any problem
 When serving 90 concurrent streams, 1% of all requests take longer than the video duration and buffer underruns may occur
 When serving 140 concurrent streams, the saturation is too high and the mean request time goes over 10 seconds, causing systematic buffer underruns for all connected users
End to End Delay Comparison
• For live streaming:
  End-to-end delay = transmission delay + network delay + playback delay
• For on-demand streaming:
  End-to-end delay = network delay + playback delay
  (a small worked example follows below)
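As a small worked illustration (the delay values are assumed example numbers, not measurements from the slides):

  Live:      end-to-end delay = 2 s (transmission) + 0.5 s (network) + 1 s (playback) = 3.5 s
  On-demand: end-to-end delay = 0.5 s (network) + 1 s (playback)                      = 1.5 s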
Playback control in Existing Model
 Playback control with RTSP
 In Microsoft HTTP streaming, the RTSP headers are embedded in the Pragma headers of HTTP messages.
 In RealNetworks and QuickTime HTTP streaming, the RTSP commands are embedded in HTTP message bodies in base64 encoding.
 Playback control without RTSP
 In other implementations, the client performs playback control by driving HTTP requests or by running a script on the client side.

[Figure: Playback Control Unit, with an RTSP exchange along these lines:]

C: SETUP rtsp://audio.example.com/twister/audio RTSP/1.0
   Transport: rtp/udp; compression; port=3056; mode=PLAY
S: RTSP/1.0 200 OK
   Session: 4231
C: PLAY rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
   Session: 4231
   Range: npt=0-
C: PAUSE rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
   Session: 4231
   Range: npt=37
C: TEARDOWN rtsp://audio.example.com/twister/audio.en/lofi RTSP/1.0
   Session: 4231
S: RTSP/1.0 200 OK