A Perspective on Voice and Data Services Benchmarking

ITU-T QSDG – DUBAI
METHODOLOGIES AND TOOLS FOR AUDITING THE QUALITY
OF SERVICE
[ TOOLS AND METHODOLOGIES]
AGENDA
• Technology changes
• Latest testing methodologies
• Testing techniques
• Sharing experience
Faisal Ghazaleh
ITU-T QSDG, Dubai, November 2014
MESSAGE: DATA TRAFFIC EXPLOSION
[Charts: Global Wearable Devices Traffic Impact; Mobile Data Traffic and Offload Traffic, 2018; 51 Percent of Total Mobile Data Traffic Will Be 4G by 2018; Global Mobile Data Traffic Forecast by Region; Data Hungry Application Distribution]
ANSWER: UNDERSTAND THE NEED TO FOCUS ON CUSTOMERS
Subscribers focus on their perceived experience with the offered service (e.g. voice, mobile video, gaming), while operators need to cost-efficiently manage and control complex 4G networks while coping with high traffic growth.
The wireless operators' problem at its core: 4G ecosystem deployments allow very high data rate applications to be delivered efficiently to a broad range of devices. This raises subscriber expectations toward a fixed-line-like service experience (where mobility is not an excuse), and it fundamentally drives the capacity (QoS) crunch inside the network. Operators therefore face a continuous struggle to maintain high QoE under capacity constraints imposed by spectrum and cost limits.
Solution: customer experience centric network testing and monitoring, and network optimization process automation.
[Diagram: customer expectations and demands vs. capacity management (capacity optimization/load balancing) and coverage/interference control]
MESSAGE: User satisfaction vs. operator challenges
Advanced offerings and complex networks:
• RCS* / Mobile Cloud
• IMS**
• Multi-core
• Multi-RAT
*Rich Communication Suite
**IP Multimedia Subsystem
Keep the testing methodology simple.
Reproducing the same environment requires testing like a customer: using the same application, on the same device, in the geographical location of the customer.
Where T&M is heading
[Diagram: Metrics (IEs, KPIs, QoE) – moving closer to the QoE; CAPEX and OPEX (licenses, common components, automated testing and monitoring) – moving to cost efficiency; moving to the cloud]
REGULATOR CHALLENGES
• KQI/KPI consistency across the different operators (see the sketch after this list).
• Time to publish the results and make them available online.
• Accuracy of the results.
• Frequency of the tests and cost of the benchmarking/auditing.
• Variety of the offered services, such as VoLTE, OTT, VoHSPA, and video-audio services like RCS-e.
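To make the first challenge concrete, here is a minimal sketch of applying identical KPI definitions (call setup success rate and dropped call rate) to per-operator drive-test records so published results stay comparable. The CallRecord fields and the two KPIs chosen are illustrative assumptions, not definitions taken from the presentation.

```python
from dataclasses import dataclass

@dataclass
class CallRecord:
    # Hypothetical per-call drive-test record; field names are assumptions.
    operator: str
    setup_ok: bool   # the call attempt resulted in an established call
    dropped: bool    # the established call ended abnormally

def kpis_per_operator(records):
    """Apply the same KPI definitions to every operator so the published
    benchmark stays consistent across networks."""
    kpis = {}
    for op in {r.operator for r in records}:
        calls = [r for r in records if r.operator == op]
        established = [c for c in calls if c.setup_ok]
        kpis[op] = {
            "call_setup_success_rate_pct": 100.0 * len(established) / len(calls) if calls else None,
            "dropped_call_rate_pct": 100.0 * sum(c.dropped for c in established) / len(established) if established else None,
            "samples": len(calls),
        }
    return kpis

# Two operators measured against the very same definitions.
records = [
    CallRecord("OperatorA", True, False),
    CallRecord("OperatorA", True, True),
    CallRecord("OperatorA", False, False),
    CallRecord("OperatorB", True, False),
    CallRecord("OperatorB", True, False),
]
print(kpis_per_operator(records))
```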
SHARING EXPERIENCE: UNIFIED SOLUTION AND KPIs/KQIs
GLOBAL PRACTICES BY TELECOM REGULATORS
[Diagram: field data collection by TEMS™ Automatic and Investigation; logfile, PSTN, command test; automatic data processing with TEMS™ Discovery Enterprise]
ANSWER: TEST THE NETWORK LIKE A REAL CUSTOMER USES IT: A CUSTOMER CENTRIC TESTING SOLUTION (ODM)
MESSAGE: Testing philosophy: be highly innovative by designing measurement metrics suited to the 4G/5G world: the ODM solution.
• Server resides inside the terminal, managing the supported services (VoLTE, IP Logging, Call Control, etc.)
• Laptop-based drive test solution maintains control and coordination through an On-Device Server
• Enables testing of end-user terminal QoS settings and IP stack characteristics
• Repeatable "control scripts" supported across multiple devices (repeatable testing; see the sketch after this list)
• Provides a framework for future services to be added easily and controlled by a single client (e.g. the Blixt implementation)
• First to test VoLTE-ViLTE as a user via an on-device VoLTE client; test everything on device
[Diagram: On-Device Server managing Service A (VoLTE), Service B (Call Control), Service M (IP Logging); QoS setting per service; TCP/IP (SIP, RTP, etc.)]
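A minimal sketch of the repeatable "control script" idea: a laptop-side script drives services (here VoLTE calling and IP logging) through an on-device server so the same scenario can be replayed on several devices. The OnDeviceServer class and its methods are hypothetical stand-ins, not the actual ODM or TEMS API.

```python
class OnDeviceServer:
    """Hypothetical stand-in for the server inside the terminal that manages
    the supported services (VoLTE, IP Logging, Call Control, ...)."""

    def start_ip_logging(self):
        print("IP logging started")

    def stop_ip_logging(self):
        print("IP logging stopped")

    def volte_call(self, number, hold_seconds):
        print(f"VoLTE call to {number}, holding for {hold_seconds} s")

def control_script(device):
    """The same script can be replayed unchanged on multiple devices,
    which is what makes the testing repeatable."""
    device.start_ip_logging()
    for _ in range(3):                      # three identical call attempts
        device.volte_call("+97150000000", hold_seconds=60)
    device.stop_ip_logging()

control_script(OnDeviceServer())
```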
BENCHMARK LIKE A USER
• Use collected device data to benchmark your network performance against competitors.
• Benchmark network coverage against competitors to ensure customers receive top-quality service indoors and outdoors.
• Quickly see subscriber experience issues using a handset score that aggregates dropped and blocked calls, handover failures, and downlink/uplink throughput analyses (a scoring sketch follows this list).
• Aggregate handset data analytics: clearly display the best and worst performing handsets using six key metrics.
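As a rough illustration of such an aggregate handset score, the sketch below combines the named KPIs into a single number. The normalization, weights, throughput caps, and example values are assumptions for illustration, not the scoring used by the tool described in the slides.

```python
def handset_score(kpis, weights=None):
    """Aggregate per-handset KPIs into one score (higher is better).
    Weights and normalization are illustrative assumptions."""
    weights = weights or {
        "dropped_call_rate_pct": 0.3,
        "blocked_call_rate_pct": 0.2,
        "handover_failure_rate_pct": 0.2,
        "dl_throughput_mbps": 0.2,
        "ul_throughput_mbps": 0.1,
    }
    score = 0.0
    # Failure rates are penalties (lower is better).
    score += weights["dropped_call_rate_pct"] * (100 - kpis["dropped_call_rate_pct"])
    score += weights["blocked_call_rate_pct"] * (100 - kpis["blocked_call_rate_pct"])
    score += weights["handover_failure_rate_pct"] * (100 - kpis["handover_failure_rate_pct"])
    # Throughputs are rewards, normalized against assumed nominal targets.
    score += weights["dl_throughput_mbps"] * min(kpis["dl_throughput_mbps"], 100)
    score += weights["ul_throughput_mbps"] * min(kpis["ul_throughput_mbps"], 50) * 2
    return score

handsets = {
    "Handset A": {"dropped_call_rate_pct": 1.2, "blocked_call_rate_pct": 0.8,
                  "handover_failure_rate_pct": 0.5, "dl_throughput_mbps": 42.0,
                  "ul_throughput_mbps": 11.0},
    "Handset B": {"dropped_call_rate_pct": 3.5, "blocked_call_rate_pct": 2.1,
                  "handover_failure_rate_pct": 1.9, "dl_throughput_mbps": 28.0,
                  "ul_throughput_mbps": 7.0},
}
for name, kpis in sorted(handsets.items(), key=lambda kv: -handset_score(kv[1])):
    print(f"{name}: {handset_score(kpis):.1f}")
```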
SHARING EXPERIENCE: CUSTOMER AGENTS
REGULATOR CHALLENGES
• Monitor a targeted group of end users based on complaints, such as users in a certain location or with a certain mobile phone.
• Regulators do not have access to the operators' OSS data.
• Interact with the end user right after the event.
• Reach customers wherever they are across the country.
GLOBAL PRACTICES BY TELECOM REGULATORS
Engineering targeting – visualized through heat maps (a binning sketch follows below).
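A minimal sketch of how geolocated measurements could be binned into a grid for heat-map visualization; the grid size, the mean aggregation, and the RSRP example values are illustrative assumptions rather than anything specified in the presentation.

```python
from collections import defaultdict

def heatmap_bins(samples, cell_deg=0.01):
    """Bin geolocated measurements (lat, lon, value) into a grid so they can
    be rendered as a heat map for engineering targeting."""
    bins = defaultdict(list)
    for lat, lon, value in samples:
        key = (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        bins[key].append(value)
    # Average the samples falling into each grid cell.
    return {key: sum(values) / len(values) for key, values in bins.items()}

# Example: hypothetical RSRP samples (dBm) reported by field agents.
samples = [(25.2048, 55.2708, -95), (25.2051, 55.2712, -97), (25.2154, 55.2802, -110)]
print(heatmap_bins(samples))
```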
Indoor / Outdoor
LIGHT drive testing
• Single/multi-probe analysis
• Test drill-down and replay
• Cell info on map
Handset functionality
• Wide range of service testing, both in interactive mode and as a remote-controlled probe.
SHARING EXPERIENCE: INNOVATIVE NEW KQIs
BLIXT™ ABM COMPARED TO FTP (DL EXAMPLE)
• Throughput values for DL ABM correlate extremely well with FTP.
• For ABM, DL and UL throughput and delay are obtained simultaneously.
FTP intrusiveness = 100%; ABM intrusiveness* = 5%, i.e. 95% less resources (see the arithmetic sketch below).
[Chart legend: Blixt actual PHY (low); FTP actual PHY (high); Blixt app layer throughput; FTP app layer throughput]
Blixt by Ascom © ASCOM 2014
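The 95% figure follows directly from the two intrusiveness values on the slide. A minimal worked example, treating intrusiveness as the share of radio resources the measurement occupies while it runs (that reading is an assumption on my part):

```python
def resource_saving(ftp_intrusiveness=1.00, abm_intrusiveness=0.05):
    """FTP saturates the link during the test (100% intrusiveness), while an
    available-bandwidth method such as ABM only probes it (about 5% per the
    slide); the relative resource saving follows directly."""
    return 1.0 - abm_intrusiveness / ftp_intrusiveness

print(f"{resource_saving():.0%} less resources")  # -> 95% less resources
```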
ABM ACCURACY AND LEVEL OF INTRUSIVENESS
The accuracy displayed in the table on this slide is a comparison of the Blixt™ result against the actual available bandwidth in the network under test. It is not a comparison with FTP. FTP as a test method is likely to be significantly less accurate than Blixt™ under non-ideal conditions.
[Chart annotations: effective zone of ABM; less suitable region. Measurements done in a perfect radio environment.]
Common Questions Answered
• "Is it not better to use FTP since it is a standard application and better represents the user experience?" Answer: FTP is clearly not an application commonly used by smartphone users. The objective of testing available bandwidth is to determine this characteristic of the network, not how well FTP performs in the network on a specific device, which is essentially what the FTP legacy method measures.
• "How do we know the results are accurate?" Answer: Other customers have tested the technology and found it to be accurate. In reality, FTP is less likely to give an accurate result, as it is more susceptible to being negatively affected by conditions of the test session unrelated to the available bandwidth.
• "We are happy with our current test methodology, why would we change?" Answer: With legacy technologies such as 3G, FTP and similar legacy methods may be perfectly adequate since the available bandwidth is modest. With LTE bandwidth approaching or exceeding 100 Mbps, these legacy methods become at best highly impractical and at worst impossible to use, for a variety of reasons. Please see the "Speedtest vs Blixt™" comparison.
CONCLUSIONS
• PLEASE VISIT OUR WEBSITE: http://www.ascom.com/nt/en/index-nt/about-us-network-testing/nt-about-usresources.htm/
• White papers: VoLTE, Video Streaming, HetNets, Carrier Aggregation; and watch this space: eMBMS testing to come soon.
• Informa Webinar: Advanced testing with Ascom in LTE networks
• Webinars: VoLTE
Dr. Irina Cotanis
ITU-T QSDG, Dubai, November 2014
QUESTIONS?
THANK YOU
CONCLUSIONS
A FULL 24/7 QoE/QoS CYCLE SOLUTION FOR MOBILE NETWORKS AND SERVICES, DESIGNED TO MEET CUSTOMERS' SATISFACTION
[Diagram: my customer experience data; my competitors' customers (benchmark); my customers, real field agents; automated data correlation and presentation; automated "what," "why" and engineer-trusted "how" scenarios]
Providing network and customer experience centric diagnoses for voice (VoLTE, OTT, VoHSPA) and video-audio services (OTT, RCS-e/Joyn).
Built-in best-practice scripted data analytics provide automated root cause analysis (a toy rule-chain sketch follows).
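To illustrate what scripted best-practice analytics for automated root cause analysis might look like, here is a toy rule chain mapping a measurement ("what") to a likely cause ("why") and a suggested action ("how"). The thresholds, field names, and rules are invented for illustration and are not the product's actual logic.

```python
def root_cause(sample):
    """Toy rule chain in the spirit of scripted best-practice analytics:
    return (what, why, how) for a single measurement sample."""
    if sample["dropped"] and sample["rsrp_dbm"] < -115:
        return ("dropped call", "poor coverage at the drop location",
                "review antenna tilt or add coverage in the affected area")
    if sample["dropped"] and sample["sinr_db"] < 0:
        return ("dropped call", "high interference despite adequate signal level",
                "check neighbour configuration and interference sources")
    if not sample["dropped"] and sample["dl_throughput_mbps"] < 1:
        return ("low throughput", "likely congestion or backhaul limitation",
                "inspect cell load and backhaul utilisation")
    return ("no issue detected", "-", "-")

print(root_cause({"dropped": True, "rsrp_dbm": -119, "sinr_db": 5, "dl_throughput_mbps": 3}))
```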