Sonal Mahajan - University of Southern California


Root Cause Analysis for HTML Presentation
Failures using Search-Based Techniques
Sonal Mahajan, Bailan Li, William G.J. Halfond
Department of Computer Science
University of Southern California
What is a presentation failure?
• Web page rendering ≠ expected appearance
[Figures: expected appearance (oracle) vs. web page rendering]
– Difference 1: Alignment problem
– Difference 2: Color problem
– Difference 3: Style problem
Presentation Failures
• Common in modern web applications
– Highly complex
– Dynamic nature of HTML, CSS, and JavaScript
• Difficult to diagnose and debug
– Each page has hundreds of HTML elements
– Each HTML element contains several styling
properties
Why is handling presentation failures
important?
• The presentation of a website
– factors into company branding
– gives the first impression of your business
• Presentation failures can
– impact usability
– create a negative perception of quality
When do presentation failures occur?
1. Front-end developer did not comply with the pixel-perfect implementation requirement [1]
2. Refactoring of UI
3. Web application was not tested sufficiently
Need to Debug Presentation Failures
• Throughout the development process
• Three such scenarios:
1. Presentation Development Testing
2. Regression Debugging
3. Standard Debugging
1. Presentation Development Testing
• Front-end developers
– Expected to convert mockups to “pixel perfect”
template pages
“Pixel-perfect” pages… Is it reasonable?
1. Presentation Development Testing
• Front-end developers
– Expected to convert mockups to “pixel perfect”
template pages
• Back-end developers
– Change templates by adding dynamic content
• Test to check if the implemented page is
compliant with the given mockup
• Expected appearance (oracle) -> mockup
2. Regression Debugging
• Changes to code after initial implementation
– E.g.: Refactoring page from <table> based layout
to <div> based layout
• Changes not intended to change appearance
• Change may have direct or indirect impact
• Test for presentation failures and debug to
find responsible HTML elements
• Expected appearance (oracle) -> previous
correct version of the page
3. Standard Debugging
• Make corrective code changes based on bug
reports
– E.g.: Resolve user-reported failures
• Reproduce the failure and debug
• Expected appearance (oracle) -> marked
screenshot with failure area
What is the root cause of a presentation
failure?
• Root cause = <faulty HTML element, faulty visual property>
• Faulty visual property -> CSS property or HTML attribute
Limitations of Related Approaches
• Manual interaction
– Browser developer tools (e.g.: Firebug)
– Labor-intensive and error-prone
• Selenium, Sikuli
– Require correctness invariants to be exhaustively specified
• Cross-browser testing
– Cannot report the exact root cause (the faulty visual property)
• Fighting layout bugs
– Application-independent checks; cannot report a root cause
• DOM differencing
– Techniques such as XBT, GUI differencing, automated oracles
– Assume a “golden” version of the page
– Cannot be used if there is no golden version or the DOM has changed
Simple Approach
• Brute force exploration of possible root cause
space
1. Substitute different values for each root cause
2. Compare web page and oracle
3. If same appearance, stop, else continue
• Limitation
– Large universe of possible values
• E.g.: Margin property: [-∞, +∞]
• Color property: 16 million colors
– Very expensive
New Idea
Use image processing to define root cause
analysis as a search-based technique
• Key Insights
1. Image processing defines a successful search
• Compare web page and oracle
• Correct root cause identified
2. Image processing guides the search
• Fitness functions (e.g., minimizing difference pixels)
Mapping Root Cause Analysis to a
Search-Based Problem
• Motivations
– Large search space of root causes
– Image processing to define search parameters
– Availability of oracle image -> natural form of
invariant specifications
• Use genetic algorithm
Genetic algorithm
• Population: Possible values for a visual property
• Initial population: Generated randomly
• Selection: Linear ranking
• Crossover: One point
• Mutation: Uniform mutation
• Fitness function: Minimize visual differences
• Stopping criteria: web page = oracle
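
A minimal Python sketch of a genetic algorithm with this configuration, for a single numeric visual property. It is only a sketch under assumptions: a bounded, non-negative integer search space (the experiments limit the search space anyway) and a caller-supplied fitness function; all names and defaults are illustrative, not taken from the authors' implementation.

import random

def linear_ranking_select(population, fitness, num_parents):
    # Rank individuals by fitness (lower is better: fewer visual differences),
    # then sample parents with probability proportional to rank.
    ranked = sorted(population, key=fitness)
    n = len(ranked)
    weights = [n - i for i in range(n)]            # best individual gets weight n
    return random.choices(ranked, weights=weights, k=num_parents)

def one_point_crossover(a, b, bits=16):
    # Treat the two candidate values as fixed-width bit strings and swap suffixes.
    point = random.randint(1, bits - 1)
    mask = (1 << point) - 1
    return (a & ~mask) | (b & mask), (b & ~mask) | (a & mask)

def uniform_mutation(value, low, high, rate=0.1):
    # With a small probability, replace the value with a uniform random value in range.
    return random.randint(low, high) if random.random() < rate else value

def genetic_search(fitness, low, high, pop_size=20, generations=100):
    # Assumes 0 <= low <= high (bounded, non-negative integer search space).
    population = [random.randint(low, high) for _ in range(pop_size)]
    for _ in range(generations):
        best = min(population, key=fitness)
        if fitness(best) == 0:                     # stopping criterion: web page = oracle
            return best
        parents = linear_ranking_select(population, fitness, pop_size)
        next_gen = []
        for i in range(0, pop_size, 2):
            c1, c2 = one_point_crossover(parents[i], parents[i + 1])
            # Clamp mutated children back into the allowed range.
            next_gen.append(min(max(uniform_mutation(c1, low, high), low), high))
            next_gen.append(min(max(uniform_mutation(c2, low, high), low), high))
        population = next_gen
    return min(population, key=fitness)            # best value found within the budget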
Core Idea
• Try different values for a candidate root cause
• Fitness value = compare web page and oracle
• If max. fitness value (web page = oracle)
– Stop
• Else
– Continue search
Example
[Figures: oracle and test web page screenshots]
• Candidate root cause: <div, padding>
• Population: [-∞, +∞]
• Initial population: {20, 50, 100, …, 0, 5}
Example (… contd.)
• Match found!
• Correct root cause found!
Basic Technique
1. Detect presentation failure -> faulty HTML element
2. Find root cause -> faulty visual property
Prior work: WebSee [2]
• Goal: Detect and localize presentation failures
• Input: Test web page, oracle
• Output: Prioritized list of HTML elements
• Phases
1. Detection: Image processing techniques to find visual
differences
2. Localization: Maps HTML elements to visual
differences
3. Result set processing: Prioritizes HTML elements
based on heuristics
Basic Technique (… contd.)
2. Find root cause -> faulty visual property
Classification of Visual Properties
• For effective use of search-based techniques, define an appropriate fitness function
• Classify visual properties based on their impact on the rendering of an HTML
element
1. Size and Position
2. Color
3. Predefined values
Category 1: Size and Position
• E.g.: margin, padding, height, width
• Numeric values
• Population: [-∞, +∞]
• Fitness function
– Minimize number of difference pixels
[Plot: number of difference pixels vs. property value]
Example
[Figures: oracle and test web page screenshots]
Example
• e = { <div style=“padding: 10px;”>...</div> }
• Number of difference pixels = 300
• Value = 50px -> No. of difference pixels = 2,100
• Value = 2px -> No. of difference pixels = 175
• ...
• Value = 5px -> No. of difference pixels = 0
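
One way to realize the Category 1 fitness computation, assuming Pillow is used for the image comparison (an assumption; the slides do not give the image-processing details). The function counts the pixels at which the rendered test page differs from the oracle, so a count of 0 means the page matches the oracle, as in the 5px case above.

from PIL import Image, ImageChops  # Pillow, assumed available

def difference_pixels(test_screenshot_path, oracle_path):
    """Count the pixels that differ between the rendered test page and the oracle."""
    # Assumes both screenshots have the same dimensions.
    test = Image.open(test_screenshot_path).convert("RGB")
    oracle = Image.open(oracle_path).convert("RGB")
    diff = ImageChops.difference(test, oracle).convert("L")   # 0 where pixels match
    return sum(1 for p in diff.getdata() if p != 0)

# Fitness of a candidate padding value = difference_pixels() of a screenshot rendered
# with that value substituted into the <div>, vs. the oracle. The search minimizes it.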
Category 2: Color
• E.g.: text color, background-color, border-color
• Color value
– 140 color names
– 16 million colors (#000000 to #FFFFFF)
• Population: [#000000, #FFFFFF]
• Fitness function
– Minimizing the number of difference pixels -> not useful
– Determining the expected color from the oracle -> complex
– Instead, minimize the color distance
Category 2: Color analysis (… contd.)
• Color distance: Euclidean distance between RGB
• Oracle_avg = average color in the oracle screenshot
• Test_avg = average color in the test web page screenshot
• Color distance = dist(Oracle_avg, Test_avg)
[Plot: color distance vs. property value]
• Final check -> full image comparison
Example
[Figures: oracle and test web page screenshots]
Example
• e = { <div style=“color: #000000;”>...</div> }
• Average oracle color = #FFA000
• Average test screenshot color = #8E8E8E
• Color distance = 369
• Value = #FFFFFF -> color distance = 394
• Value = #FFF000 -> color distance = 32
• ...
• Value = #FF0000 -> color distance = 0
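
A corresponding sketch of the Category 2 fitness, again assuming Pillow for screenshot handling; the average-color and Euclidean-distance computations follow the definitions on the color-analysis slide, but the function names are illustrative.

from PIL import Image, ImageStat  # Pillow, assumed available

def average_color(path):
    """Per-channel mean (R, G, B) of a screenshot."""
    img = Image.open(path).convert("RGB")
    return ImageStat.Stat(img).mean            # e.g. [142.0, 142.0, 142.0]

def color_distance(test_screenshot_path, oracle_path):
    """Euclidean distance between the average colors of the two screenshots."""
    t = average_color(test_screenshot_path)
    o = average_color(oracle_path)
    return sum((a - b) ** 2 for a, b in zip(t, o)) ** 0.5

# Once a candidate color with distance 0 is found, a full image comparison
# (e.g. difference_pixels from the earlier sketch) serves as the final check.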
Category 3: Predefined values
• E.g.: font-style, display, font-family, border-style
• Set of discrete predefined values
– font-style = {italic, oblique, normal}
• Exploration method
– No notion of “closeness” to guide the search
• Genetic algorithm not used
– Use exhaustive exploration instead
– Not very expensive (a value set has at most 21 elements, 5 on average)
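
For Category 3 a plain exhaustive loop suffices because the value sets are small. A sketch, where fitness is any caller-supplied function that returns 0 when the rendered page matches the oracle.

def exhaustive_search(predefined_values, fitness):
    """Try every predefined value; return the first that makes the page match the oracle."""
    for value in predefined_values:
        if fitness(value) == 0:     # zero visual difference: web page = oracle
            return value
    return None                     # no value in the set is the correct root cause

# e.g. exhaustive_search(["italic", "oblique", "normal"], fitness) for font-style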
Experiment
• Evaluate accuracy
• Compare results with random search
• Evaluated for Categories 1 and 2 only
• Subject application: Gmail homepage
• Oracle: Gmail homepage screenshot
• Test cases: Seeded faults
Implementation steps
• Goal: Find root cause of presentation failure
• Input:
1. P: Test web page
2. O: oracle
3. E: set of potentially faulty HTML elements
(provided by WebSee)
• Output: Root cause <HTML element, visual
property>
Implementation steps (… contd.)
1. Find possible root cause space
2. Find pool of possibly correct values for each root
cause
3. Use genetic algorithm to select candidate value
4. Substitute selected value in web page
5. Compare web page and oracle
6. If web page = oracle, then return root cause
7. Else, continue
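
A sketch of how steps 1–7 might fit together as a driver loop. The helper callables (properties_of, value_pool, fitness_for) are assumptions introduced only for illustration, and genetic_search refers to the GA sketch shown earlier.

def find_root_cause(elements, properties_of, value_pool, fitness_for):
    """Search the possible root cause space <element, visual property> for a value
    that makes the rendered page match the oracle.

    elements      : prioritized list of potentially faulty HTML elements (from WebSee)
    properties_of : element -> candidate visual properties (possible root cause space)
    value_pool    : property -> (low, high) pool of possibly correct values
    fitness_for   : (element, property) -> fitness function comparing page vs. oracle
    """
    for element in elements:
        for prop in properties_of(element):
            fitness = fitness_for(element, prop)
            low, high = value_pool(prop)
            best = genetic_search(fitness, low, high)   # from the GA sketch above
            if fitness(best) == 0:                      # web page = oracle
                return element, prop, best              # root cause found
    return None                                         # no root cause in the explored space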
Experimental Procedure
• Total of 37 test cases
• Run both our approach and random search 5 times on each test case: 37 * 5 * 2 = 370 executions
• Limit the search space so the experiment runs within 24 hours: 24 hours / 370 executions ≈ 3.89 min per execution
• Terminate the random approach based on the genetic algorithm's budget
Experimental results
Category      RCA     Random Search   Test #
1. Numeric    100%    59%             30
2. Color      100%    37%             7
Total         100%    55%             37
Experimental results
• Conclusions
– Validates feasibility of our search-based approach
– Outperforms random search
• Threats to validity
– Restriction on the search space
– Small sample of web applications
Future Work
• Improve performance
– Improve search space initialization
• E.g.: For category 1, use sub-image searching
– Prioritize visual properties
• Create a comprehensive search framework
• Improve fitness functions
• Handle the limitation that the faulty property must be present
• Handle multiple failures
• Evaluate on several real web applications
Summary
1. Technique for automatic root cause analysis
2. Root cause analysis mapped to a search problem
3. Helpful in debugging presentation failures
4. No HTML/CSS expertise required
5. High accuracy compared to random search
References
1. Front-end Developers Job Postings. http://www-scf.usc.edu/~spmahaja/frontend-job-postings/, Apr 2014.
2. S. Mahajan and W. G. J. Halfond. Finding HTML Presentation Failures Using Image Comparison Techniques. In submission, 2014.