Using Semantic Caching to Manage Location Dependent Data in Mobile Computing
2003.3.18
CS 744
Database Lab.
Se-Kyoung Huh
Copyright © 2003 KAIST All Rights Reserved.
Contents
Background
Semantic Cache
Modeling LDD Query
LDD Semantic Cache Index
LDD Query Processing
LDD Cache Management
Experiment
Conclusion
Background
Characteristics of mobile computing
Largely overlapping results for continuous queries
Frequent disconnections
Advantages of caching data in mobile computing
Reduced wireless network traffic
Improved system performance
Semantic Cache
Semantic Cache vs. Page Cache

            | Semantic Cache                | Page Cache
Granularity | Query result                  | Database tuples or pages
Content     | Semantic information + result | Result
Advantages of a semantic cache for LDD (Location Dependent Data)
Stronger semantic locality than spatial locality in LDD applications
Possibility of flexible cache management
Semantic information can still be used in disconnected situations
Modeling LDD Query
Q = “Give me the names of the hotels within 20 miles whose prices are below $100”
Qp = (price < 100) ∩ (Lx-20 < xposition <= Lx+20) ∩ (Ly-20 <= yposition < Ly+20)
(Lx, Ly): the current user position
Assumption: the reference point is given
The query is dependent on the current user position
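The position-dependent predicate Qp can be sketched as a small function. A minimal, hypothetical sketch, assuming hotels are plain dicts with price/xposition/yposition fields (the field names are illustrative, not from a real API):

```python
# Hypothetical sketch: bind the example query Qp to a user position (Lx, Ly).
# Field names (price, xposition, yposition) are illustrative only.

def make_qp(Lx, Ly, radius=20, max_price=100):
    """Predicate for: hotels within `radius` miles priced below `max_price`."""
    def qp(hotel):
        return (hotel["price"] < max_price
                and Lx - radius < hotel["xposition"] <= Lx + radius
                and Ly - radius <= hotel["yposition"] < Ly + radius)
    return qp

# Because Qp embeds (Lx, Ly), a moving user re-binds the predicate each time:
qp_here = make_qp(Lx=50, Ly=50)
print(qp_here({"price": 80, "xposition": 60, "yposition": 45}))  # prints True
```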
LDD Semantic Cache Index
Semantic information
Each semantic segment S = (SR, SA, SP, SL, SC, STS) serves as the index for a cached result:

S  | SR (Table) | SA (Attribute)  | SP (Predicate)                                                | SL (Bound position) | SC (Index for cache result) | STS (Time Stamp)
S1 | Hotel      | Hname           | (Lx-5 <=hxpos<=Lx+5) ∩ (Ly-5 <=hypos <=Ly+5)                  | 10,20               | 2                           | T1
S2 | Rest.      | Rname, Type     | (Lx-10<=rxpos<=Lx+10) ∩ (Ly-10 <=rypos <=Ly+10) ∩ (6<=sched<=9) | -5,15             | 5                           | T2
S3 | Hotel      | Hname, Vacancy  | (Lx-5 <=hxpos<=Lx+5) ∩ (Ly-5<=hypos<=Ly+5) ∩ (Price<100)      | -5,-20              | 8                           | T3
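One entry of this index can be held as a small record. A minimal sketch, assuming an in-memory store; the field names follow the slide (SR, SA, SP, SL, SC, STS):

```python
# Sketch of one semantic-segment index entry (S1 from the table above).
# The predicate is kept as a string with Lx/Ly symbolic; a real cache would
# store it in an evaluable form.
from dataclasses import dataclass

@dataclass
class Segment:
    SR: str      # source table
    SA: tuple    # projected attributes
    SP: str      # selection predicate
    SL: tuple    # bound position of the cached result
    SC: int      # index (pointer) to the cached content
    STS: str     # time stamp

s1 = Segment(SR="Hotel", SA=("Hname",),
             SP="(Lx-5<=hxpos<=Lx+5) ∩ (Ly-5<=hypos<=Ly+5)",
             SL=(10, 20), SC=2, STS="T1")
```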
LDD Query Processing
Relationship between the query and the cache
If the query is fully contained in the cache
Use the cache for query processing
If the query is only partly contained in the cache
Split the query into
The part satisfied by the cache
» found by checking all segments in the cache
The part not satisfied by the cache
Send only the unsatisfied part to the server
Coalesce all partial query results
Add the new query result into the cache
Segments must be decomposed to prevent duplicated cache segments
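The split-and-decompose steps above can be sketched for the simple case where both the query region and the cached segments are axis-aligned rectangles; real semantic predicates are more general, so the geometry here is only illustrative:

```python
# Toy sketch: split a rectangular query into the parts answerable from cached
# segments (probes) and the parts that must go to the server (remainders).
# Rectangles are (x1, y1, x2, y2) with x1 < x2 and y1 < y2.

def intersect(q, s):
    x1, y1 = max(q[0], s[0]), max(q[1], s[1])
    x2, y2 = min(q[2], s[2]), min(q[3], s[3])
    return (x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

def split_query(query, segments):
    probes, remainders = [], [query]
    for seg in segments:
        next_rem = []
        for rect in remainders:
            hit = intersect(rect, seg)
            if hit is None:
                next_rem.append(rect)
                continue
            probes.append(hit)
            # Decompose the rectangle around the hit (up to four pieces) so
            # no area is covered twice -- the duplicate-prevention step.
            x1, y1, x2, y2 = rect
            hx1, hy1, hx2, hy2 = hit
            if y1 < hy1: next_rem.append((x1, y1, x2, hy1))
            if hy2 < y2: next_rem.append((x1, hy2, x2, y2))
            if x1 < hx1: next_rem.append((x1, hy1, hx1, hy2))
            if hx2 < x2: next_rem.append((hx2, hy1, x2, hy2))
        remainders = next_rem
    return probes, remainders
```

For a query (0, 0, 10, 10) and one cached segment (5, 5, 15, 15), the probe is (5, 5, 10, 10) and the remainder rectangles cover the rest of the query region without overlap.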
LDD Cache Management
Replacement principle
Incorporates the status of the mobile user
The moving direction
The distance from each cache segment
LDD Cache Management (cont’d)
FAR algorithm
Divide the cache segments into
In-Direction set: segments in the user’s moving direction
Out-Direction set: segments not in the user’s moving direction
Choose the victim from the Out-Direction set
If the Out-Direction set is empty
Choose as the victim the furthest segment in the In-Direction set
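The FAR victim choice can be sketched as follows. This is a minimal, hypothetical sketch: it represents each segment by its center point and the user's movement by a direction vector, neither of which the slide fixes:

```python
# Sketch of the FAR replacement choice: prefer to evict segments that are not
# in the user's moving direction; among candidates, evict the furthest one.
import math

def far_victim(segments, user_pos, direction):
    """segments: list of (segment_id, (cx, cy)); returns the id to evict."""
    ux, uy = user_pos
    dx, dy = direction

    def dist(center):
        return math.hypot(center[0] - ux, center[1] - uy)

    def in_direction(center):
        # "In direction" if the segment lies ahead of the user, i.e. the
        # displacement has a positive dot product with the movement vector.
        return (center[0] - ux) * dx + (center[1] - uy) * dy > 0

    out_set = [s for s in segments if not in_direction(s[1])]
    candidates = out_set if out_set else segments  # fall back to In-Direction set
    return max(candidates, key=lambda s: dist(s[1]))[0]
```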
Experiment
Page Caching vs. Semantic Caching
The database is neither indexed nor clustered
Semantic caching performs better
Due to greatly reduced wireless network traffic
Only the required data is transferred
Experiment (cont’d)
Page Caching vs. Semantic Caching (cont’d)
With an index on x and column-wise scan clustering
Page caching becomes better than in the non-indexed database
Because pages can be located without scanning the database
Experiment (cont’d)
Page Caching vs. Semantic Caching (cont’d)
With an index on x and column-wise scan clustering
Page caching is sensitive to the organization of the database
Experiment (cont’d)
Comparison of several replacement policies
FAR performs better than LRU or MRU
Conclusion
Contribution
Proposes the semantic cache concept for mobile computing
Weakness
The cache replacement policy: is it always possible to predict the user’s moving direction?