Ilayaraja N's research while affiliated with PSG College of Technology and other places

Publications (7)

Chapter
Mobile users expect context-aware spatio-temporal data while they use location-based services. Caching data at the mobile client reduces execution time and improves the performance of the services. An efficient cache replacement policy is considered important to retain the useful items and evict the others from the cache. This paper proposes a cach...
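The specific replacement policy this chapter proposes is cut off in the abstract; purely as a point of reference, the sketch below shows the simplest classical policy (least-recently-used) that such location-aware proposals are usually compared against. All names are illustrative, not taken from the chapter.

```python
from collections import OrderedDict

class LRUCache:
    """Generic illustration of a cache replacement policy: keep recently
    used items, evict the least recently used one when the cache is full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # key -> cached data item

    def get(self, key):
        if key not in self.items:
            return None              # cache miss: item must be fetched from the server
        self.items.move_to_end(key)  # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used item
```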
Article
Full-text available
Rapid growth in wireless communications gives mobile users the ability to search for local emergency services, business events, entertainment activities and other necessary information at any time, from anywhere, using any device. Recent research on mobility and spatial–temporal data focuses on techniques for improving data availability and reducing latenc...
Article
Location-dependent information systems (LDIS) have received more and more attention recently with the wide availability of mobile devices (smartphones, tablets, notebooks, iPads, etc.). Location-dependent queries (LDQ) such as range queries, window queries and nearest neighbor (NN) queries are gaining popularity. One of the most important LDQs is the closest p...
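The truncated abstract appears to introduce the closest-pair query; for orientation only, a closest-pair query over two spatial datasets can be answered by a brute-force scan, as in the hypothetical sketch below. Real LDIS implementations would use spatial indexes (e.g. R-trees) rather than this quadratic baseline.

```python
import math

def closest_pair(set_a, set_b):
    """Brute-force closest-pair query: return the pair (a, b), one point from
    each dataset, with the smallest Euclidean distance.
    Points are (x, y) tuples; O(|A| * |B|), for illustration only."""
    best_pair, best_dist = None, math.inf
    for a in set_a:
        for b in set_b:
            d = math.dist(a, b)
            if d < best_dist:
                best_pair, best_dist = (a, b), d
    return best_pair, best_dist

# Example: nearest (hotel, restaurant) pair
hotels = [(0.0, 0.0), (2.0, 3.0)]
restaurants = [(1.0, 1.0), (5.0, 5.0)]
print(closest_pair(hotels, restaurants))  # ((0.0, 0.0), (1.0, 1.0)), ~1.414
```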
Conference Paper
A new model for semantic caching of location dependent data in mobile environments is proposed. In the proposed model, semantic descriptions are designed to dynamically answer the nearest neighbor queries (NNQ) and range queries (RQ) from the client cache. In order to accurately answer user queries, the concept of partial objects is introduced in t...
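The abstract only names the ingredients (semantic descriptions, partial objects); the sketch below shows one plausible way a semantic cache can decide how much of a range query is answerable locally, assuming axis-aligned rectangular query regions. The "partial" case corresponds to the partial-object idea: part of the answer comes from the client cache and the remainder must still be fetched from the server. Structure and names are assumptions, not the paper's model.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned rectangle used as a semantic description of a cached result."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float

    def contains(self, other):
        return (self.xmin <= other.xmin and self.ymin <= other.ymin and
                self.xmax >= other.xmax and self.ymax >= other.ymax)

    def overlaps(self, other):
        return not (self.xmax < other.xmin or other.xmax < self.xmin or
                    self.ymax < other.ymin or other.ymax < self.ymin)

def probe_semantic_cache(cached_regions, query):
    """Classify a range query against the cached semantic regions:
    'full'    -> answerable entirely from the client cache,
    'partial' -> part of the answer is cached, the remainder must be fetched,
    'miss'    -> nothing useful is cached."""
    if any(r.contains(query) for r in cached_regions):
        return "full"
    if any(r.overlaps(query) for r in cached_regions):
        return "partial"
    return "miss"
```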
Conference Paper
Data caching in mobile clients is an important technique to enhance data availability and to improve data access time. Due to cache size limitations, cache replacement policies are used to find a suitable subset of items for eviction from the cache. In this paper, we study the issues of cache replacement for location-dependent data under a geometri...
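The geometric model itself is cut off in the abstract; a common ingredient of location-dependent replacement policies is to weight an item's access probability by its distance from the client's current position and evict the item with the lowest score. The score formula below is an assumption for illustration, not the policy studied in the paper.

```python
import math

def replacement_scores(cache, client_pos):
    """Compute a location-aware replacement score for each cached item.
    cache: dict item_id -> {'pos': (x, y), 'access_prob': float}
    Lower score = better eviction candidate (far away and rarely accessed)."""
    scores = {}
    for item_id, meta in cache.items():
        distance = math.dist(client_pos, meta['pos'])
        scores[item_id] = meta['access_prob'] / (1.0 + distance)
    return scores

def evict_one(cache, client_pos):
    """Evict the cached item with the minimum replacement score."""
    scores = replacement_scores(cache, client_pos)
    victim = min(scores, key=scores.get)
    del cache[victim]
    return victim
```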
Conference Paper
This paper proposes a cache management method that pre-fetches the data items with the maximum prefetch score and evicts the cached data items with the minimum replacement score. The strategy is to pre-fetch the most probable secondary services based on the user's query pattern. The client cache is partitioned into three sections to plac...
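The abstract names a prefetch score, a replacement score, and a three-way cache partition, but the details are cut off; the skeleton below only mirrors that structure. The section names and score formulas are assumptions made for the sake of a runnable example.

```python
class PartitionedCache:
    """Sketch of a client cache split into three sections (names assumed):
    queried results, pre-fetched secondary services, and regular cached items."""

    def __init__(self, capacity_per_section):
        self.capacity = capacity_per_section
        self.sections = {'queried': {}, 'prefetched': {}, 'regular': {}}

    def prefetch_score(self, item):
        # Assumed: probability that this secondary service is requested next,
        # estimated from the user's query pattern.
        return item['next_request_prob']

    def replacement_score(self, item):
        # Assumed: recency-weighted access frequency.
        return item['access_count'] / (1.0 + item['age'])

    def admit(self, section, item_id, item):
        bucket = self.sections[section]
        bucket[item_id] = item
        if len(bucket) > self.capacity:
            # Evict the item with the minimum replacement score in this section.
            victim = min(bucket, key=lambda k: self.replacement_score(bucket[k]))
            del bucket[victim]

    def prefetch(self, candidate_ids, item_store):
        # Pre-fetch the candidate secondary service with the maximum prefetch score.
        best = max(candidate_ids, key=lambda c: self.prefetch_score(item_store[c]))
        self.admit('prefetched', best, item_store[best])
```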
Article
In this paper, new strategies for prefetching and cache replacement are proposed. The proposed pre-fetching algorithm considers the geographical and semantic adjacency between queried items. The strategy is to pre-fetch the most probable secondary service as a by-product of the execution of a query to a primary service. Association rule mining is u...
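The abstract cuts off at association rule mining; the sketch below shows the basic idea, assumed here to mean mining primary-to-secondary service rules from past query sessions and prefetching the consequent with the highest confidence when the primary service is queried. The thresholds, data layout, and example values are assumptions.

```python
from collections import Counter

def mine_rules(sessions, min_support=0.2, min_confidence=0.5):
    """Mine simple one-to-one association rules (primary -> secondary service)
    from query sessions, where each session is a set of queried services."""
    n = len(sessions)
    item_count = Counter()
    pair_count = Counter()
    for session in sessions:
        for item in session:
            item_count[item] += 1
        for a in session:
            for b in session:
                if a != b:
                    pair_count[(a, b)] += 1
    rules = {}
    for (a, b), cnt in pair_count.items():
        support = cnt / n
        confidence = cnt / item_count[a]
        if support >= min_support and confidence >= min_confidence:
            rules.setdefault(a, []).append((b, confidence))
    return rules

def prefetch_candidate(rules, primary_service):
    """Return the secondary service most likely to follow the primary query."""
    if primary_service not in rules:
        return None
    return max(rules[primary_service], key=lambda r: r[1])[0]

# Example: after a 'hotel' query, the miner suggests prefetching 'restaurant'
sessions = [{'hotel', 'restaurant'}, {'hotel', 'restaurant', 'atm'}, {'fuel', 'atm'}]
print(prefetch_candidate(mine_rules(sessions), 'hotel'))  # 'restaurant'
```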

Citations

... However, these two strategies do not consider the filtering effect of the lower-level cache, and the classification of content may not be accurate enough. At the same time, these two strategies are difficult to adapt to the highly dynamic nature of the cache [25]. This paper proposes a dynamic cache replacement strategy based on content dynamics, popularity and the importance of nodes in the topology. ...
... Prefetching is generally used to support data caching in mobile environments. The idea behind prefetching, studied in [15,16,19,22,23,26], is to fetch Point-of-Interest (POI) service information that the user might need in the near future from the server and store it in the mobile client's cache. Prefetching attempts to actively fetch content before users actually request it. ...
... This NN query retrieves the hospital nearest to the user's current location. [18,21,48,49,55] observe that when k objects must be retrieved instead of just the nearest one, the queries are called kNN queries. NN queries are also classified as static or dynamic queries. ...
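As a concrete illustration of the NN/kNN distinction described above, a kNN query simply returns the k closest objects rather than one; a hypothetical brute-force version is sketched below (real systems use spatial indexes instead). The data values are invented for the example.

```python
import heapq
import math

def knn(objects, query_point, k):
    """Brute-force kNN query: return the k objects closest to query_point.
    objects: list of (name, (x, y)) tuples."""
    return heapq.nsmallest(k, objects, key=lambda o: math.dist(o[1], query_point))

hospitals = [('A', (1.0, 2.0)), ('B', (4.0, 0.5)), ('C', (0.5, 0.2))]
print(knn(hospitals, (0.0, 0.0), k=1))   # NN query  -> [('C', (0.5, 0.2))]
print(knn(hospitals, (0.0, 0.0), k=2))   # kNN query with k = 2
```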
... In mobile environments, pre-fetching is used to prevent network congestion, delays, and latency problems, as in [9], [15] and [16]. Mobile applications store the predicted data items in their local cache, with a backup copy on the remote server, so that mobile users can access them in the future. ...
... Chockler et al. [8] preferred to set the cache size equal to the total memory allocated for the data cache by the service provider. Ilayaraja et al. [10] set the cache size to 10% of the database size, which was 50 objects. Dong et al. [9] varied the cache size based on the system administrator's experience. ...
... Data caching increases data availability even when there is no access to the location content provider. In mobile environments, the usage of the uplink and downlink channels has to be taken into serious consideration, and the usage of the uplink channel also increases the communication cost [5,6,13,28,29,33]. ...