CACHING AND COMPRESSION IMPROVES EFFICIENCY OF MOBILE CLOUD COMPUTING

Dr. V. M. Thakare
Professor & HOD, Computer Sci. and Engg. Department, SGBAU, Amravati

Dr. S. S. Sherekar
Professor, Computer Department, SGBAU, Amravati

Anil Dudhe
Research Student, SGBAU, Amravati

Abstract: The Internet has become an essential part of human life, as information on almost every topic is available online. Many people today use the Internet for 15-20 hours a day. As a result, network traffic is very high, yet users expect fast access to Internet and web services, particularly on the mobile devices they carry with them to do their work. The challenge is how to minimize server load and reduce network latency. Web caching and cache replacement policies play an important part here. In this paper we investigate effective caching algorithms for improving the efficiency of Internet and web services.
Keywords: web caching, HTTP caching, caching, proxy caching
Introduction
The Internet has become an essential part of human life, as information on almost every topic is available online. Many people today use the Internet for 15-20 hours a day. As a result, network traffic is growing tremendously, while users expect fast access to Internet and web services, particularly on the mobile devices they carry with them to do their work. Users therefore face ever-increasing delays and failures in data delivery. Web caching, also called HTTP caching, is one of the key components that has been investigated to improve overall performance. Web caching is a widely used technique, deployed by ISPs all over the world, to save bandwidth and to improve the response time of user requests. Various web caching techniques and algorithms are available, so the questions are which algorithm is best suited to which kind of service, how to determine what is cached, where it is stored, how long it is preserved, and many more.
Researchers are working to improve Internet performance from several angles; the following are two of the most important:
Compression
Caching
In this paper we focus on caching. Web caching provides several benefits. When a request is sent for a particular piece of data, the request has to travel between several nodes before the response arrives at the user's terminal. If a cache is kept close to the user, the data has to travel this path only once, because all further requests for it can be served from the cache rather than from the origin server. This automatically minimizes the load on the server and thus reduces the response time for user requests. It also reduces bandwidth usage, since requests are no longer served from the originating server and the long round trip is avoided. When the cache is placed close to the user, the travel time of the signal is also reduced. Therefore, a cache can help reduce bandwidth usage, response time, travel time, server load and latency, especially as perceived by the user.
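To make the idea concrete, the following is a minimal Python sketch (an illustration, not taken from the paper) of a cache placed on the user's side of the network: the first request for a URL travels to the origin server, and every later request is answered locally. The fetch function and the unbounded in-memory store are illustrative assumptions only.

```python
# Illustrative sketch (not from the paper): an in-memory cache kept close to
# the user.  The first request for a URL travels to the origin server; later
# requests are served locally, cutting server load and perceived latency.
import urllib.request

_cache = {}  # url -> response body (illustrative, unbounded store)

def fetch(url: str) -> bytes:
    if url in _cache:                          # cache hit: no network trip
        return _cache[url]
    with urllib.request.urlopen(url) as resp:  # cache miss: go to origin
        body = resp.read()
    _cache[url] = body                         # keep a copy for next time
    return body
```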

WEB CACHING TECHNIQUES
Browser cache: A cache stored on the client machine, in the client's browser, is called a browser cache or client-side cache. Whenever the browser issues a request, it first searches its own cache; if the object is available it is served from the cache to save time, otherwise it is fetched from the origin server.
Proxy cache: If the user has configured a proxy server and the requested data is already cached, the user's request is served by the proxy cache. If the requested data is not available in the cache, the proxy fetches it from another cache or from the origin server itself. A proxy has certain drawbacks:
If the proxy fails, all client browsers using it must be reconfigured.
A single proxy cache server does not scale when user traffic increases.

To deal with these issues, either a proxy with higher performance has to be used, or traffic has to be distributed across several proxies.

Transparent proxy cache: A transparent cache sits between the user and the web server and provides multiple services: caching, redirection, authentication, automatic selection of popular content, and so on. It is most beneficial when a large group of people is expected to open the same page, such as the homepage of a popular newspaper or an employment-related website.
Reverse proxy cache: It caches static as well as dynamic content and reduces the load on the server. It can also optimize the content by compressing it in order to speed up loading times, as sketched below.
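As an illustration of the compression step mentioned for reverse proxy caches, the following is a minimal Python sketch (not from the paper) of how a proxy might gzip a response body before forwarding it. The compress_response helper and its header handling are assumptions for illustration; the surrounding proxy framework is omitted.

```python
# Illustrative sketch (not from the paper): a reverse proxy compressing a
# response before sending it to the client.  Header names are standard HTTP.
import gzip

def compress_response(body: bytes, headers: dict) -> tuple[bytes, dict]:
    compressed = gzip.compress(body)       # shrink the payload on the proxy
    if len(compressed) < len(body):        # only worthwhile if it got smaller
        headers = dict(headers)
        headers["Content-Encoding"] = "gzip"
        headers["Content-Length"] = str(len(compressed))
        return compressed, headers
    return body, headers
```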

CACHE CONTROL
HTTP controls caching in three ways: freshness, validation, and invalidation.

Freshness
The freshness of a cached response indicates whether the cached data can still be used without contacting the origin server. It is determined by parameters such as the Expires date, which indicates when the document becomes stale, or the max-age directive, which tells the cache for how many seconds the response remains fresh.
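The following is a small Python sketch (not from the paper) of a freshness check based on the max-age directive and the Expires header described above; the header parsing is deliberately simplified compared with a real HTTP cache.

```python
# Illustrative sketch (not from the paper): a freshness check driven by the
# max-age directive and the Expires header.  Real caches follow RFC 9111 in
# far more detail; this only covers the two parameters named above.
import time
from email.utils import parsedate_to_datetime

def is_fresh(headers: dict, stored_at: float) -> bool:
    for directive in headers.get("Cache-Control", "").split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
            return time.time() - stored_at < max_age   # still inside max-age?
    expires = headers.get("Expires")
    if expires:
        return time.time() < parsedate_to_datetime(expires).timestamp()
    return False   # no freshness information: treat the entry as stale
```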

Validation 
Validation is used to check whether a cached response is still valid after it becomes stale. Validation is implemented using the ETag header (https://en.wikipedia.org/wiki/HTTP_ETag) and the Last-Modified header.
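A hedged Python sketch of validation follows (not from the paper): the cached entry's ETag and Last-Modified values are sent back as If-None-Match and If-Modified-Since, and a 304 Not Modified reply means the stored copy can be reused. The revalidate helper is a hypothetical name.

```python
# Illustrative sketch (not from the paper): revalidating a stale entry with
# the ETag and Last-Modified validators.  A 304 Not Modified reply means the
# stored copy may be reused; urllib surfaces 304 as an HTTPError.
import urllib.request, urllib.error

def revalidate(url, etag=None, last_modified=None):
    """Return True if the origin says the cached copy is still valid (304)."""
    req = urllib.request.Request(url)
    if etag:
        req.add_header("If-None-Match", etag)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    try:
        urllib.request.urlopen(req)       # 200 OK: a newer version exists
        return False
    except urllib.error.HTTPError as err:
        return err.code == 304            # 304: cached copy still valid
```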

Invalidation 
Cache invalidation is the process by which entries in the cache are removed or replaced when the underlying content changes. It is generally used to push new content to clients: once the cache has been invalidated, a client requesting the object is delivered a new version.
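As a simple illustration (not from the paper), invalidation can be as little as dropping the stale entry so that the next request misses and is refilled from the origin; the on_content_updated hook below is a hypothetical name for whatever signals that the content has changed.

```python
# Illustrative sketch (not from the paper): invalidation simply drops the
# stale entry, so the next request misses the cache and pulls the new version
# from the origin.  on_content_updated is a hypothetical notification hook.
_cache = {}  # url -> cached response body

def invalidate(url: str) -> None:
    _cache.pop(url, None)        # remove the stale copy, if present

def on_content_updated(url: str) -> None:
    # Called whenever the publisher changes the resource at this URL.
    invalidate(url)              # the next fetch of this URL will refill
```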

CACHING ALGORITHMS
1. Belady's Algorithm: the optimal replacement policy; it evicts the page whose next use lies farthest in the future. It cannot be realized in practice, since future requests are unknown, but it serves as a benchmark for other policies.
2. Least Recently Used (LRU): To identify the page to replace, you need to find the minimum time stamp value in all the registers… The LRU algorithm replaces the page that has not been used for the longest period of time. It is based on the observation that pages that have not been used for a long time will probably remain unused for the longest time and are therefore the ones to be replaced.
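Since LRU is the most widely used of these policies, the following is a minimal Python sketch (not from the paper) of an LRU cache built on collections.OrderedDict; the capacity parameter and the contents of the entries are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): an LRU cache on top of
# collections.OrderedDict.  Every access moves the entry to the "recent" end;
# when the cache is full, the entry at the "old" end is evicted.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None                       # cache miss
        self._store.move_to_end(key)          # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict the least recently used
```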

PERFORMANCE METRICS AND FACTORS
Various metrics and factors affect the decision to select an apt caching policy for an environment. To deliver maximum efficiency, it is helpful to assess the performance of various algorithms based on the factors and metrics relevant to that environment.

3.1 Performance metrics
3.1.1 Hit ratio
Hit ratio is the ratio of objects obtained through the caching policy to the number of requests made [2]. A higher hit ratio indicates a better caching policy. However, it may only be relevant if the objects are homogeneous in size.

3.1.2 Byte hit ratio
Byte hit ratio is the ratio of bytes accessed from the cache to the total bytes accessed [2]. When objects are of heterogeneous sizes, byte hit ratio is the better metric.

3.1.3 Bandwidth utilization
This is an important measure: an algorithm that reduces bandwidth consumption is better [2].

3.1.4 User response time
This is the amount of time a user waits for the system to retrieve the requested object [2]. It is also known as latency.

3.1.5 Cache server CPU and I/O utilization
This covers the fraction of total available CPU cycles or disk I/O consumed, and object retrieval latency [2]. Latency is inversely related to object hit ratio, because a cache hit can be served more swiftly than a request to the origin server. However, optimizing one metric does not imply that another will also be optimized; for example, increasing hit rate does not necessarily minimize latency [2].
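The hit ratio and byte hit ratio defined above can be computed directly from a request log. The following Python sketch (not from the paper) assumes a hypothetical log of (object size in bytes, served-from-cache) pairs.

```python
# Illustrative sketch (not from the paper): computing hit ratio and byte hit
# ratio from a request log of hypothetical (object_size_bytes, from_cache)
# pairs, following the definitions given above.
def hit_ratio(log):
    hits = sum(1 for _, from_cache in log if from_cache)
    return hits / len(log) if log else 0.0

def byte_hit_ratio(log):
    cached = sum(size for size, from_cache in log if from_cache)
    total = sum(size for size, _ in log)
    return cached / total if total else 0.0

# Example: three requests, two answered from the cache.
log = [(1_000, True), (50_000, False), (2_000, True)]
print(hit_ratio(log))       # ~0.67  (2 of 3 requests were hits)
print(byte_hit_ratio(log))  # ~0.057 (3,000 of 53,000 bytes came from cache)
```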
