
Mailing List Archive: Varnish: Misc

Sharing a cache between multiple Varnish servers

 

 



n.j.saunders at gmail

Apr 21, 2012, 1:25 AM

Post #1 of 7
Sharing a cache between multiple Varnish servers

Hi all -

A two part question on cache sharing:

a) I've got 3 web servers, each with a 3.5 GB memory cache. I'd like them to
share a cache but don't want to use the experimental persistent storage
backend - are there any other options?

b) We run a cache warming script to ensure a certain set of URLs is
always cached, but at the moment the script sends requests to all 3 web heads to
ensure cache consistency - I see that Varnish supports PUT operations -
would it be feasible for the cache warmer to request content from web head 1
and make a "PUT" request to servers 2 & 3? I've searched high and low for
documentation on this but can't find anything.

All help greatly appreciated!

Neil


perbu at varnish-software

Apr 21, 2012, 3:36 AM

Post #2 of 7
Re: Sharing a cache between multiple Varnish servers [In reply to]

Hi Neil.


On Sat, Apr 21, 2012 at 10:25 AM, Neil Saunders <n.j.saunders [at] gmail> wrote:

> Hi all -
>
> A two part question on cache sharing:
>
> a) I've got 3 web servers, each with a 3.5 GB memory cache. I'd like them to
> share a cache but don't want to use the experimental persistent storage
> backend - are there any other options?
>

I don't think what you have in mind would work. Varnish requires an
explicit lock on the files it manages. Sharing a cache between Varnish
instances won't ever work.

What I would recommend you do is to hash incoming requests based on the URL so
each time the same URL is hit it is served from the same server. That way
you don't duplicate the content between caches. Varnish can do this, F5s
can do it, HAProxy should be able to do this as well.
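
For illustration, a rough sketch of how this might look with a Varnish 3
hash director doing the balancing (backend names and hosts below are
placeholders, and the snippet is untested):

backend web1 { .host = "web1.example.com"; .port = "80"; }
backend web2 { .host = "web2.example.com"; .port = "80"; }
backend web3 { .host = "web3.example.com"; .port = "80"; }

# The hash director picks a backend based on a hash of the request,
# so a given URL is always fetched from the same cache and content
# isn't duplicated across the caches.
director shard hash {
    { .backend = web1; .weight = 1; }
    { .backend = web2; .weight = 1; }
    { .backend = web3; .weight = 1; }
}

sub vcl_recv {
    set req.backend = shard;
}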

> b) We run a cache warming script to ensure a certain set of URLs is
> always cached, but at the moment the script sends requests to all 3 web heads to
> ensure cache consistency - I see that Varnish supports PUT operations -
> would it be feasible for the cache warmer to request content from web head 1
> and make a "PUT" request to servers 2 & 3? I've searched high and low for
> documentation on this but can't find anything.
>

No. Varnish requires a client to request the data. But my solution above
would take care of that.

--
Per Buer
Phone: +47 21 98 92 61 / Mobile: +47 958 39 117 / Skype: per.buer
*Varnish makes websites fly!*
Whitepapers <http://www.varnish-software.com/whitepapers> |
Video <http://www.youtube.com/watch?v=x7t2Sp174eI> |
Twitter <https://twitter.com/varnishsoftware>


contact at jpluscplusm

Apr 21, 2012, 4:08 AM

Post #3 of 7
Re: Sharing a cache between multiple Varnish servers [In reply to]

On 21 April 2012 09:25, Neil Saunders <n.j.saunders [at] gmail> wrote:
> Hi all -
>
> A two part question on cache sharing:
>
> a) I've got 3 web servers, each with a 3.5 GB memory cache. I'd like them to
> share a cache but don't want to use the experimental persistent storage
> backend - are there any other options?

As Per's said, this isn't possible.

> b) We run a cache warming script to ensure a certain set of URLs is always
> cached, but at the moment the script sends requests to all 3 web heads to ensure
> cache consistency - I see that Varnish supports PUT operations - would it be
> feasible for the cache warmer to request content from web head 1 and make a
> "PUT" request to servers 2 & 3? I've searched high and low for documentation
> on this but can't find anything.

If Per's suggestion of hashing content across the caches doesn't fit
in with what you're trying to do, how about this:

Put some logic in your VCL that looks out for a "X-Cache-Warming:
True" (or whatever) request header, which your warming script will
explicitly set.

If this header is present, then use VCL to get Varnish to switch over
to another cache as its backend such that, instead of just going
cache1->origin, the request instead goes
cache1->cache2->cache3->origin. You'll need this logic on N-1 of your
caches (the last one doesn't need to know it's part of this scheme),
and it should enable you to make 1 request to warm N caches.

You might even be able to abstract this so it works for cache flushes,
too, and not just warming operations.
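
For illustration, a rough Varnish 3 VCL sketch of this header-triggered
chaining as it might look on cache1 (cache2 would point at cache3 instead;
backend names and hosts are placeholders, and the snippet is untested):

backend origin     { .host = "origin.example.com"; .port = "80"; }
backend next_cache { .host = "cache2.example.com"; .port = "80"; }

sub vcl_recv {
    if (req.http.X-Cache-Warming == "True") {
        # Warming request: fetch via the next cache in the chain instead
        # of the origin, and force a fresh fetch so this cache stores its
        # own copy of the object too.
        set req.backend = next_cache;
        set req.hash_always_miss = true;
    } else {
        set req.backend = origin;
    }
}

The warming script would then only need to request each URL once, against
cache1, with the X-Cache-Warming header set.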

HTH,
Jonathan
--
Jonathan Matthews
Oxford, London, UK
http://www.jpluscplusm.com/contact.html



n.j.saunders at gmail

Apr 21, 2012, 6:29 AM

Post #4 of 7
Re: Sharing a cache between multiple Varnish servers [In reply to]

Sent from my iPhone


Excellent suggestions - thank you both.



bedis9 at gmail

Apr 21, 2012, 6:42 AM

Post #5 of 7
Re: Sharing a cache between multiple Varnish servers [In reply to]

>
> What I would recommend you do is to hash incoming requests based on the URL
> so each time the same URL is hit it is served from the same server. That
> way you don't duplicate the content between caches. Varnish can do this,
> F5s can do it, HAProxy should be able to do this as well.
>

Hey,

Actually any "layer 7" load-balancer can do it.
By the way, HAProxy does it even better than F5 ;)

cheers


rainer at ultra-secure

Apr 21, 2012, 7:47 AM

Post #6 of 7
Re: Sharing a cache between multiple Varnish servers [In reply to]

On Sat, 21 Apr 2012 09:25:06 +0100,
Neil Saunders <n.j.saunders [at] gmail> wrote:

> Hi all -
>
> A two part question on cache sharing:
>
> a) I've got 3 web servers, each with a 3.5 GB memory cache. I'd like
> them to share a cache but don't want to use the experimental
> persistent storage backend - are there any other options?


I think Java software like Ehcache (or some advanced derivative of
it) can do that.
But AFAIK, it requires close integration with the app that is cached.
I don't think you can just bolt Ehcache on top of things the way you can
with Varnish.


Rainer



n.j.saunders at gmail

Apr 27, 2012, 9:31 AM

Post #7 of 7
Re: Sharing a cache between multiple Varnish servers [In reply to]


Hi all -

I've tried implementing the web server chaining suggested above but have
run into a dead end. Broad configuration:

On all servers except the "last" in the chain I've defined:

backend next_web_server {
    .host = "webX.domain.com";
    .port = "80";
}

And vcl_recv now looks like this:

sub vcl_recv {
    if (req.http.X-Cache-Warming ~ "true") {
        set req.backend = next_web_server;
        set req.hash_always_miss = true;
        return(lookup);
    }

    # "system" is the normal default backend, defined elsewhere.
    set req.backend = system;
}

And have the following vcl_hash:

sub vcl_hash {
    hash_data(req.url);

    if (req.http.host) {
        hash_data(req.http.host);
    }

    return(hash);
}


My issue is that all "real" requests (i.e. those without the
X-Cache-Warming header, made after cache warming) miss the cache on all
web servers except the last one (the one that actually goes to the real
backend during cache warming) - it's as if the backend name is being
hashed explicitly, but I can't see anything in the documentation that
would indicate what's going on.

Any help appreciated!

Ta,

Neil
