
Mailing List Archive: Varnish: Misc

Varnish implicitly hashing backend

 

 



n.j.saunders at gmail

Apr 30, 2012, 4:49 AM

Varnish implicitly hashing backend

Hi all -

I'm attempting to implement a suggestion provided as a solution to another
question I posted regarding cache warming.

Long story short, I have 6 webservers for which I'm pre-warming 60,000 URLs via
a script. I had previously been sending each request to every web server, but
it was suggested that it would be much quicker, and indeed more elegant, to set
a header (X-Cache-Warming in this case) that, if present, would cause each web
server to use the next web server in the chain as its backend, until the
request reached the last web server, where it would be fetched from the actual
backend. Goal: make a single request on the first web server to warm all 6.

The issue I'm seeing is that, following cache warming, I get cache misses on
actual requests on all web servers except the last in the chain, which would
imply to me that Varnish implicitly hashes the backend name used. A summary of
the VCL used:

On all servers except the "last" in the chain I've defined:

backend system {
    .host = "system1.domain.com";
    .port = "80";
}

backend next_web_server {
    .host = "webX.domain.com";
    .port = "80";
}

And have added the following to vcl_recv for all web servers except the
last:

# On all webservers except the last START
if (req.http.X-Cache-Warming ~ "true") {
    set req.backend = next_web_server;
    set req.hash_always_miss = true;
    return (lookup);
}
# On all webservers except the last END

set req.backend = system;
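
The post doesn't show the VCL on the last web server; based on the description
above, it presumably contains something like the following sketch (the reuse of
the "system" backend name and forcing a miss there are my assumptions):

# Hypothetical vcl_recv fragment for the *last* web server in the chain
# (not shown in the original post): still force a miss during warming so
# this cache is populated too, but fetch from the real backend.
if (req.http.X-Cache-Warming ~ "true") {
    set req.backend = system;
    set req.hash_always_miss = true;
    return (lookup);
}

set req.backend = system;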

And have the following vcl_hash:

sub vcl_hash {
    hash_data(req.url);
    hash_data(req.http.host);
    return (hash);
}
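
For reference, this is essentially the built-in vcl_hash that ships with
Varnish 3.0 (reproduced here from memory, so treat it as approximate); note
that nothing backend-related appears among the hash_data() inputs:

# Built-in vcl_hash from Varnish 3.0's default.vcl (approximate):
# only the URL plus the Host header (or the server IP) form the key.
sub vcl_hash {
    hash_data(req.url);
    if (req.http.host) {
        hash_data(req.http.host);
    } else {
        hash_data(server.ip);
    }
    return (hash);
}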

Any help would be very much appreciated, even if only a "yes this is how
it works, no there's no workaround" :)

Cheers,

Neil


contact at jpluscplusm

Apr 30, 2012, 6:18 AM

Re: Varnish implicitly hashing backend

On 30 April 2012 12:49, Neil Saunders <n.j.saunders [at] gmail> wrote:
> Hi all -
>
> I'm attempting to implement a suggestion provided as a solution to another
> question I posted regarding cache warming.
>
> Long story short, I have 6 webservers for which I'm pre-warming 60,000 URLs
> via a script. I had previously been sending each request to every web server,
> but it was suggested that it would be much quicker, and indeed more elegant,
> to set a header (X-Cache-Warming in this case) that, if present, would cause
> each web server to use the next web server in the chain as its backend, until
> the request reached the last web server, where it would be fetched from the
> actual backend. Goal: make a single request on the first web server to warm
> all 6.
>
> The issue I'm seeing is that, following cache warming, I get cache misses on
> actual requests on all web servers except the last in the chain, which would
> imply to me that Varnish implicitly hashes the backend name used.

I don't have an answer for this, but here's a thought: try explicitly
logging the "req.hash" value as late as possible in the vcl_* chain (I
don't know when/where it's an acceptable variable to query) to see
what it produces.
https://www.varnish-cache.org/docs/3.0/reference/vcl.html#the-hash-director
says it uses just this as the key.

I don't know if it contains an opaque lookup key, or something more
useful. Perhaps comparing a single request's req.hash value between
multiple chained caches in your setup will show something interesting
...
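
A minimal sketch of such logging, assuming the std vmod is loaded (in Varnish 3
the computed hash itself isn't exposed to VCL, so this logs the values fed into
hash_data() instead):

import std;

sub vcl_hash {
    # Log the cache-key inputs so they can be compared across the
    # chained caches; shows up in varnishlog as a VCL_Log record.
    std.log("hash input url=" + req.url + " host=" + req.http.host);
    hash_data(req.url);
    hash_data(req.http.host);
    return (hash);
}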

Jonathan
--
Jonathan Matthews
Oxford, London, UK
http://www.jpluscplusm.com/contact.html

