jared.williams1 at ntlworld
Jun 5, 2008, 5:16 PM
> -----Original Message-----
> From: wikitech-l-bounces [at] lists
> [mailto:wikitech-l-bounces [at] lists] On Behalf Of
> Tim Starling
> Sent: 05 June 2008 17:52
> To: wikitech-l [at] lists
> Subject: Re: [Wikitech-l] Password hash format
> Ilmari Karonen wrote:
> > Tim Starling wrote:
> >> So I've committed a hash format change, so that salted hashes look
> >> like this:
> >> :B:a2a5c3b9:44ebabb085ce78dd20c2d59c51e4080c
> >> That is, a type "B" hash, with salt "a2a5c3b9" and MD5 hash
> >> "44ebabb085ce78dd20c2d59c51e4080c". The way the salt and the
> >> password are mixed together is the same as in the old system, so
> >> you can migrate to this password format simply by prepending :B:
> >> and the user ID.
> >> Unsalted hashes look like this:
> >> :A:d41d8cd98f00b204e9800998ecf8427e
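
For concreteness, a minimal sketch of how those two layouts could be
produced. The exact salt/password mixing isn't restated above, so
md5( $salt . '-' . md5( $password ) ) is assumed here as "the same as
in the old system":

    <?php
    // Sketch only: assumes the old-system mixing is
    // md5( $salt . '-' . md5( $password ) ).
    function hashTypeA( $password ) {
        return ':A:' . md5( $password );
    }

    function hashTypeB( $password, $salt ) {
        return ':B:' . $salt . ':' . md5( $salt . '-' . md5( $password ) );
    }
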
> >> Password hashes will be migrated to the :A: or :B: style on
> >> upgrade, and after that, you'll be able to switch $wgPasswordSalt
> >> on and off at will, and all passwords will continue to work. Before
> >> upgrade, the software will understand the old-style hashes, but it
> >> requires $wgPasswordSalt to be set correctly.
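
That migration step needs no plaintext: assuming the old column held
md5( $userId . '-' . md5( $password ) ), each row can be rewritten in
place, e.g.:

    // Hypothetical one-row migration: the user ID becomes the
    // explicit salt, and the old hash value is reused verbatim.
    $newHash = ':B:' . $userId . ':' . $oldHash;
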
> >> When we eventually migrate from MD5 to Tiger/192 or whatever, we
> >> can introduce a type "C" hash and then convert the old hashes at
> >> our leisure.
> > While we're at it, why not introduce some form of adjustable key
> > strengthening:
> > http://en.wikipedia.org/wiki/Key_strengthening
> > For example, we could store the hashes like this:
> > :C:65000:e2a6fb36:37860b819a5e8e71538370316f06a8db
> > where 65000 is the number of times to iterate the statement $hash =
> > md5($hash . $salt . $password) to compute the hash (starting with
> > $hash = ""). The value for new passwords would be set through a
> > config variable (e.g. $wgPasswordIterations), with old records
> > updated to use the current iteration count on login.
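
That proposal translates to something like this (the loop and the
$wgPasswordIterations name come from the paragraph above; the rest is
illustrative):

    // Stretched type-C hash: iterate md5($hash . $salt . $password),
    // starting from $hash = '', and store the count in the result.
    function hashTypeC( $password, $salt, $iterations ) {
        $hash = '';
        for ( $i = 0; $i < $iterations; $i++ ) {
            $hash = md5( $hash . $salt . $password );
        }
        return ':C:' . $iterations . ':' . $salt . ':' . $hash;
    }
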
> Well, I did consider it back in 2003; the tradeoff, of course,
> is speed.
> Because we're working in PHP, an attacker could do the same
> operation several times faster than we could, using C/C++.
> Serving web pages is meant to be fast, with lots of
> concurrent requests, and there might be a need to do batch
> operations. There's probably an argument for stretching it out
> to a few milliseconds, but with 65000 iterations I get 130ms on
> zwinger, which is probably going a bit too far.
> -- Tim Starling
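
For anyone wanting to reproduce that figure, a rough timing harness
(the salt and password values here are only illustrative):

    $salt = 'e2a6fb36';
    $password = 'example';
    $start = microtime( true );
    $hash = '';
    for ( $i = 0; $i < 65000; $i++ ) {
        $hash = md5( $hash . $salt . $password );
    }
    // Prints the elapsed time for 65000 iterations in milliseconds.
    printf( "%.1f ms\n", ( microtime( true ) - $start ) * 1000 );
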
Quite a few projects seem to be standardising on
It has stretching, and the number of iterations is stored within
the hash, so the strength of the hashes can be upgraded over time.
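
Because the count travels with the hash, the upgrade can happen at
login, the one moment the plaintext is available. A hypothetical
sketch, reusing the hashTypeC() function from above:

    // Hypothetical upgrade-on-login: re-hash with the current
    // iteration count if the stored hash uses fewer iterations.
    list( , $type, $iterations, $salt ) = explode( ':', $storedHash );
    if ( $type === 'C' && (int)$iterations < $wgPasswordIterations ) {
        $newHash = hashTypeC( $password, $salt, $wgPasswordIterations );
        // ... write $newHash back to the user record ...
    }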