alexis at notmyidea
Jun 20, 2012, 10:05 AM
Re: Packaging documentation and packaging.pypi API
On 20/06/2012 17:29, Paul Moore wrote:
>> I wasn't aware of this - I've had a look and my first thought is that
>> the documentation needs completing. At the moment, there's a lot that
>> isn't documented, and we should avoid getting into the same situation
>> as with distutils where people have to use undocumented APIs to get
>> anything done. There are a lot of examples, but not so much formal API
So that's something we definitely want to fix. The code is heavily documented with docstrings, and it was written from the start so that the documentation could be generated automatically with sphinx, so it would make no sense not to do it.
That covers the formal API documentation, which seems easy to hook up to sphinx.
I'll also review all the documentation there to make sure it makes perfect sense.
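For what it's worth, hooking the docstrings into sphinx should be little more than an autodoc directive in the docs; a minimal sketch, assuming the crawler lives in packaging.pypi.simple (the module path is my assumption here):

```rst
.. automodule:: packaging.pypi.simple
   :members:
   :undoc-members:
```

That pulls the existing docstrings straight into the generated pages, so the formal API reference stays in sync with the code.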
> As a specific example, one thing I would like to do is to be able to
> set up a packaging.pypi client object that lets me query and download
> distributions. However, rather than just querying PyPI (the default)
> I'd like to be able to set up a list of locations (PyPI, a local
> server, and maybe some distribution files stored on my PC) and combine
> results from all of them. This differs from the mirror support in that
> I want to combine the lists, not use one as a fallback if the other
> doesn't exist. From the documentation, I can't tell if this is
> possible, or a feature request, or unsupported... (Actually, there's
> not even any documentation saying how the URL(s) in index_url should
> behave, so how exactly do I set up a local repository anyway...?)
That's not possible out of the box with the crawlers as they are currently
defined (in other words, it's not a supported use case), *but* it's
possible to write a class on top of the existing ones which could provide
this kind of merging behaviour. I'm not sure whether we want
that to be a part of packaging.pypi, but it's definitely something
that this API makes possible without too much trouble.
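To give an idea, here is a rough sketch of such a class. It only assumes that each crawler exposes a get_releases(name) method returning release objects with a `version` attribute, as in the packaging.pypi examples; the `MultiIndexCrawler` name and the duplicate handling are mine, not anything that exists in the library:

```python
class MultiIndexCrawler:
    """Query several index crawlers and merge their results.

    Hypothetical sketch: wraps any objects that provide a
    get_releases(name) method (e.g. packaging.pypi crawlers pointed
    at PyPI, a local server, or a file:// directory) and combines
    their release lists instead of falling back from one to the next.
    """

    def __init__(self, crawlers):
        self.crawlers = list(crawlers)

    def get_releases(self, name):
        seen = set()
        releases = []
        for crawler in self.crawlers:
            for release in crawler.get_releases(name):
                # Keep the first occurrence of each version so the
                # lists are merged rather than duplicated.
                if release.version not in seen:
                    seen.add(release.version)
                    releases.append(release)
        return releases
```

With something like that, you would build one crawler per location, wrap them all, and a single get_releases() call would return the combined list.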
> On a similar note, at some point, crawler.get_releases('pywin32')
> needs to work. I believe the issue here is technically with pywin32,
> which uses non-standard version numbers (214) and is hosted externally
> (Sourceforge) but I'd expect that packaging.pypi should be able to
> access anything that's on PyPI, even if other APIs like
> packaging.version can't deal with them.
If this is not working / not following the links that are present in the
cheeseshop, then it should be considered a bug.
> Ideally, these would be simply things I'd raise as issues on
> bugs.python.org. But as things stand, such issues aren't getting
> fixed, and we don't move forward. And without the documentation to
> back up a debate, it's hard to argue "X is a bug, Y is a feature
> request, Z is behaving correctly".
Alright, so this is a genuine documentation issue. I will make it clearer
what packaging.pypi does and doesn't make possible.
Python-Dev mailing list