Message140083

| Author | michael.mulich |
| Recipients | alexis, eric.araujo, michael.mulich, tarek |
| Date | 2011-07-10 15:40:47 |
| SpamBayes Score | 3.5216274e-13 |
| Marked as misclassified | No |
| Message-id | <1310312448.84.0.700662725694.issue12526@psf.upfronthosting.co.za> |
| In-reply-to | |

Content
The issue, as best I can describe it, is in how a release list (packaging.pypi.dist.ReleaseList) looks up releases.
Here is a simple example using a random package on PyPI.
>>> from packaging.pypi.simple import Crawler
>>> crawler = Crawler()
>>> projects = crawler.search_projects('snimpy')
>>> projects
[<Project "snimpy">]
>>> project = projects[0]
>>> [x for x in project]
[]
The results show that project 'snimpy' has no releases, but this is incorrect: the 'snimpy' distribution has five releases.
Even after calling sort_releases and fetch_releases on the project, both of which refer back to the crawler instance (see the project's _index attribute), the project still fails to get the releases.
>>> project.fetch_releases()
[]
>>> project.sort_releases()
>>> [x for x in project]
[]
In order to get the releases, one is forced to use the crawler's API rather than the resulting project's API.
>>> crawler.get_releases(project.name, force_update=True)
<Project "snimpy" versions: 0.5, 0.4, 0.3, 0.2.1, 0.2>
>>> [x for x in project]
[<snimpy 0.5>, <snimpy 0.4>, <snimpy 0.3>, <snimpy 0.2.1>, <snimpy 0.2>]
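Until this is addressed, the extra round-trip can be wrapped in a small helper so callers don't have to remember it. This is just a sketch: fetch_project is my name, the import path is assumed, and it relies on get_releases updating the same project object that the search returned, as the session above suggests.

from packaging.pypi.simple import Crawler

def fetch_project(name):
    """Return a project with its release list forcibly populated.

    Hypothetical workaround helper: search_projects alone leaves the
    release list empty, so we make the get_releases call ourselves.
    """
    crawler = Crawler()
    project = crawler.search_projects(name)[0]
    # Side effect: this repopulates the release list on the project
    # object returned by the search (see the session above).
    crawler.get_releases(project.name, force_update=True)
    return project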
So as far as I can gather, we lack the ability to forcibly update the project (or ReleaseList). I don't have a solution at this time, but we may want to look into adding a force_update argument to the get_release method on the Crawler.
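One possible shape for such a change on the project side, sketched with assumed internals (the _index attribute is real per the above, but the releases attribute and the exact delegation are guesses, not the actual packaging source):

class ReleaseList:
    ...
    def fetch_releases(self, force_update=False):
        # _index is the crawler this list was created from; asking it
        # to force an update should repopulate this release list
        # instead of returning the cached, empty one.
        self._index.get_releases(self.name, force_update=force_update)
        return self.releases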
History

| Date | User | Action | Args |
|---|---|---|---|
| 2011-07-10 15:40:48 | michael.mulich | set | recipients: + michael.mulich, tarek, eric.araujo, alexis |
| 2011-07-10 15:40:48 | michael.mulich | set | messageid: <1310312448.84.0.700662725694.issue12526@psf.upfronthosting.co.za> |
| 2011-07-10 15:40:48 | michael.mulich | link | issue12526 messages |
| 2011-07-10 15:40:47 | michael.mulich | create | |