matplotlib-devel

From: Michael D. <md...@st...> - 2010年04月26日 16:59:29
I'm noticing that SVN r8269 is failing a great number of regression 
tests -- with pretty major things like the number of digits in the 
formatter being different. The buildbot seems to be getting the same 
failures I am, but I don't see any buildbot e-mails since Wednesday. 
Does anyone know the source of these errors? It seems to have to do 
with changes to the formatters.
Mike
From: Eric F. <ef...@ha...> - 2010年04月26日 17:10:53
Michael Droettboom wrote:
> I'm noticing that SVN r8269 is failing a great number of regression 
> tests -- with pretty major things like the number of digits in the 
> formatter being different. The buildbot seems to be getting the same 
> failures I am, but I don't see any buildbot e-mails since Wednesday. 
> Does anyone know the source of these errors? It seems to have to do 
> with changes to the formatters.
I suspect I'm the culprit. I will look into it.
Eric
> 
> Mike
> 
From: Eric F. <ef...@ha...> - 2010年04月26日 18:37:13
Michael Droettboom wrote:
> I'm noticing that SVN r8269 is failing a great number of regression 
> tests -- with pretty major things like the number of digits in the 
> formatter being different. The buildbot seems to be getting the same 
> failures I am, but I don't see any buildbot e-mails since Wednesday. 
> Does anyone know the source of these errors? It seems to have to do 
> with changes to the formatters.
Mike,
What is the easiest way to run a single test? I want to work on one 
failure at a time.
Eric
> 
> Mike
> 
From: Michael D. <md...@st...> - 2010年04月26日 18:40:23
On 04/26/2010 02:37 PM, Eric Firing wrote:
> Michael Droettboom wrote:
>> I'm noticing that SVN r8269 is failing a great number of regression 
>> tests -- with pretty major things like the number of digits in the 
>> formatter being different. The buildbot seems to be getting the same 
>> failures I am, but I don't see any buildbot e-mails since Wednesday. 
>> Does anyone know the source of these errors? It seems to have to do 
>> with changes to the formatters.
>
> Mike,
>
> What is the easiest way to run a single test? I want to work on one 
> failure at a time.
You can provide a dot-separated path to the module and function, e.g. 
(this assumes the test is installed):
nosetests matplotlib.tests.test_simplification:test_clipping
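(If I remember the nose syntax right, dropping the function name runs every 
test in the module instead:)

nosetests matplotlib.tests.test_simplification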
Mike
>
> Eric
>
>
>>
>> Mike
>>
>
From: Eric F. <ef...@ha...> - 2010年04月27日 00:29:03
Michael Droettboom wrote:
> I'm noticing that SVN r8269 is failing a great number of regression 
> tests -- with pretty major things like the number of digits in the 
> formatter being different. The buildbot seems to be getting the same 
> failures I am, but I don't see any buildbot e-mails since Wednesday. 
> Does anyone know the source of these errors? It seems to have to do 
> with changes to the formatters.
Mike,
I have fixed all but one test--in some cases, partly by fixing tests 
and/or the baseline images.
The one exception is figimage. I don't think I have changed anything 
that would affect that in any obvious way, but certainly I could be 
wrong, and I haven't looked into it. The immediate problem is that I 
can't see what the difference is between the baseline and what I am 
generating now. They look identical to my eye, on my screen. The diff 
plot shows up as a black square. The png files are different, differing 
in length by 2 bytes, so something has changed--but what? Given that I 
can't *see* any difference, I am tempted to just change the baseline 
plot so that the test will pass, but maybe that would be a mistake.
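(One way to pin down a difference the eye can't see is to compare the raw 
pixel values directly; a rough sketch, with placeholder file names rather 
than the actual test output paths:)

    import numpy as np
    from PIL import Image

    # Placeholder file names -- substitute the real baseline and result PNGs.
    baseline = np.asarray(Image.open('figimage-expected.png'), dtype=np.int16)
    actual = np.asarray(Image.open('figimage.png'), dtype=np.int16)

    diff = np.abs(baseline - actual)
    print("max per-pixel difference: %d" % diff.max())
    print("number of differing pixels: %d" % (diff != 0).sum())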
Building mpl from a version 2 months ago, I am still seeing the figimage 
failure.
Eric
> 
> Mike
> 
From: Michael D. <md...@st...> - 2010年04月27日 14:17:08
Thanks, Eric. I can confirm the failure on figimage, but with the same 
seemingly minuscule difference.
It looks like Jouni wrote that test... Jouni: could you verify that the 
difference isn't significant? If it's not, we can just create a new 
baseline image.
Mike
Eric Firing wrote:
> Michael Droettboom wrote:
>> I'm noticing that SVN r8269 is failing a great number of regression 
>> tests -- with pretty major things like the number of digits in the 
>> formatter being different. The buildbot seems to be getting the same 
>> failures I am, but I don't see any buildbot e-mails since Wednesday. 
>> Does anyone know the source of these errors? It seems to have to do 
>> with changes to the formatters.
>
> Mike,
>
> I have fixed all but one test--in some cases, partly by fixing tests 
> and/or the baseline images.
>
> The one exception is figimage. I don't think I have changed anything 
> that would affect that in any obvious way, but certainly I could be 
> wrong, and I haven't looked into it. The immediate problem is that I 
> can't see what the difference is between the baseline and what I am 
> generating now. They look identical to my eye, on my screen. The diff 
> plot shows up as a black square. The png files are different, 
> differing in length by 2 bytes, so something has changed--but what? 
> Given that I can't *see* any difference, I am tempted to just change 
> the baseline plot so that the test will pass, but maybe that would be 
> a mistake.
>
> Building mpl from a version 2 months ago, I am still seeing the 
> figimage failure.
>
> Eric
>
>
>>
>> Mike
>>
>
-- 
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
From: Jouni K. S. <jk...@ik...> - 2010年04月27日 17:45:33
Michael Droettboom <md...@st...> writes:
> Thanks, Eric. I can confirm the failure on figimage, but with the same 
> seemingly miniscule difference.
>
> It looks like Jouni wrote that test... Jouni: could you verify that the 
> difference isn't significant? If it's not, we can just create a new 
> baseline image.
I can't replicate the problem: for me "nosetests
matplotlib.tests.test_image:test_figimage" passes (Python 2.6.1,
matplotlib revision 8276, OS X 10.6.3). Could you post the failing
images somewhere?
-- 
Jouni K. Seppänen
http://www.iki.fi/jks
From: Michael D. <md...@st...> - 2010年04月27日 19:18:13
Jouni K. Seppänen wrote:
> The following message is a courtesy copy of an article
> that has been posted to gmane.comp.python.matplotlib.devel as well.
>
> Jouni K. Seppänen <jks...@pu...> writes:
>
> 
>>> It looks like Jouni wrote that test... Jouni: could you verify that the 
>>> difference isn't significant? If it's not, we can just create a new 
>>> baseline image.
>>> 
>> I can't replicate the problem: for me "nosetests
>> matplotlib.tests.test_image:test_figimage" passes (Python 2.6.1,
>> matplotlib revision 8276, OS X 10.6.3). Could you post the failing
>> images somewhere?
>> 
>
> Thanks for the images Michael. If I change all E1 bytes in the current
> baseline image to E0 bytes, the images match perfectly. Sounds like a
> change in the discretization of floating-point values to integer bytes.
>
> The difference is definitely not significant, but as I said, I get the
> old image on my system. I don't object to changing the baseline image,
> but the best solution would be to make the image diff less sensitive.
> Unfortunately I don't have the time to pursue this now.
> 
Thanks for the analysis. I'll look into the image diffing and see what 
can be done.
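(Roughly what I have in mind -- just a sketch, not the actual compare_images 
code -- is to test an RMS pixel difference against a per-test tolerance 
instead of requiring a near-exact match:)

    import numpy as np
    from PIL import Image

    def images_close(expected_png, actual_png, tol):
        # Pass if the root-mean-square pixel difference is within tol.
        expected = np.asarray(Image.open(expected_png), dtype=np.float64)
        actual = np.asarray(Image.open(actual_png), dtype=np.float64)
        rms = np.sqrt(np.mean((expected - actual) ** 2))
        return rms <= tol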
Mike
-- 
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
From: Michael D. <md...@st...> - 2010年04月27日 19:40:20
I made a small change to the test harness so that the tolerance can be 
set on a per-test basis. I increased the tolerance for the figimage 
test so it passes on my machine.
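(Concretely, the image comparison decorator now takes a tolerance keyword; 
the example below is only illustrative -- the baseline image names and the 
tolerance value are not copied from the real test:)

    from matplotlib.testing.decorators import image_comparison

    # Illustrative only: baseline names and tolerance value are made up.
    @image_comparison(baseline_images=['figimage'], tol=1.5e-3)
    def test_figimage():
        # ... build the test figure exactly as before ...
        pass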
Cheers,
Mike
Michael Droettboom wrote:
> Jouni K. Seppänen wrote:
> 
>> The following message is a courtesy copy of an article
>> that has been posted to gmane.comp.python.matplotlib.devel as well.
>>
>> Jouni K. Seppänen <jks...@pu...> writes:
>>
>> 
>> 
>>>> It looks like Jouni wrote that test... Jouni: could you verify that the 
>>>> difference isn't significant? If it's not, we can just create a new 
>>>> baseline image.
>>>> 
>>>> 
>>> I can't replicate the problem: for me "nosetests
>>> matplotlib.tests.test_image:test_figimage" passes (Python 2.6.1,
>>> matplotlib revision 8276, OS X 10.6.3). Could you post the failing
>>> images somewhere?
>>> 
>>> 
>> Thanks for the images Michael. If I change all E1 bytes in the current
>> baseline image to E0 bytes, the images match perfectly. Sounds like a
>> change in the discretization of floating-point values to integer bytes.
>>
>> The difference is definitely not significant, but as I said, I get the
>> old image on my system. I don't object to changing the baseline image,
>> but the best solution would be to make the image diff less sensitive.
>> Unfortunately I don't have the time to pursue this now.
>> 
>> 
> Thanks for the analysis. I'll look into the image diffing and see what 
> can be done.
>
> Mike
>
> 
-- 
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
From: Jouni K. S. <jk...@ik...> - 2010年04月27日 18:55:37
Jouni K. Seppänen <jk...@ik...> writes:
>> It looks like Jouni wrote that test... Jouni: could you verify that the 
>> difference isn't significant? If it's not, we can just create a new 
>> baseline image.
>
> I can't replicate the problem: for me "nosetests
> matplotlib.tests.test_image:test_figimage" passes (Python 2.6.1,
> matplotlib revision 8276, OS X 10.6.3). Could you post the failing
> images somewhere?
Thanks for the images Michael. If I change all E1 bytes in the current
baseline image to E0 bytes, the images match perfectly. Sounds like a
change in the discretization of floating-point values to integer bytes.
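(To illustrate -- a gray level that lands near a rounding boundary can come 
out as either byte depending on how the float is converted; the 0.8804 below 
is a made-up value, not taken from the test image:)

    >>> int(0.8804 * 255)           # truncation gives 224 == 0xE0
    224
    >>> int(round(0.8804 * 255))    # rounding gives 225 == 0xE1
    225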
The difference is definitely not significant, but as I said, I get the
old image on my system. I don't object to changing the baseline image,
but the best solution would be to make the image diff less sensitive.
Unfortunately I don't have the time to pursue this now.
-- 
Jouni K. Seppänen
http://www.iki.fi/jks