On 03/26/2011 11:12 AM, Jouni K. Seppänen wrote:
> The test lib/matplotlib/tests/test_axes.py:test_polar_annotations is
> failing on master. It passes in 0a9e86a but fails in the next commit
> 8506c33 "Merge branch 'colorcycle' of git://github.com/faucon/matplotlib".
>
> In the image diff, the text and arrow have shifted a little, but
> that's probably due to ftfont differences and is allowed by the
> comparison. The failing part is the color of the marker, which has
> changed from green to blue.
>
> Is this an intended consequence of the color-cycle change?

Yes, it is consistent with the idea behind the change, which is that
generating a plot element with an explicit color specified should not
advance the color cycle. The marker is the first plot element with no
explicit color, so it should start at the beginning of the cycle, which
is blue, not green.

Will you commit the new "expected" png file? Sorry I did not catch this.

Eric
The test lib/matplotlib/tests/test_axes.py:test_polar_annotations is
failing on master. It passes in 0a9e86a but fails in the next commit
8506c33 "Merge branch 'colorcycle' of git://github.com/faucon/matplotlib".

In the image diff, the text and arrow have shifted a little, but that's
probably due to ftfont differences and is allowed by the comparison. The
failing part is the color of the marker, which has changed from green to
blue.

Is this an intended consequence of the color-cycle change?

--
Jouni K. Seppänen
http://www.iki.fi/jks
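[Archive note: the semantics discussed in this thread — an explicit color does not consume the color cycle — carried forward into later matplotlib releases. A minimal sketch checking it against a recent matplotlib (the default cycle colors have since changed from the old 'b'/'g' palette, so the check compares against whatever the current rcParams cycle starts with):]

```python
import matplotlib
matplotlib.use('Agg')  # headless backend; no display needed
import matplotlib.pyplot as plt

fig, ax = plt.subplots()

# A plot element with an explicit color should NOT advance the cycle...
ax.plot([0, 1], [0, 1], color='red')

# ...so the first auto-colored element still gets the cycle's first color.
auto_line, = ax.plot([0, 1], [1, 0])
first_cycle_color = plt.rcParams['axes.prop_cycle'].by_key()['color'][0]
print(auto_line.get_color() == first_cycle_color)
```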
I found that

    def set_sort_zpos(self, val):
        '''Set the position to use for z-sorting.'''
        self._sort_zpos = val

was missing from class Line3DCollection(LineCollection) (in matplotlib
1.0.1). Now, with the above method added, add_collection3d works as it
should with line segments, and the following will work as expected :-)

    '''
    Purpose: Waterfall plot using LineCollection
    Author: V.P.S. (2011-03-26)
    '''
    from mpl_toolkits.mplot3d import Axes3D
    from matplotlib.collections import LineCollection
    from matplotlib.colors import colorConverter
    import matplotlib.pyplot as plt
    import numpy as np

    np.random.seed(40040157)  # Used to allow repeatable experiments (plots)

    fig = plt.figure()
    ax = fig.gca(projection='3d')

    cc = [colorConverter.to_rgba(c, alpha=0.6)
          for c in ('r', 'g', 'b', 'c', 'y', 'm', 'k')]
    ncc = len(cc)

    nxs = 5  # Num of points to connect (nxs-1 line segments)
    # Generate X values
    xs = np.arange(1, nxs + 1, 1)
    # Create array of Y values (to be updated)
    ys = np.zeros(nxs)

    # Create list of Z values
    nzs = 4  # Num of Z-planes
    zs = [zs + 1 for zs in range(nzs)]

    # Create list of colors (cyclic) for lines
    colorlist = [cc[j % ncc] for j in range(nzs)]

    segs = []
    # Generate line segments in each Z-plane
    for j in zs:
        ys = np.random.rand(nxs)
        segs.append(zip(xs, ys))

    curves = LineCollection(segs, colors=colorlist)
    ax.add_collection3d(curves, zs=zs, zdir='y')

    ax.set_xlabel('X')  # points to right -- X
    ax.set_xlim3d(0, nxs + 1)
    ax.set_ylabel('Y')  # points into screen -- Y
    ax.set_ylim3d(0, nzs + 1)
    ax.set_zlabel('Z')  # points up -- Z
    ax.set_zlim3d(0, 1)

    plt.show()

Have a Good Day
--V
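[Archive note: for users who cannot edit the installed sources, the same fix can be applied at runtime by attaching the method to the class. A hedged sketch — on matplotlib releases after 1.0.1 the method already exists, so the patch below is a no-op:]

```python
from mpl_toolkits.mplot3d.art3d import Line3DCollection

# Add the missing method only if this matplotlib build lacks it
# (1.0.1 did; later releases ship it, making this a no-op).
if not hasattr(Line3DCollection, 'set_sort_zpos'):
    def set_sort_zpos(self, val):
        '''Set the position to use for z-sorting.'''
        self._sort_zpos = val
    Line3DCollection.set_sort_zpos = set_sort_zpos

print(hasattr(Line3DCollection, 'set_sort_zpos'))
```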
Blast from the past! I just ran into this and it comes from the fact
that 'matplotlib.tests.test_text' is not in the default_test_modules
variable inside matplotlib's __init__.py. Here's the necessary diff:

index 82633a5..649e4d8 100644
--- a/lib/matplotlib/__init__.py
+++ b/lib/matplotlib/__init__.py
@@ -968,7 +968,8 @@ default_test_modules = [
     'matplotlib.tests.test_spines',
     'matplotlib.tests.test_image',
     'matplotlib.tests.test_simplification',
-    'matplotlib.tests.test_mathtext'
+    'matplotlib.tests.test_mathtext',
+    'matplotlib.tests.test_text'
 ]

I added a pull request for this two-line change just in case there was a
specific reason to *exclude* test_text from the test modules. For
instance, right now I get one failure in the test suite if I include it.
The failure is in test_text:test_font_styles, but this has been the case
for a while; it's just that these tests weren't running before. Any
developers want to chime in on this?

best,
--
Paul Ivanov
http://pirsquared.org | GPG/PGP key id: 0x0F3E28F7

Michael Droettboom, on 2010-07-27 11:19, wrote:
> Hmm... surprisingly, I am actually able to reproduce this sort of
> behaviour here. I'll look into it further.
>
> Mike
>
> On 07/27/2010 09:49 AM, Michael Droettboom wrote:
> > Of course, we'll prefer to see all of the tests pass...
> >
> > I'm surprised the two modes of running the tests give different
> > results. Are you sure they are running the same python? Does
> >
> >     python `which nosetests` matplotlib.tests
> >
> > give you the same result as
> >
> >     nosetests matplotlib.tests
> >
> > ?
> >
> > There must be some environmental difference between the two to cause
> > the different results.
> >
> > Mike
> >
> > On 07/24/2010 05:09 PM, Adam wrote:
> >> Hello, I have just updated to v1.0.0 and am trying to run the test
> >> suite to make sure everything is ok. There seem to be two different
> >> suites and I am not sure which is correct/current:
> >>
> >>     $ python -c 'import matplotlib; matplotlib.test()'
> >>     [...snipped output...]
> >>     Ran 138 tests in 390.991s
> >>     OK (KNOWNFAIL=2)
> >>
> >> With $ nosetests matplotlib.tests I get:
> >>
> >>     [...snipped output]
> >>     Ran 144 tests in 380.165s
> >>     FAILED (errors=4, failures=1)
> >>
> >> Two of these errors are the known failures from above, and the other
> >> two are in "matplotlib.tests.test_text.test_font_styles":
> >>
> >>     ImageComparisonFailure: images not close:
> >>     /home/adam/result_images/test_text/font_styles.png vs.
> >>     /home/adam/result_images/test_text/expected-font_styles.png (RMS 23.833)
> >>     ImageComparisonFailure: images not close:
> >>     /home/adam/result_images/test_text/font_styles_svg.png vs.
> >>     /home/adam/result_images/test_text/expected-font_styles_svg.png (RMS 12.961)
> >>
> >> The test that fails is:
> >>
> >>     FAIL: matplotlib.tests.test_mlab.test_recarray_csv_roundtrip
> >>     ----------------------------------------------------------------------
> >>     Traceback (most recent call last):
> >>       File "/usr/local/lib/python2.6/dist-packages/nose-0.11.4-py2.6.egg/nose/case.py",
> >>     line 186, in runTest
> >>         self.test(*self.arg)
> >>       File "/usr/local/lib/python2.6/dist-packages/matplotlib/tests/test_mlab.py",
> >>     line 24, in test_recarray_csv_roundtrip
> >>         assert np.allclose( expected['x'], actual['x'] )
> >>     AssertionError
> >>
> >> I am not sure of the importance level of these -- but I wanted to ask
> >> to see if I should do anything or if they can safely be ignored.
> >>
> >> Thanks,
> >> Adam.
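[Archive note: the root cause here generalizes — `matplotlib.test()` at the time ran only the modules listed in `default_test_modules`, while `nosetests matplotlib.tests` discovered every `test_*` module on disk, so a hand-maintained list could silently drift out of sync. A toy sketch of that drift; the `discovered` list below is illustrative, not read from a real install:]

```python
# Hand-maintained list, as in matplotlib's __init__.py at the time.
default_test_modules = [
    'matplotlib.tests.test_spines',
    'matplotlib.tests.test_image',
    'matplotlib.tests.test_simplification',
    'matplotlib.tests.test_mathtext',
]

# What a nose-style walk of the tests directory would actually find.
discovered = ['test_spines', 'test_image', 'test_simplification',
              'test_mathtext', 'test_text']

# Modules nosetests would run but matplotlib.test() would skip.
listed = {name.rsplit('.', 1)[-1] for name in default_test_modules}
missing = sorted(set(discovered) - listed)
print(missing)
```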