Starting from CVS mpl, I tested examples/quadmesh_demo.py and found that it worked only with numarray; Numeric and numpy refused to cast the Float64 X and Y arrays to Float32. So I fixed that in CVS by using the astype method. Now the demo works with numpy and numarray, but segfaults with Numeric. I have not tried to track the problem down; it is very low priority for me. Given the momentum towards numpy, it might not be worth tracking down at all. System is Linux; >>> Numeric.__version__ '24.0b2' Eric
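A minimal sketch of the kind of explicit astype cast described above, using modern numpy names (the array names and values are illustrative, not the actual quadmesh_demo.py change):

    import numpy as np

    # Coordinate arrays analogous to the demo's X and Y; float64 by default.
    X = np.linspace(0.0, 1.0, 5)
    Y = np.linspace(0.0, 1.0, 5)

    # Downcast explicitly instead of relying on implicit casting, which the
    # older Numeric/numpy code paths refused to perform:
    Xf = X.astype(np.float32)
    Yf = Y.astype(np.float32)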
Sounds good to me. I guess when this is done I need an example rc file and a little info on how it should work. On 2/9/06, Darren Dale <dd...@co...> wrote: > On Thursday 09 February 2006 11:31, you wrote: > > I think there should be some global variable that makes it easy to > > scale all the fonts, but how to do that and still give the user fine > > control is challenging. There are 4 or 5 places where I have changed > > the default font size for tickmarks, legend entries, etc. I have some > > set at 18 pt, some at 16, and some at 14. I don't know if there is an > > easy way to do all of this and make it possible to shift everything by > > adjusting one setting. Does it make sense to allow the user to set > > offset values - so that legend font is main-4 and tickmarks are > > main-2? > > This is what I am trying to accomplish, using the existing scaling framework. > Rather than "main + or - X", it multiplies main by one of the following: > > {'xx-small': 0.579, 'x-small': 0.694, 'small': 0.833, > 'medium': 1.0, 'large': 1.200, 'x-large': 1.440, > 'xx-large': 1.728} > > Darren >
On Thursday 09 February 2006 11:31, you wrote: > I think there should be some global variable that makes it easy to > scale all the fonts, but how to do that and still give the user fine > control is challenging. There are 4 or 5 places where I have changed > the default font size for tickmarks, legend entries, etc. I have some > set at 18 pt, some at 16, and some at 14. I don't know if there is an > easy way to do all of this and make it possible to shift everything by > adjusting one setting. Does it make sense to allow the user to set > offset values - so that legend font is main-4 and tickmarks are > main-2? This is what I am trying to accomplish, using the existing scaling framework. Rather than "main + or - X", it multiplies main by one of the following: {'xx-small': 0.579, 'x-small': 0.694, 'small': 0.833, 'medium': 1.0, 'large': 1.200, 'x-large': 1.440, 'xx-large': 1.728} Darren
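A short sketch of the multiplicative scaling Darren describes; the helper function and base size here are hypothetical stand-ins, not matplotlib's font manager:

    # Hypothetical helper: map a named size to points by scaling a base size.
    SCALE = {'xx-small': 0.579, 'x-small': 0.694, 'small': 0.833,
             'medium': 1.0, 'large': 1.200, 'x-large': 1.440,
             'xx-large': 1.728}

    def resolve_fontsize(size, base=12.0):
        """Pass numeric sizes through; scale named sizes by the base."""
        if isinstance(size, (int, float)):
            return float(size)
        return base * SCALE[size]

    # resolve_fontsize('large') -> 14.4; changing base rescales every named size.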
On Thursday 09 February 2006 11:22, you wrote: > >>>>> "Darren" == Darren Dale <dd...@co...> writes: > > Darren> I just want to double check before I commit this. We offer > Darren> a font.size rc setting, and users can modify that size by > Darren> setting fontsize='medium' or 'large', etc. However, > Darren> font.size does not globally set the default font size for > Darren> axis labels, ticklabels, etc.; they remain 12pt by > Darren> default. Should this be changed? If so, the change is > Darren> simple: from this: def __init__(self, size=12.0, > Darren> weight='normal'): to this: def __init__(self, > Darren> size=rcParams['font.size'], weight='normal'): > > If you want to use rc defaults for kwargs, you do not want to use them > in the function definition, because then they will be set at module > load time and the defaults cannot be changed dynamically. Rather, you > want to use this idiom (eg lines.py) > > > def __init__(self, xdata, ydata, > linewidth = None, # all Nones default to rc > ...): > if linewidth is None : linewidth=rcParams['lines.linewidth'] > > Then if the user changes the rc param value, the constructor default > changes too. Ok. So currently a user can put font.size:23.0, or font.size:medium or even "large". The latter makes things very confusing, because who knows what the reference point is? (It's 12, but it's not clear.) I propose the following change: have font.defaultsize demand a point size, and then allow the relative font sizes to scale that size. Also, add a text.fontsize rc parameter to allow such scaling for text (this is what font.size does now, I think). Are these changes ok, or should I just leave everything alone? Darren
>>>>> "Darren" == Darren Dale <dd...@co...> writes: Darren> I just want to double check before I commit this. We offer Darren> a font.size rc setting, and users can modify that size by Darren> setting fontsize='medium' or 'large', etc. However, Darren> font.size does not globally set the default font size, to Darren> axis labels, ticklabels, etc, they remain 12pt as Darren> default. Should this be changed? If so, the change is Darren> simple: from this: def __init__(self, size=12.0, Darren> weight='normal'): to this: def __init__(self, Darren> size=rcParams['font.size'], weight='normal'): If you want to use rc defaults for kwargs, you do not want to use them in the function definition, because then they will be set a module load time and the defaults cannot be changed dynamically. Rather, you want to use this idiom (eg lines.py) def __init__(self, xdata, ydata, linewidth = None, # all Nones default to rc ...): if linewidth is None : linewidth=rcParams['lines.linewidth'] Then if the user changes the rc param value, the constructor default changes too. JDH
I just want to double check before I commit this. We offer a font.size rc setting, and users can modify that size by setting fontsize='medium' or 'large', etc. However, font.size does not globally set the default font size for axis labels, ticklabels, etc.; they remain 12pt by default. Should this be changed? If so, the change is simple: from this: def __init__(self, size=12.0, weight='normal'): to this: def __init__(self, size=rcParams['font.size'], weight='normal'): Darren
John Hunter wrote: > def add_collection(self, coll, autolim=False) > > ...or something to that effect. > If offsets are defined, one could, as Chris suggested, just use the > offsets to compute the limits. This would be "close enough" to get > most of the data into the axes, and the user could then tweak to their > heart's content. It would give something useful to most users. My vote would be to make this the default, rather than nothing. It wouldn't be any more expensive than for the other plotting commands, and having struggled with this, I know it would have taken me a while to find an autolim keyword, even if it had been there. Of course, it would have been a lot simpler if cla() had worked right, which Eric has fixed. Perhaps I'm too focused on my use case, however. Any way you do it is quite usable. -Chris -- Christopher Barker, Ph.D. Oceanographer NOAA/OR&R/HAZMAT (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chr...@no...
John, OK, I will put in a kwarg and support as many cases as reasonably possible. I was part-way there this weekend, but pulled back after running into the problem. Eric John Hunter wrote: >>>>>>"Eric" == Eric Firing <ef...@ha...> writes: > > Eric> Because collections have verts and offsets, which may have > Eric> separate transformations, it is not possible, in general, to > Eric> convert the plotted points to data units (which is what we > Eric> would need) until the viewLim is set; but that defeats the > Eric> purpose, which is to update the dataLim so that it can be > Eric> used to calculate viewLim. > > Hey Eric, thanks for laying out the problem so clearly. The only way > for different transforms (eg the vert and offset transform) to > coordinate is to first convert to the common space (display/pixels) > and then invert back to their respective coordinates. And this isn't > possible if the viewlim aren't set, as you note. It's been a while > since I mucked around with the get_verts method in the collections > module, but I had the feeling that I had left it incomplete and your > post clarifies that. > > Eric> The related bug is that the collection get_verts() methods > Eric> all simply add the offset to the verts and return the sums, > Eric> which are quite meaningless unless transform and transOffset > Eric> are identical; and even if they are identical, the units of > Eric> get_verts() will depend on the transform. > > Eric> Options for adding some automatic dataLim updating option to > Eric> add_collection include: > > Eric> 1) Don't even try. Simply require it to be done manually. > Eric> Make notes in docstrings and/or elsewhere. > > Eric> 3) Do a partial job of it also in the case where transData > > I think we can support auto-limits for collections reasonably > coherently. I'm inclined to leave the default behavior unchanged for > efficiency (no auto-limits) but to support a kwarg that does > auto-scaling > > def add_collection(self, coll, autolim=False) > > ...or something to that effect. > > If autolim is True, even given the problems you pointed out above, we > can support it in many common use cases. > > There are two cases with no ambiguity that we could definitely support > > * if no offsets are specified just use the verts > > * if offsets are specified and the vert and offset transforms are > the same it is straightforward to calculate the limits > > If offsets are defined, one could, as Chris suggested, just use the > offsets to compute the limits. This would be "close enough" to get > most of the data into the axes, and the user could then tweak to their > heart's content. In this case we might want to warn. > > JDH > > > >
>>>>> "Eric" == Eric Firing <ef...@ha...> writes: Eric> Because collections have verts and offsets, which may have Eric> separate transformations, it is not possible, in general, to Eric> convert the plotted points to data units (which is what we Eric> would need) until the viewLim is set; but that defeats the Eric> purpose, which is to update the dataLim so that it can be Eric> used to calculate viewLim. Hey Eric, thanks for laying out the problem so clearly. The only way for different transforms (eg the vert and offset transform) to coordinate is to first convert to the common space (display/pixels) and then invert back to their respective coordinates. And this isn't possible if the viewlim aren't set, as you note. It's been a while since I mucked around with the get_verts method in the collections module, but I had the feeling that I had left it incomplete and your post clarifies that. Eric> The related bug is that the collection get_verts() methods Eric> all simply add the offset to the verts and return the sums, Eric> which are quite meaningless unless transform and transOffset Eric> are identical; and even if they are identical, the units of Eric> get_verts() will depend on the transform. Eric> Options for adding some automatic dataLim updating option to Eric> add_collection include: Eric> 1) Don't even try. Simply require it to be done manually. Eric> Make notes in docstrings and/or elsewhere. Eric> 3) Do a partial job of it also in the case where transData I think we can support auto-limits for collections reasonably coherently. I'm inclined to leave the default behavior unchanged for efficiency (no auto-limits) but to support a kwarg that does auto-scaling def add_collection(self, coll, autolim=False) ...or something to that effect. If autolim is True, even given the problems you pointed out above, we can support it in many common use cases. There are two cases with no ambiguity that we could definitely support * if no offsets are specified just use the verts * if offsets are specified and the vert and offset transforms are the same it is straightforward to calculate the limits If offsets are defined, one could, as Chris suggested, just use the offsets to compute the limits. This would be "close enough" to get most of the data into the axes, and the user could then tweak to their heart's content. In this case we might want to warn. JDH
Eric Firing wrote: >> Which is what we have now, yes? > > Yes, except for the "notes in docstrings and/or elsewhere". Yes. Notes are always good. >>> 2) Do it only if the transforms and offsets are such that it does not >>> depend on the viewLim; otherwise raise an exception. >>> >>> 3) Do a partial job of it also in the case where transData is the >>> transform for either of verts or offsets, and simply ignore the >>> effect of whichever of these is not using transData. >> >> I'd be happy if dataLim was calculated from just the offsets. Is that >> option 3? Adding option 2 would be fine too, but wouldn't help me any. > > Option 3 includes and extends option 2, and does cover your use case. > What it does *not* do is ensure that the result of autoscaling is that > you can see everything you are trying to plot. To some extent this is a > problem even now; a line marker that happens to land in a corner (such > as the origin) doesn't show up very well, because only 1/4 of it is in > the plotted domain. Exactly, and I think that's fine. > To get around this problem, one could put in yet another rc param that > would ensure a margin is added to the viewLim as calculated from the > dataLim. So, for example, plot([0,1], [0,1]) would have axis limits > from -0.2 to 1.2 (or something like that) instead of 0 to 1, and a big > red marker at each point would still be fully within the plotted domain. I think an rc param is a bad idea. There really is no way to do this in the general case, as you have NO idea how big people's markers, etc. are going to be in data units. I also think it's no big deal to have part of a marker chopped off. > A strategy question, of course, is how much of this sort of > sophistication should be built in to the automatic plotting, versus > simply available, as is the case now, with manual commands. This > concern with excessive complexity is the reason I am not enthusiastic > about option 3 (or 2); I agree. I still vote for my option: just calculate the dataLim from the offsets. That's it. It's easy, and it covers the basics. > getting the right result manually is pretty easy > once one knows how to do it. And, chances are, for a final product one > would end up using the manual methods anyway so as to have full control. True, but it would be nice if the defaults gave you dataLims that were at least in the ballpark for your data, rather than arbitrary. It'd be easier for people to see the plot and think: "I need to tweak the axis limits." Thanks for your work on this. -Chris -- Christopher Barker, Ph.D. Oceanographer NOAA/OR&R/HAZMAT (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chr...@no...
Chris, Christopher Barker wrote: > Eric Firing wrote: > >> I fixed the bug in which ax.cla() was not actually causing the >> existing dataLim to be ignored. It did not require an extra flag; it >> only required taking advantage of the ignore=-1 option of >> dataLim.update. The fixed version is in CVS. > > > Wonderful. Thanks. > >> Because collections have verts and offsets, which may have separate >> transformations, it is not possible, in general, to convert the >> plotted points to data units (which is what we would need) until the >> viewLim is set; but that defeats the purpose, which is to update the >> dataLim so that it can be used to calculate viewLim. > > > >> 1) Don't even try. Simply require it to be done manually. Make notes >> in docstrings and/or elsewhere. > > > Which is what we have now, yes? Yes, except for the "notes in docstrings and/or elsewhere". > >> 2) Do it only if the transforms and offsets are such that it does not >> depend on the viewLim; otherwise raise an exception. >> >> 3) Do a partial job of it also in the case where transData is the >> transform for either of verts or offsets, and simply ignore the effect >> of whichever of these is not using transData. > > > I'd be happy if dataLim was calculated from just the offsets. Is that > option 3? Adding option 2 would be fine too, but wouldn't help me any. Option 3 includes and extends option 2, and does cover your use case. What it does *not* do is ensure that the result of autoscaling is that you can see everything you are trying to plot. To some extent this is a problem even now; a line marker that happens to land in a corner (such as the origin) doesn't show up very well, because only 1/4 of it is in the plotted domain. To get around this problem, one could put in yet another rc param that would ensure a margin is added to the viewLim as calculated from the dataLim. So, for example, plot([0,1], [0,1]) would have axis limits from -0.2 to 1.2 (or something like that) instead of 0 to 1, and a big red marker at each point would still be fully within the plotted domain. A strategy question, of course, is how much of this sort of sophistication should be built in to the automatic plotting, versus simply available, as is the case now, with manual commands. This concern with excessive complexity is the reason I am not enthusiastic about option 3 (or 2); getting the right result manually is pretty easy once one knows how to do it. And, chances are, for a final product one would end up using the manual methods anyway so as to have full control. > > That's because in my case, I'm using a LineCollection to create > something kind of like a marker, so the offsets really do define the > dataLim. Perhaps that's not universal enough to include, however. It > would be easy. > > -Chris > > Eric
Eric Firing wrote: > I fixed the bug in which ax.cla() was not actually causing the existing > dataLim to be ignored. It did not require an extra flag; it only > required taking advantage of the ignore=-1 option of dataLim.update. The > fixed version is in CVS. Wonderful. Thanks. > Because collections have verts and offsets, which may have separate > transformations, it is not possible, in general, to convert the plotted > points to data units (which is what we would need) until the viewLim is > set; but that defeats the purpose, which is to update the dataLim so > that it can be used to calculate viewLim. > 1) Don't even try. Simply require it to be done manually. Make notes > in docstrings and/or elsewhere. Which is what we have now, yes? > 2) Do it only if the transforms and offsets are such that it does not > depend on the viewLim; otherwise raise an exception. > > 3) Do a partial job of it also in the case where transData is the > transform for either of verts or offsets, and simply ignore the effect > of whichever of these is not using transData. I'd be happy if dataLim was calculated from just the offsets. Is that option 3? Adding option 2 would be fine too, but wouldn't help me any. That's because in my case, I'm using a LineCollection to create something kind of like a marker, so the offsets really do define the dataLim. Perhaps that's not universal enough to include, however. It would be easy. -Chris -- Christopher Barker, Ph.D. Oceanographer NOAA/OR&R/HAZMAT (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chr...@no...
John and Chris, In reference to my suggestion: > 1) Use a flag instead of the have_data() method to keep track of > whether data limit updating needs to start from scratch. Then > axes.cla() can set the flag, and the update_datalim* functions can > clear it. > > 2) Add an optional flag to add_collection, telling it to call the > collection's get_verts method and use the result to update the data > limits. This would make it easier to use collections in user-level > code, without imposing any performance penalty for functions like > contour that handle the data limit updating in a more efficient way. I fixed the bug in which ax.cla() was not actually causing the existing dataLim to be ignored. It did not require an extra flag; it only required taking advantage of the ignore=-1 option of dataLim.update. The fixed version is in CVS. I was not able to implement the second part of the strategy, however, which was to put an optional flag into axes.add_collection telling it to update the dataLim. The reason is quite fundamental, and I think it reveals some additional bugs. Because collections have verts and offsets, which may have separate transformations, it is not possible, in general, to convert the plotted points to data units (which is what we would need) until the viewLim is set; but that defeats the purpose, which is to update the dataLim so that it can be used to calculate viewLim. The related bug is that the collection get_verts() methods all simply add the offset to the verts and return the sums, which are quite meaningless unless transform and transOffset are identical; and even if they are identical, the units of get_verts() will depend on the transform. Options for adding some automatic dataLim updating option to add_collection include: 1) Don't even try. Simply require it to be done manually. Make notes in docstrings and/or elsewhere. 2) Do it only if the transforms and offsets are such that it does not depend on the viewLim; otherwise raise an exception. 3) Do a partial job of it also in the case where transData is the transform for either of verts or offsets, and simply ignore the effect of whichever of these is not using transData. I am leaning towards (1); it is not clear to me that the effort involved in the other options would be justified by the convenience gained. Regarding the bug in get_verts(), the options include: 1) Add bug warnings to the docstrings. 2) Raise an exception if it is not possible to unambiguously determine the plotted points in data coordinates. 3) Remove the functions entirely on the grounds that they are dangerously misleading. Most likely I have missed something important in all this. Eric
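A toy illustration of the unit mismatch Eric identifies: summing verts and offsets is meaningless when their transforms differ (numbers are made up, and the transforms are reduced to a simple change of units):

    # verts live in points (marker geometry); offsets live in data coordinates.
    vert = (1.0, 1.0)        # marker corner, in points
    offset = (10.0, 20.0)    # marker position, in data units

    naive = (vert[0] + offset[0], vert[1] + offset[1])  # (11.0, 21.0) -- in no
    # consistent unit at all. A meaningful sum requires mapping each operand
    # through its own transform into a common space (display pixels), and the
    # data transform is only known once the view limits are set -- the
    # circularity noted above.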
On Thursday 02 February 2006 15:26, Ed Schofield wrote: > Hi all, > > Matplotlib CVS has a bug in the validate_key() function in __init__.py > that prevents it from correctly handling unrecognized entries in the > matplotlibrc file. The attached patch fixes it by passing the > (previously undefined) variable fname. Thanks, this is fixed in cvs.
Hi all, Matplotlib CVS has a bug in the validate_key() function in __init__.py that prevents it from correctly handling unrecognized entries in the matplotlibrc file. The attached patch fixes it by passing the (previously undefined) variable fname. -- Ed Index: __init__.py =================================================================== RCS file: /cvsroot/matplotlib/matplotlib/lib/matplotlib/__init__.py,v retrieving revision 1.114 diff -r1.114 __init__.py 885c885 < def validate_key(key, val, line, cnt, fail_on_error): --- > def validate_key(key, val, line, cnt, fname, fail_on_error): 949c949 < cval = validate_key(key, val, line, cnt, fail_on_error) --- > cval = validate_key(key, val, line, cnt, fname, fail_on_error) 954c954 < cval = validate_key(key, val, line, cnt, fail_on_error) --- > cval = validate_key(key, val, line, cnt, fname, fail_on_error)
Currently when plotting dates the tick mark locations are only generated once, so when zooming in on a plot takes you between tick marks, they disappear. Attached is a patch file (against the cvs repository as of 10:40am, Feb 02, 2006) that has an AutoDateLocator class that derives from the DateLocator class. It essentially does what AutoLocator does, except for dates. In addition I also provided an AutoDateFormatter class that inherits from Formatter and will change the format of the date label based on the range of dates being spanned in the current view. The AutoDateFormatter class may not be as robust as it could be, but it might be nice to have it around. The attached patch file will set the default behaviour of date plots to use both AutoDateLocator and AutoDateFormatter. --James Evans
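These classes were later adopted into matplotlib.dates; a short usage sketch with the modern API (this illustrates the behavior the patch proposes, not the 2006 patch itself):

    import datetime
    import matplotlib.pyplot as plt
    from matplotlib.dates import AutoDateLocator, AutoDateFormatter

    fig, ax = plt.subplots()
    days = [datetime.date(2006, 2, d) for d in range(1, 11)]
    ax.plot(days, range(10))

    locator = AutoDateLocator()            # tick spacing follows the visible span
    ax.xaxis.set_major_locator(locator)
    ax.xaxis.set_major_formatter(AutoDateFormatter(locator))
    plt.show()                             # ticks and labels now adapt when zooming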
On Thursday 02 February 2006 09:05, Darren Dale wrote: > My current mpl cvs checkout won't build. I guess this is a problem > with my OS (gentoo), although I just built numpy and scipy without > problems. Could anyone suggest why > - L / u s r / l i b 6 4 / t k 8 . 4 / . . / > appears on line 2 instead of > -L/usr/lib64/tk8.4/../ I found the problem. In setupext.py, line 320, tk_lib is formatted as unicode on my machine. I changed this: o.tk_lib = os.path.join((tk.getvar('tk_library')), '../') to this: o.tk_lib = os.path.join(str(tk.getvar('tk_library')), '../') and I can build mpl again. Changes in cvs.
My current mpl cvs checkout won't build. I guess this is a problem with my OS (gentoo), although I just built numpy and scipy without problems. Could anyone suggest why - L / u s r / l i b 6 4 / t k 8 . 4 / . . / appears on line 2 instead of -L/usr/lib64/tk8.4/../ Thanks, Darren x86_64-pc-linux-gnu-gcc: src/_tkagg.cpp x86_64-pc-linux-gnu-g++ -pthread -shared build/temp.linux-x86_64-2.4/src/_tkagg.o build/temp.linux-x86_64-2.4/CXX/cxx_extensions.o build/temp.linux-x86_64-2.4/CXX/cxxsupport.o build/temp.linux-x86_64-2.4/CXX/IndirectPythonInterface.o build/temp.linux-x86_64-2.4/CXX/cxxextensions.o -L/usr/lib64/tcl8.4/../ - L / u s r / l i b 6 4 / t k 8 . 4 / . . / -L/usr/local/lib -L/usr/lib -L/usr/local/lib -L/usr/lib -ltk8.4 -ltcl8.4 -lpng -lz -lstdc++ -lm -lfreetype -lz -lstdc++ -lm -o build/lib.linux-x86_64-2.4/matplotlib/backends/_tkagg.so x86_64-pc-linux-gnu-g++: L: No such file or directory x86_64-pc-linux-gnu-g++: u: No such file or directory x86_64-pc-linux-gnu-g++: s: No such file or directory x86_64-pc-linux-gnu-g++: r: No such file or directory x86_64-pc-linux-gnu-g++: l: No such file or directory x86_64-pc-linux-gnu-g++: i: No such file or directory x86_64-pc-linux-gnu-g++: b: No such file or directory x86_64-pc-linux-gnu-g++: 6: No such file or directory x86_64-pc-linux-gnu-g++: 4: No such file or directory x86_64-pc-linux-gnu-g++: t: No such file or directory x86_64-pc-linux-gnu-g++: k: No such file or directory x86_64-pc-linux-gnu-g++: 8: No such file or directory x86_64-pc-linux-gnu-g++: 4: No such file or directory x86_64-pc-linux-gnu-g++: -E or -x required when input is from standard input x86_64-pc-linux-gnu-g++: L: No such file or directory x86_64-pc-linux-gnu-g++: u: No such file or directory x86_64-pc-linux-gnu-g++: s: No such file or directory x86_64-pc-linux-gnu-g++: r: No such file or directory x86_64-pc-linux-gnu-g++: l: No such file or directory x86_64-pc-linux-gnu-g++: i: No such file or directory x86_64-pc-linux-gnu-g++: b: No such file or directory x86_64-pc-linux-gnu-g++: 6: No such file or directory x86_64-pc-linux-gnu-g++: 4: No such file or directory x86_64-pc-linux-gnu-g++: t: No such file or directory x86_64-pc-linux-gnu-g++: k: No such file or directory x86_64-pc-linux-gnu-g++: 8: No such file or directory x86_64-pc-linux-gnu-g++: 4: No such file or directory x86_64-pc-linux-gnu-g++: -E or -x required when input is from standard input error: Command "x86_64-pc-linux-gnu-g++ -pthread -shared build/temp.linux-x86_64-2.4/src/_tkagg.o build/temp.linux-x86_64-2.4/CXX/cxx_extensions.o build/temp.linux-x86_64-2.4/CXX/cxxsupport.o build/temp.linux-x86_64-2.4/CXX/IndirectPythonInterface.o build/temp.linux-x86_64-2.4/CXX/cxxextensions.o -L/usr/lib64/tcl8.4/../ - L / u s r / l i b 6 4 / t k 8 . 4 / . . / -L/usr/local/lib -L/usr/lib -L/usr/local/lib -L/usr/lib -ltk8.4 -ltcl8.4 -lpng -lz -lstdc++ -lm -lfreetype -lz -lstdc++ -lm -o build/lib.linux-x86_64-2.4/matplotlib/backends/_tkagg.so" failed with exit status 1
John, Thanks very much. I had missed the fact that the ignore argument can take three values, not two, so I will take that into account. As usual, I might not finish the changes until the weekend. Eric John Hunter wrote: >>>>>>"Eric" == Eric Firing <ef...@ha...> writes: > > > Eric> If this strategy sounds reasonable to you, I can go ahead > Eric> and implement it. > > This looks fine; FYI I'll include a post I started in response to your > earlier email but failed to push send; this provides a little context > > To: Eric Firing <ef...@ha...> > Cc: Christopher Barker <Chr...@no...>, > mat...@li... > Subject: Re: [Matplotlib-users] Apparent bug in Data limits with LineCollections > From: John Hunter <jdh...@ac...> > > Eric> I would like to make a genuine bugfix, but I do not yet > Eric> understand all this well enough to do so right now. Maybe > Eric> John will chime in with a good solution. > > Just a comment for now. If you look at ax.add_collection, it does not > update the datalim. This is by design but it should be documented. > The reason I didn't add it was collections were meant to be fast > (they've failed a little bit on that front but they aren't > mind-numbingly slow) and so I left it to the user to set the datalim > manually since this is potentially expensive and the user often knows > the lim for one reason or another. See the finance.py module for > several instances on how to set the data lim with collections. Eg, > > > minx, maxx = (0, len(rangeSegments)) > miny = min([low for low in lows if low !=-1]) > maxy = max([high for high in highs if high != -1]) > > corners = (minx, miny), (maxx, maxy) > ax.update_datalim(corners) > ax.autoscale_view() > > > As for how the datalim handling works, the syntax is > > self.dataLim.update(xys, ignore) > > Note this is different than the ax.update_datalim method, which calls > it. datalim is a bbox which has an ignore state variable (boolean). > > The ignore argument to update datalim can take on three values > > 0: do not ignore the current limits and update them with the xys > 1: ignore the current datalim limits and override with xys > -1: use the datalim ignore state to determine the ignore settings > > This seems a bit complex but arose from experience. Basically a lot > of different objects want to add their data to the datalim. In most > use cases, you want the first object that adds data to ignore the current > limits (which are just default values) and subsequent objects to add > to the datalim taking into account the previous limits. The default > behavior of datalim is to set ignore to 1, and after the first call > with -1 set ignore to 0. Thus everyone can call with -1 and have the > desired default behavior. I hope you are all confused now. > > One can manually set the ignore state var with > > datalim.ignore(1) > > Cheers, > JDH
>>>>> "Eric" == Eric Firing <ef...@ha...> writes: Eric> If this strategy sounds reasonable to you, I can go ahead Eric> and implement it. This looks fine; FYI I'll include a post I started in response to your earlier email but failed to push send; this provides a little context To: Eric Firing <ef...@ha...> Cc: Christopher Barker <Chr...@no...>, mat...@li... Subject: Re: [Matplotlib-users] Apparent bug in Data limits with LineCollections From: John Hunter <jdh...@ac...> Eric> I would like to make a genuine bugfix, but I do not yet Eric> understand all this well enough to do so right now. Maybe Eric> John will chime in with a good solution. Just a comment for now. If you look at ax.add_collection, it does not update the datalim. This is by design but it should be documented. The reason I didn't add it was collecitons were meant to be fast (they've failed a little bit on that front but they aren't mind-numbingly slow) and so I left it to the user to set the datalim manually since this is potentially expensive and the user often knows the lim for one reason or another. See the finance.py module for several instances on how to set the data lim with collections. Eg, minx, maxx = (0, len(rangeSegments)) miny = min([low for low in lows if low !=-1]) maxy = max([high for high in highs if high != -1]) corners = (minx, miny), (maxx, maxy) ax.update_datalim(corners) ax.autoscale_view() As for how the datalim handling works, the syntax is self.dataLim.update(xys, ignore) Note this is different than the ax.update_datalim method, which calls it. datalim is a bbox which has an ignore state variable (boolean). The ignore argument to update datalim can take on three values 0: do not ignore the current limits and update them with the xys 1: ignore the current datalim limits and override with xys -1: use the datalim ignore state to determine the ignore settings This seems a bit complex but arose from experience. Basically a lot of different objects want to add their data to the datalim. In most use cases, you want the first object to add data to ignore the current limits (which are just default values) and subsequent objects to add to the datalim taking into account the previous limits. The default behavior of datalim is to set ignore to 1, and after the first call with -1 set ignore to 0. Thus everyone can call with -1 and have the desired default behavior . I hope you are all confused now. One can manually set the ignore state var with datalim.ignore(1) Cheers, JDH
John, Chris Barker found a problem: plotting in an axes, then calling axes.cla, then adding a collection, then calling axes.plot, results in the original plot's dataLim being used as the starting point for the update. I think the problems are: 1) axes.add_line updates the data limits, but add_collection does not; 2) axes.has_data is simply looking to see whether a line or collection has been added, but is using that as an indication of whether the data limits have been set; this is invalid because add_collection does not set the limits. I suggest two changes to address the problem: 1) Use a flag instead of the have_data() method to keep track of whether data limit updating needs to start from scratch. Then axes.cla() can set the flag, and the update_datalim* functions can clear it. 2) Add an optional flag to add_collection, telling it to call the collection's get_verts method and use the result to update the data limits. This would make it easier to use collections in user-level code, without imposing any performance penalty for functions like contour that handle the data limit updating in a more efficient way. If this strategy sounds reasonable to you, I can go ahead and implement it. Eric
On 31 January 2006, Robert Kern wrote: > Andrew Straw wrote: > > > Now, mostly jokingly, how 'bout a MPL-like VTK interface?? Somehow the > > idea of "borrowing" the MPL interface has playful allure, no? ;) > > *Which* MPL interface? The OO API or the pylab API? Does Prabhu's work with tvtk > strike any chords? E.g. > > http://svn.enthought.com/svn/enthought/trunk/src/lib/enthought/tvtk/tools/mlab.py Also have a look at the screenshot at http://www.enthought.com/enthought/wiki/TVTK (and http://www.enthought.com/enthought/wiki/MayaVi) IMHO: before starting to think about how to incorporate 3D in MPL one should spend some time with the excellent tvtk and MayaVi2 (which is in very good shape already). In case someone wants to try out tvtk and MayaVi2, http://www.scipy.org/ArndBaecker/MayaVi2 might be helpful. Best, Arnd