Ryan May wrote:
> Hi,
>
> I'll continue my current flood of emails to the list. :) I'm finally getting back to my work on Skew-T plots, and I have a semi-working implementation (attached). It runs and, as is, plots up some of the grid, with the x-value grid lines skewed 45 degrees to the right (as they should be). The problem is that as you resize the plot horizontally, some weird things happen. First, some of the lines that start out overlaid end up separating as you expand the plot. The difference is between what is added using ax.plot and what is added using ax.vlines. The former adds Line2D objects while the latter adds a LineCollection which holds path objects. I'm really not sure what's going on there. I'm not done checking it out yet, but I'm curious if anyone has any ideas off the top of their head.
>
> The second issue, which is more pressing, is that when you resize vertically, the axes limits of the plot don't change (good), but unfortunately the lines don't stay connected to their lower y-coordinate in data space (bad). I really need to draw things in a coordinate system that is independent of the data scale but also doesn't depend on the aspect ratio of the axes, so that I can get lines (and data plots) where the x gridlines are always at a 45-degree angle and the lower y-value point stays fixed. My problem right now is that while I can find the lower left corner in pixel space and use that to do the proper adjustments, this changes when you resize. This changing is especially important in the y-direction. What I need is either of:
>
> 1) Axes space adjusted for aspect ratio (and updated with resizes)
> 2) Pixel space relative to some corner of the axes
>
> Or something similar that I don't know about. Any thoughts, or do I just need to come up with some magical combination of transforms that works? You can see what I have so far in my attached file.

Ryan, based only on your description of the problems, and of what you need, I think the answer, or at least part of it, may be in good old quiver. Look at the way the transform is being generated depending on the units chosen, and note that preserving a specified angle on the page is part of it. Also note that the transform has to be regenerated on resize events, so a custom draw method is required. (Mike D. translated my original quiver code to use his transforms framework.)

It seems like there should be an easier-to-use and more general way to do these sorts of things, and maybe there is -- or maybe it can be ginned up. This reminds me of a thread a long time ago regarding adding hooks so that classes could register methods to be executed before their artists are rendered but after things like window and axes sizes have been determined.

Eric
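A minimal sketch of the regenerate-on-resize idea Eric describes, not Ryan's attached implementation: shear display coordinates about the axes' lower-left corner and rebuild the transform whenever the figure is resized. It assumes today's `Affine2D.skew_deg`, which may not have existed at the time.

```python
import matplotlib.pyplot as plt
import matplotlib.transforms as mtransforms

fig, ax = plt.subplots()

# Three "x gridlines" drawn as ordinary lines; the skew is applied afterwards.
lines = [ax.plot([x, x], [0.1, 0.9], color='gray')[0] for x in (0.3, 0.5, 0.7)]

def skew_transform(ax, angle=45):
    # Shear display coordinates about the axes' lower-left corner, so the
    # lower endpoint of each line stays pinned while the top leans `angle`
    # degrees to the right, independent of the data scale.
    x0, y0 = ax.transAxes.transform((0, 0))
    return (ax.transData
            + mtransforms.Affine2D().translate(-x0, -y0)
                                    .skew_deg(angle, 0)
                                    .translate(x0, y0))

def apply_skew(event=None):
    trans = skew_transform(ax)
    for line in lines:
        line.set_transform(trans)

apply_skew()
# The corner position (and hence the transform) changes whenever the window
# is resized, so rebuild it then -- the same idea as quiver's custom draw.
fig.canvas.mpl_connect('resize_event', apply_skew)
plt.show()
```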
Hi,

I'll continue my current flood of emails to the list. :) I'm finally getting back to my work on Skew-T plots, and I have a semi-working implementation (attached). It runs and, as is, plots up some of the grid, with the x-value grid lines skewed 45 degrees to the right (as they should be). The problem is that as you resize the plot horizontally, some weird things happen. First, some of the lines that start out overlaid end up separating as you expand the plot. The difference is between what is added using ax.plot and what is added using ax.vlines. The former adds Line2D objects while the latter adds a LineCollection which holds path objects. I'm really not sure what's going on there. I'm not done checking it out yet, but I'm curious if anyone has any ideas off the top of their head.

The second issue, which is more pressing, is that when you resize vertically, the axes limits of the plot don't change (good), but unfortunately the lines don't stay connected to their lower y-coordinate in data space (bad). I really need to draw things in a coordinate system that is independent of the data scale but also doesn't depend on the aspect ratio of the axes, so that I can get lines (and data plots) where the x gridlines are always at a 45-degree angle and the lower y-value point stays fixed. My problem right now is that while I can find the lower left corner in pixel space and use that to do the proper adjustments, this changes when you resize. This changing is especially important in the y-direction. What I need is either of:

1) Axes space adjusted for aspect ratio (and updated with resizes)
2) Pixel space relative to some corner of the axes

Or something similar that I don't know about. Any thoughts, or do I just need to come up with some magical combination of transforms that works? You can see what I have so far in my attached file.

Thanks in advance,

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
John Hunter wrote: > On Mon, Jul 21, 2008 at 11:35 PM, Ryan May <rm...@gm...> wrote: >> Hi, >> >> Has anyone ever thought about creating a TextCollection class? The >> purpose would be similar to the other collections, to group a bunch of >> text objects with similar properties. This probably couldn't inherit >> from Collection as of now though, since Collection assumes things like >> edgecolor and facecolor. The bigger question to me is, could the >> backends make use of this to any improvement? Or would this simply >> serve as an API to eliminate having to loop yourself (which would pretty >> much make this useless). >> >> My own personal use case is (once again) in meteorology, where we do >> station plots. This involves printing the actual value of observed >> variables relative to the location of the station. This isn't hard to >> do right now (especially since I have offset_copy back, thanks Mike!). >> I just wasn't sure if the batch functionality of a Collection might >> serve some purpose to the users at large. > > I've thought of it many times and it would definitely be useful, eg > for tick labels. Treating every label as a separate instance > definitely slows things down. > Ok, good to know. I'll put it on my todo list then. Do you think this can inherit from Collection at all? It seemed like a lot of the methods in the Collection base class were specific to polygons or other geometry and don't really make sense in the case of text. Anyone else have thoughts on how this should be implemented? Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma
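One possible shape for the class Ryan asks about, sketched under the assumption that it does not inherit from Collection at all: matplotlib has no TextCollection, so the class and its behavior here are entirely hypothetical, and there is no backend-level batching -- the class only shares properties and hides the loop.

```python
from matplotlib.artist import Artist
from matplotlib.text import Text

class TextCollection(Artist):
    """Hypothetical grouping of Text objects sharing properties and a transform."""

    def __init__(self, x, y, texts, **props):
        Artist.__init__(self)
        self._texts = [Text(xi, yi, s, **props)
                       for xi, yi, s in zip(x, y, texts)]

    def set_figure(self, fig):
        Artist.set_figure(self, fig)
        for t in self._texts:
            t.set_figure(fig)

    def set_transform(self, trans):
        Artist.set_transform(self, trans)
        for t in self._texts:
            t.set_transform(trans)

    def draw(self, renderer):
        # No batching in the backend; the API just removes the user-side loop.
        for t in self._texts:
            t.draw(renderer)

# Usage sketch:
#   tc = TextCollection([0, 1, 2], [0, 1, 4], ['a', 'b', 'c'], color='red')
#   tc.set_transform(ax.transData)
#   ax.add_artist(tc)
```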
On Mon, Jul 21, 2008 at 04:42:39PM -0700, Ted Drain wrote: > The public layer would just do conversions and then pass through to the > private layer. Any code in the public layer would have to concern itself > with possible different types (numpy vs lists, units vs floats, color names > vs rgb). Any code written in the private layer could be assured of having > some specific input types and would be much easier to write. Keep in mind that public units need to be tied to font in order to get nice looking results. I commented on this earlier: http://www.nabble.com/font-troubles-td16601826.html#a16601826 I'm not sure all the necessary information to transform to pixels will be available at the time the units are parsed, and we may need to carry them into the private layer. > Of course this would be a lot work and would require refactoring axis and > maybe some of the collections. In theory it should be possible, but only > you guys can decide if it's actually worthwhile or not. One of the things I miss from Tcl/Tk is the ability to use units on values. The link above shows that you can simulate units from outside, but the code is ugly. - Paul
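To make the Tk comparison concrete, here is a toy converter in that spirit; the helper name and the fixed dpi are invented for illustration, it is not an existing matplotlib facility, and, as noted above, font-relative units would need more context than this.

```python
def to_pixels(value, dpi=100.0):
    """Convert strings like '12pt', '0.5in' or '3cm' to pixels at a given dpi."""
    scales = {'pt': dpi / 72.0, 'in': dpi, 'cm': dpi / 2.54,
              'mm': dpi / 25.4, 'px': 1.0}
    if isinstance(value, str):
        for suffix, scale in scales.items():
            if value.endswith(suffix):
                return float(value[:-len(suffix)]) * scale
    return float(value)            # bare numbers are taken to be pixels already

print(to_pixels('12pt'))   # 16.67 pixels at 100 dpi
print(to_pixels('1in'))    # 100.0
```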
On Tue, Jul 22, 2008 at 6:30 AM, Jeff Whitaker <js...@fa...> wrote:
> Chris: I've now added a griddata function to matplotlib.mlab that uses Robert Kern's scikit.delaunay code (which is now included in matplotlib as matplotlib.delaunay). The more bulletproof natgrid code, with the dubious license, is included as a toolkit (mpl_toolkits.natgrid), which griddata is configured to automatically use if installed.

Jeff, thanks for the extra effort to do it this way -- I know it was a pain. But at least now we get:

* commercial users can rely on our license as iron-clad
* griddata will work transparently out of the box for regular users
* we provide a path to the more bulletproof code for those who need it

I have a few comments I'll include below.

* Let's move the try/except natgrid/griddata import into the griddata function itself so users not using griddata will not have to pay for the import, since this will likely be 99% of the mpl users.

* Expose griddata to the pylab interface and add it to the pylab and mlab module docstrings.

* We should provide some help for those who may want to try the natgrid code, eg if you plan on releasing it on the sf site as a toolkit, which I think is best, then we can link to the download page in the docstring. If not, perhaps just provide an svn checkout line for folks.

* Let's report which package is being used at the verbose 'helpful' level, preferably with some version info if it is available. When questions come in on the mailing list later, we will want to know which package griddata is using. You might set a flag on the griddata function along the lines of

    def griddata(blah):
        if not griddata._reported:
            if _use_natgrid:
                verbose.report('using natgrid version blah')
            else:
                verbose.report('using delaunay version blah')
            griddata._reported = True
    griddata._reported = False

* After the next release, let's remember to update the cookbook entry - http://www.scipy.org/Cookbook/Matplotlib/Gridding_irregularly_spaced_data

Anyway, this is a great piece of additional functionality that we've literally been waiting years for, so thanks for taking the extra time to do it so thoroughly. And enterprising developers everywhere, it would still be extremely useful to follow Robert's suggestions to improve the delaunay code along the lines discussed in this thread earlier. Not for the faint of heart, but users for generations to come will thank you.

JDH
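A runnable version of that suggestion, filling in the lazy import: the module names follow this thread, `verbose` is matplotlib's old module-level Verbose instance (since removed), and the interpolation itself is elided.

```python
from matplotlib import verbose   # matplotlib's (pre-3.1) Verbose instance

_use_natgrid = None              # resolved lazily, on the first call

def griddata(x, y, z, xi, yi):
    """Sketch: decide on natgrid vs. delaunay once, report the choice once."""
    global _use_natgrid
    if _use_natgrid is None:
        try:
            import mpl_toolkits.natgrid          # optional toolkit
            _use_natgrid = True
        except ImportError:
            _use_natgrid = False
    if not griddata._reported:
        which = 'natgrid' if _use_natgrid else 'delaunay'
        verbose.report('griddata is using %s' % which, 'helpful')
        griddata._reported = True
    # ... hand off to natgrid or matplotlib.delaunay here ...

griddata._reported = False
```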
On Mon, Jul 21, 2008 at 11:35 PM, Ryan May <rm...@gm...> wrote: > Hi, > > Has anyone ever thought about creating a TextCollection class? The > purpose would be similar to the other collections, to group a bunch of > text objects with similar properties. This probably couldn't inherit > from Collection as of now though, since Collection assumes things like > edgecolor and facecolor. The bigger question to me is, could the > backends make use of this to any improvement? Or would this simply > serve as an API to eliminate having to loop yourself (which would pretty > much make this useless). > > My own personal use case is (once again) in meteorology, where we do > station plots. This involves printing the actual value of observed > variables relative to the location of the station. This isn't hard to > do right now (especially since I have offset_copy back, thanks Mike!). > I just wasn't sure if the batch functionality of a Collection might > serve some purpose to the users at large. I've thought of it many times and it would definitely be useful, eg for tick labels. Treating every label as a separate instance definitely slows things down. JDH
Christopher Barker wrote: > arrg! > > When am I going to learn not to click "send" until after I've read the > entire thread! > > Jeff Whitaker wrote: > >> John: I just contacted NCAR again, and it seems that they have >> relicensed the software under an OSI-based license similar to the >> University of Illinois/NCSA: >> > ... > >> What do you think? If it's OK I say we use the natgrid package in >> matplotlib, since it's more bulletproof than the scikits package (it >> passes Robert's degenerate triangulation test, and has been pounded on >> by user of NCAR graphics since the 1980's). >> > > that would be nice, but while it is a good solution to the re-gridding > problem, it doesn't appear to provide a general purpose delauney > triangulation solution, which is too bad -- it would be nice to have > that in MPL. > > -Chris > > > Chris: I've now added a griddata function to matplotlib.mlab that uses Robert Kern's scikit.delaunay code (which is now included in matplotlib as matplotlib.delaunay). The more bulletproof natgrid code, with the dubious license, is included as a toolkit (mpl_toolkits.natgrid), which griddata is configured to automatically use if installed. -Jeff
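For anyone wanting to try it, a minimal usage sketch of the function as described above (matplotlib.mlab.griddata, since removed in favor of scipy.interpolate.griddata); the data are made up, and natgrid is picked up automatically only if mpl_toolkits.natgrid is installed.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.mlab import griddata

# Scattered samples of a smooth field.
x = np.random.uniform(-2, 2, 200)
y = np.random.uniform(-2, 2, 200)
z = np.exp(-(x**2 + y**2))

# Regular grid to interpolate onto.
xi = np.linspace(-2, 2, 100)
yi = np.linspace(-2, 2, 100)
zi = griddata(x, y, z, xi, yi)

plt.contourf(xi, yi, zi)
plt.plot(x, y, 'k.', ms=2)   # show the original sample locations
plt.colorbar()
plt.show()
```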
Klaus Zimmermann wrote: > Eric Firing schrieb: >> John Hunter wrote: >>> On Mon, Jul 21, 2008 at 3:12 AM, Klaus Zimmermann >>> <kla...@fm...> wrote: >>>> Hello *, >>>> >>>> right now the NonUniformImage class in image.py uses numpy's asarray >>>> method. All similar classes instead use numpy.ma.asarray, thus allowing >>>> for masked images. > [...] >> Masked arrays are handled automatically as needed by the >> ScalarMappable.to_rgba() method. >> >> What we really wanted, and the change I made throughout image.py, is >> to keep masked input as masked, and to ensure that anything else is a >> plain ndarray. This is now committed. > I just checked and I think your changes solve my problem well, obviously > without introducing the potential problems you mentioned above. Thanks! > I was just confused by the different semantics: > AxesImage : does masks, NxM array expects N, M dimensions. > NonUniformImage : didn't do masks, NxM array expects N, M dimensions. > PcolorImage: does masks, NxM array expects N+1, M+1 dimensions. Note that Image also handles PIL arrays, and all three handle NxMx3 and NxMx4 rbg and rgba arrays, in place of color mapping. > > Though I think the mask thingie in the NonUniformImage was simply a bug > and I understand why PcolorImage is the way it is, it still stumped me > at first sight. Also I find it difficult to understand the difference > Pcolor and NonUniform since NonUniform does pseudo colors just as well? > However if you feel this is just a lack of RTFM on my part please feel > free to ignore. NonUniformImage came first, and sat around for a long time without getting an Axes or pylab interface. Exactly what it should do always seemed a bit ambiguous to me; when the "pixels" are not uniformly spaced, where should the boundaries be? The only thing that makes sense to me in this case is to explicitly provide the boundaries, which is what pcolor does. But the original pcolor was slow, so I modified the NonUniformImage extension and python code to handle explicit boundaries to make a faster pcolor. I ended up with pcolorfast, which uses image code if the grid is uniform, PcolorImage code if it is rectangular but not uniform, and quadmesh if it is not even rectangular. In all of this, since I had never used NonUniformImage, and had no idea who was using it for what, I simply left it alone. It could be reimplemented as a wrapper around PcolorImage, thereby reusing rather than duplicating some code, but this is at best low priority. > >> I considered using np.asanyarray(A) but rejected it because it could >> fail for matrix input if any code is expecting iteration or >> single-indexing to return a 1-D array. > Makes sense. But perhaps we should refactor that check into a (module) > function of its own, as to avoid recundancy? I can do that if you want, > or if you prefer a classmethod in AxesImage? > I think the check is so short and simple that for now it is best to leave it as-is; maybe later there will be an attempt to factor some argument handling like this out from mpl as a whole, not just the image classes. > >> We lack examples to test masking of various types of input in the >> various types of image, though. Maybe I will add that later. Klaus, >> if you have any nice, small examples you would like to add to the mpl >> examples directory, that illustrate features or use cases that are not >> exercised in any present examples, please submit them. > Will do. > Good, thank you. Eric > Cheers, > Klaus
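A small usage sketch of the pcolorfast behavior Eric describes, with made-up data: the uniform call takes the image route, the rectangular-but-nonuniform call takes the PcolorImage route, and a fully irregular grid would fall through to a quadmesh.

```python
import numpy as np
import matplotlib.pyplot as plt

C = np.random.rand(20, 30)
fig, (ax1, ax2) = plt.subplots(1, 2)

# Uniform grid: pcolorfast uses the fast image code path.
ax1.pcolorfast(C)

# Rectangular but nonuniform cell boundaries (lengths N+1 and M+1):
# this takes the PcolorImage code path.
x = np.r_[np.linspace(0, 1, 16), np.linspace(1.2, 4, 15)]   # 31 x-boundaries
y = np.linspace(0, 2, 21)                                   # 21 y-boundaries
ax2.pcolorfast(x, y, C)

plt.show()
```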
Hi, Has anyone ever thought about creating a TextCollection class? The purpose would be similar to the other collections, to group a bunch of text objects with similar properties. This probably couldn't inherit from Collection as of now though, since Collection assumes things like edgecolor and facecolor. The bigger question to me is, could the backends make use of this to any improvement? Or would this simply serve as an API to eliminate having to loop yourself (which would pretty much make this useless). My own personal use case is (once again) in meteorology, where we do station plots. This involves printing the actual value of observed variables relative to the location of the station. This isn't hard to do right now (especially since I have offset_copy back, thanks Mike!). I just wasn't sure if the batch functionality of a Collection might serve some purpose to the users at large. Thoughts? Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma
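To make the station-plot use case concrete, here is roughly how it can be done today by looping with offset_copy; the station coordinates and values are invented, and a TextCollection would mainly tidy up and batch this loop.

```python
import matplotlib.pyplot as plt
from matplotlib.transforms import offset_copy

# (longitude, latitude, temperature, dewpoint) -- made-up observations
stations = {'OUN': (-97.5, 35.2, 31, 24),
            'ICT': (-97.4, 37.6, 28, 21)}

fig, ax = plt.subplots()
# Offsets in points, so the text spacing is independent of the data scale.
upper_left = offset_copy(ax.transData, fig=fig, x=-5, y=7, units='points')
lower_left = offset_copy(ax.transData, fig=fig, x=-5, y=-7, units='points')

for name, (lon, lat, temp, dewpt) in stations.items():
    ax.plot(lon, lat, 'ko', markersize=3)
    ax.text(lon, lat, str(temp), transform=upper_left, color='red', ha='right')
    ax.text(lon, lat, str(dewpt), transform=lower_left, color='green', ha='right')

plt.show()
```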
Michael Droettboom wrote: > Ryan May wrote: >> 5) I added an empty circle marker for low wind speeds (vector >> magnitudes). Accomplishing having the unfilled circle while having >> the barbs filled involved a bit of a "elegant hack". Using the set of >> vertices that draws the CirclePolygon, I add an additional copy of >> these vertices, basically drawing the circle back the other way. This >> is basically tricking the drawing algorithm into drawing a really thin >> annulus with a very small gap, but it works perfectly as far as I can >> tell. It's also somewhat consistent with the way the lines on the >> barb are drawn. It is *far* simpler than any other solution, which >> would have required somehow mapping a color to each polygon *before* >> calling >> draw_path_collection(). None of the backends I test had a problem, >> including PS, PDF, and SVG (tested with Evince, Firefox, and Acroread). > Having replied before reading all my e-mail, I see you arrived at a > similar solution to the one I suggested. Great to hear that it worked. I'm just glad to know that it's an accepted hack. :) Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma
Eric Firing wrote: > Eric Firing wrote: >> Ryan May wrote: >>> Hi, >>> >>> As promised, here's a short patch to add get_offsets() and >>> set_offsets() to the Collections() base class. I tried to make it do >>> the right thing with regard to _offsets vs. _uniform_offsets, >>> depending on whether _uniform_offsets is None. I also had tried to >>> make __init__ use set_offsets, but that proved to be too much of a >>> hassle for too little code reuse. >>> >>> Comments? >> >> I have applied this patch along with entries in API_CHANGES and >> CHANGELOG. It looks correct in the context of the present code, and >> since it is completely new functionality it can't hurt. > > Rats! I see I fouled up the commit message. I don't know how I came up > with "Ryan Kraus"! Sorry... I should have just left the commit to you. Don't worry about it. It's not a big deal, even though I am only doing open source for the credit, and the fame, fortune and women.....I still get those right? Ryan "Kraus" May -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma
John, It seems like a slightly different design and some refactoring of the code would help with this (of course that's WAY easier to say than it is to do). I'm thinking of something like this: Public API layer: Very thin (i.e. minimum amount of code). The goal of this layer might be to transform various input types into a fixed set of data. This would include: - units to floats - various input data representations to a standard format. Like converting python lists to numpy arrays. - various input options to a standard format. Like the various marker, color, style input options into some standard representation. Private plotting layer: Fixed interface - doesn't support all the various options (lists, arrays, units, color codes, etc). Does the actual work of plotting. The public layer would just do conversions and then pass through to the private layer. Any code in the public layer would have to concern itself with possible different types (numpy vs lists, units vs floats, color names vs rgb). Any code written in the private layer could be assured of having some specific input types and would be much easier to write. It seems to have happened more than once that when some new chunk of code is added, there are a number of bugs that appear because that code is written with a single data type in mind (no units, no numpy arrays, etc). Having a clear layer where data types can be assured would help w/ that. Of course this would be a lot work and would require refactoring axis and maybe some of the collections. In theory it should be possible, but only you guys can decide if it's actually worthwhile or not. Ted > -----Original Message----- > From: mat...@li... > [mailto:mat...@li...] On Behalf Of > John Hunter > Sent: Monday, July 21, 2008 3:40 PM > To: Michael Droettboom > Cc: matplotlib development list; Eric Firing > Subject: Re: [matplotlib-devel] units support > > On Mon, Jul 21, 2008 at 5:25 PM, Michael Droettboom <md...@st...> > wrote: > > I'll second being confused at times. In the transformation > conversion, it > > was something I didn't know too much about up front, so it's quite > possible > > that I broke some things in that regard. (I know of some already, > but those > > were fixed shortly after things were merged into the trunk around > 0.98.0). > > All that is to say that you may want to cross-reference against > 0.91.x if > > things look fishy. But I think you're right, we need some sort of > "best > > practices" guidance for supporting units, and then hopefully just > track down > > all the places where "such-and-such" should be happening and isn't. > > I'll work on putting together a document for the developer's guide -- > how to support units in a plotting function and artist. Supporting > the units interface everywhere would definitely add to the memory > footprint of mpl and would slow it down, in addition to increasing the > coding burden. On the other hand, they are quite useful in some > cases, most notably for me working with native datetimes and hopefully > down the road a numpy datetime extension. Once I get the guide > written and provide some clarity on the interface, we can decide in > which functions it makes the most sense from a performance usability > perspective and clean those first. Perhaps there are additional > abstractions that can ease the coding burden, eg encapsulating all the > mask/unit/what-have-you logic into a single entity such as a location > array which supports a minimum set of operations. 
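A toy illustration of the split Ted describes, for one plotting call. The function names are invented, and convert_xunits/convert_yunits and to_rgba merely stand in for the conversion work the public layer would do.

```python
import numpy as np
import matplotlib.colors as mcolors
from matplotlib.lines import Line2D

def plot_line(ax, x, y, color='b'):
    """Public layer: normalize units, container types, and color names only."""
    x = np.asarray(ax.convert_xunits(x), dtype=float)   # units -> floats
    y = np.asarray(ax.convert_yunits(y), dtype=float)   # lists -> ndarrays
    rgba = mcolors.to_rgba(color)                        # names -> (r, g, b, a)
    return _plot_line(ax, x, y, rgba)

def _plot_line(ax, x, y, rgba):
    """Private layer: may assume float ndarrays and an RGBA tuple."""
    line = Line2D(x, y, color=rgba)
    ax.add_line(line)
    ax.autoscale_view()
    return line
```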
> JDH
The solution is sufficiently obscure that I decided to just re-introduce offset_copy (r5804). It appears to work as before, and the example works without changes, though let me know if you run into any snags.

Cheers,
Mike

Michael Droettboom wrote:
> I'll update the example. You may also find ScaledTranslation useful for what you're doing. It will allow you to avoid hardcoding the dpi.
>
> http://matplotlib.sourceforge.net/doc/html/devel/transformations.html#matplotlib.transforms.ScaledTranslation
>
> Cheers,
> Mike
>
> Andrew Straw wrote:
>> Ryan May wrote:
>>> Hi,
>>>
>>> I noticed that offset_copy() went away in the transforms rewrite and was replaced with a trans + transforms.Affine2D().translate(x,y). This works fine for x,y in pixels. However, offset_copy would also let you specify x,y in points. How can I get that to work with the new transforms? More importantly, can I do it without knowing the dpi?
>>
>> Also, it looks like examples/pylab_examples/transoffset.py is broken...
>>
>> (I think more and more we need to automatically run the examples and compare against "known good" images in svn as a form of automated testing.)
>>
>> -Andrew
On Mon, Jul 21, 2008 at 5:25 PM, Michael Droettboom <md...@st...> wrote:
> I'll second being confused at times. In the transformation conversion, it was something I didn't know too much about up front, so it's quite possible that I broke some things in that regard. (I know of some already, but those were fixed shortly after things were merged into the trunk around 0.98.0). All that is to say that you may want to cross-reference against 0.91.x if things look fishy. But I think you're right, we need some sort of "best practices" guidance for supporting units, and then hopefully just track down all the places where "such-and-such" should be happening and isn't.

I'll work on putting together a document for the developer's guide -- how to support units in a plotting function and artist. Supporting the units interface everywhere would definitely add to the memory footprint of mpl and would slow it down, in addition to increasing the coding burden. On the other hand, units are quite useful in some cases, most notably for me when working with native datetimes and, hopefully down the road, a numpy datetime extension. Once I get the guide written and provide some clarity on the interface, we can decide in which functions it makes the most sense from a performance and usability perspective and clean those first. Perhaps there are additional abstractions that can ease the coding burden, eg encapsulating all the mask/unit/what-have-you logic into a single entity such as a location array which supports a minimum set of operations.

JDH
I'll second being confused at times. In the transformation conversion, it was something I didn't know too much about up front, so it's quite possible that I broke some things in that regard. (I know of some already, but those were fixed shortly after things were merged into the trunk around 0.98.0). All that is to say that you may want to cross-reference against 0.91.x if things look fishy. But I think you're right, we need some sort of "best practices" guidance for supporting units, and then hopefully just track down all the places where "such-and-such" should be happening and isn't.

Cheers,
Mike

Eric Firing wrote:
> John,
>
> I am still struggling to understand exactly how units support works, and what is needed to make it work everywhere that it should. I see that it works in plot, for example; it is not even necessary to use plot_date. It does not work in scatter, and at first I thought that was because of delete_masked_points, so I changed the latter to make sure that an array of datetime instances would be handled correctly.
>
> Do Collections need something like the recache strategy used by Line2D?
>
> I get very confused as to when and where the convert method needs to be used to change from, e.g., datetime instances to floats. One of the things that happens in scatter is that this conversion is not done somewhere that it is needed, and then a calculation of a pad for the view limits fails.
>
> I get the impression that the conversion is being delayed until the last possible part of the code; it would seem simpler if instead conversion were done very near the front of the code, so that for all subsequent calculations one could rely on having plain numbers to deal with. Of course, that would restrict the use to the higher (user-level) parts of the API, which has some disadvantages.
>
> I know you have thought all this through very carefully, and come up with an implementation that is relatively unobtrusive. Is there a summary anywhere that would make it clear to me how to make scatter, for example, work with dates? Or quiver, or windbarb? (This is motivated not by any immediate personal use case but by a desire to see mpl "just work" with minimal surprises and exceptions.)
>
> Eric
I'll update the example. You may also find ScaledTranslation useful for what you're doing. It will allow you to avoid hardcoding the dpi.

http://matplotlib.sourceforge.net/doc/html/devel/transformations.html#matplotlib.transforms.ScaledTranslation

Cheers,
Mike

Andrew Straw wrote:
> Ryan May wrote:
>> Hi,
>>
>> I noticed that offset_copy() went away in the transforms rewrite and was replaced with a trans + transforms.Affine2D().translate(x,y). This works fine for x,y in pixels. However, offset_copy would also let you specify x,y in points. How can I get that to work with the new transforms? More importantly, can I do it without knowing the dpi?
>
> Also, it looks like examples/pylab_examples/transoffset.py is broken...
>
> (I think more and more we need to automatically run the examples and compare against "known good" images in svn as a form of automated testing.)
>
> -Andrew
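A small sketch of that suggestion: offset a label by points via ScaledTranslation and fig.dpi_scale_trans, so the dpi never appears in user code (the data and offsets are arbitrary).

```python
import matplotlib.pyplot as plt
from matplotlib.transforms import ScaledTranslation

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [1, 4, 9], 'o')

# 5 points right, 10 points up: express the offset in inches (1 pt = 1/72 in)
# and let dpi_scale_trans turn it into pixels at whatever the figure dpi is.
offset = ScaledTranslation(5.0 / 72, 10.0 / 72, fig.dpi_scale_trans)
ax.text(2, 4, 'offset label', transform=ax.transData + offset)
plt.show()
```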
Ryan May wrote: > 5) I added an empty circle marker for low wind speeds (vector > magnitudes). Accomplishing having the unfilled circle while having > the barbs filled involved a bit of a "elegant hack". Using the set of > vertices that draws the CirclePolygon, I add an additional copy of > these vertices, basically drawing the circle back the other way. This > is basically tricking the drawing algorithm into drawing a really thin > annulus with a very small gap, but it works perfectly as far as I can > tell. It's also somewhat consistent with the way the lines on the > barb are drawn. It is *far* simpler than any other solution, which > would have required somehow mapping a color to each polygon *before* > calling > draw_path_collection(). None of the backends I test had a problem, > including PS, PDF, and SVG (tested with Evince, Firefox, and Acroread). Having replied before reading all my e-mail, I see you arrived at a similar solution to the one I suggested. Great to hear that it worked. Cheers, Mike
I don't know if this simplifies things (you're much deeper in the middle of doing what you need to do), but PolyCollection really is a path collection these days (the name is really for historical reasons). And since paths can be compound, you could draw a hollow circle using an inner and an outer circle. See Path.unit_circle for a direct way to get a series of bezier curves to approximate a circle. Of course, if you've found a simpler way in the meantime, go for it!

Cheers,
Mike

Eric Firing wrote:
> Ryan May wrote:
>> Hi,
>>
>> In trying to add a symbol for an empty wind barb, I ran into a problem. Traditionally, a smaller, non-filled circle is used for low wind speeds when doing a barb plot. I can draw the circle easily as a polygon, using CirclePolygon from patches, but unfortunately, the fill color for this polygon ends up being the same as the color of the flags on the barbs. Therefore, as currently implemented, the only way to have unfilled circles is to have unfilled flags.
>>
>> The only technical solution I can think of here is to have separate collections for the circles and the polygons. Unfortunately, I have no idea how to begin to do that within a class that inherits from collections. Another option would be to somehow manually set the fill colors on the circles after the collection is initialized. Anyone have suggestions on how to make this work, or maybe a better technical solution?
>
> Ryan,
>
> Yes, this would require some bigger changes. Instead of inheriting from PolyCollection, it would have to contain two collections, or a collection and a Line2D with markers.
>
> An alternative would be to override the PolyCollection.draw() method with a near-copy, but with logic added right before the renderer call to set the alpha (column 3) of the rgba array to zero for the circles.
>
> A better way might be to modify the original draw method to use self.get_facecolors() instead of self._facecolors; then one could override the get_facecolors method, which would require much less code.
>
> Eric
>
>> Jeff, how aesthetically displeasing do you think it would be to have small (possibly colored) circles to represent the low winds instead of the traditional (somewhat larger) hollow circle? It would make it impossible to do sky cover in a traditional surface map, but maybe Z-ordering could be used to hack around it.
>>
>> Thoughts,
>>
>> Ryan
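To illustrate the compound-path idea, here is a standalone sketch of a hollow circle built from an outer ring plus a reversed inner ring in a single Path. Polygonal circles are used so that flipping the orientation is just a slice; it is an illustration of the technique, not the barbs code.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.path import Path
from matplotlib.patches import PathPatch

def circle_verts(radius, n=64):
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    return radius * np.column_stack([np.cos(theta), np.sin(theta)])

outer = circle_verts(1.0)
inner = circle_verts(0.85)[::-1]          # opposite orientation -> a hole
# One compound path: each ring is its own subpath (MOVETO ... CLOSEPOLY).
verts = np.vstack([outer, outer[:1], inner, inner[:1]])
codes = ([Path.MOVETO] + [Path.LINETO] * (len(outer) - 1) + [Path.CLOSEPOLY]
         + [Path.MOVETO] + [Path.LINETO] * (len(inner) - 1) + [Path.CLOSEPOLY])
hollow_circle = Path(verts, codes)

fig, ax = plt.subplots()
ax.add_patch(PathPatch(hollow_circle, facecolor='C0', edgecolor='black'))
ax.set_xlim(-2, 2)
ax.set_ylim(-2, 2)
ax.set_aspect('equal')
plt.show()
```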
Eric Firing schrieb:
> John Hunter wrote:
>> On Mon, Jul 21, 2008 at 3:12 AM, Klaus Zimmermann <kla...@fm...> wrote:
>>> Hello *,
>>>
>>> right now the NonUniformImage class in image.py uses numpy's asarray method. All similar classes instead use numpy.ma.asarray, thus allowing for masked images.
[...]
> Masked arrays are handled automatically as needed by the ScalarMappable.to_rgba() method.
>
> What we really wanted, and the change I made throughout image.py, is to keep masked input as masked, and to ensure that anything else is a plain ndarray. This is now committed.

I just checked and I think your changes solve my problem well, obviously without introducing the potential problems you mentioned above. Thanks! I was just confused by the different semantics:

AxesImage: does masks; an NxM array expects N, M dimensions.
NonUniformImage: didn't do masks; an NxM array expects N, M dimensions.
PcolorImage: does masks; an NxM array expects N+1, M+1 dimensions.

Though I think the mask thingie in NonUniformImage was simply a bug, and I understand why PcolorImage is the way it is, it still stumped me at first sight. Also, I find it difficult to understand the difference between Pcolor and NonUniform, since NonUniform does pseudocolor just as well? However, if you feel this is just a lack of RTFM on my part, please feel free to ignore.

> I considered using np.asanyarray(A) but rejected it because it could fail for matrix input if any code is expecting iteration or single-indexing to return a 1-D array.

Makes sense. But perhaps we should refactor that check into a (module) function of its own, so as to avoid redundancy? I can do that if you want, or, if you prefer, a classmethod in AxesImage?

> We lack examples to test masking of various types of input in the various types of image, though. Maybe I will add that later. Klaus, if you have any nice, small examples you would like to add to the mpl examples directory, that illustrate features or use cases that are not exercised in any present examples, please submit them.

Will do.

Cheers,
Klaus
Eric Firing wrote: > Ryan May wrote: >> Hi, >> >> As promised, here's a short patch to add get_offsets() and set_offsets() >> to the Collections() base class. I tried to make it do the right thing >> with regard to _offsets vs. _uniform_offsets, depending on whether >> _uniform_offsets is None. I also had tried to make __init__ use >> set_offsets, but that proved to be too much of a hassle for too little >> code reuse. >> >> Comments? > > I have applied this patch along with entries in API_CHANGES and > CHANGELOG. It looks correct in the context of the present code, and > since it is completely new functionality it can't hurt. Rats! I see I fouled up the commit message. I don't know how I came up with "Ryan Kraus"! Sorry... I should have just left the commit to you. Eric
John,

I am still struggling to understand exactly how units support works, and what is needed to make it work everywhere that it should. I see that it works in plot, for example; it is not even necessary to use plot_date. It does not work in scatter, and at first I thought that was because of delete_masked_points, so I changed the latter to make sure that an array of datetime instances would be handled correctly.

Do Collections need something like the recache strategy used by Line2D?

I get very confused as to when and where the convert method needs to be used to change from, e.g., datetime instances to floats. One of the things that happens in scatter is that this conversion is not done somewhere that it is needed, and then a calculation of a pad for the view limits fails.

I get the impression that the conversion is being delayed until the last possible part of the code; it would seem simpler if instead conversion were done very near the front of the code, so that for all subsequent calculations one could rely on having plain numbers to deal with. Of course, that would restrict the use to the higher (user-level) parts of the API, which has some disadvantages.

I know you have thought all this through very carefully, and come up with an implementation that is relatively unobtrusive. Is there a summary anywhere that would make it clear to me how to make scatter, for example, work with dates? Or quiver, or windbarb? (This is motivated not by any immediate personal use case but by a desire to see mpl "just work" with minimal surprises and exceptions.)

Eric
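For context, the units machinery under discussion hangs off matplotlib.units. A minimal, hypothetical converter looks roughly like this; the Feet type and the 0.3048 factor are made up, and conversion only happens when an artist calls convert_xunits/convert_yunits, which is the late conversion Eric describes.

```python
import numpy as np
import matplotlib.units as munits
import matplotlib.pyplot as plt

class Feet:
    """Toy unitful value."""
    def __init__(self, value):
        self.value = float(value)

class FeetConverter(munits.ConversionInterface):
    @staticmethod
    def convert(value, unit, axis):
        # Called late, from Artist.convert_xunits / convert_yunits.
        vals = np.atleast_1d(value)
        if vals.dtype.kind in 'if':          # already plain numbers: pass through
            return value
        return np.array([v.value for v in vals]) * 0.3048

    @staticmethod
    def axisinfo(unit, axis):
        return munits.AxisInfo(label='meters')

    @staticmethod
    def default_units(x, axis):
        return 'meters'

munits.registry[Feet] = FeetConverter()

fig, ax = plt.subplots()
ax.plot([Feet(0), Feet(10), Feet(20)], [1, 2, 3], 'o-')   # converted at draw time
plt.show()
```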
Andrew Straw wrote: >> may be the quickest and most general way to do it. I believe >> ~np.isfinite is both more general and significantly faster than np.isnan. > > Clever, but it won't work as-is. np.isfinite('b') returns a > NotImplementedType, and a default argument to scatter is c='b', which > gets passed to this function. Anyhow, I implemented your idea with a > check for NotImplementedType and some unit tests in r5791. > Andrew, I think there were at least two problems with the delete_masked_points function after you added the isfinite check, one of which was left over from my earlier implementation, and one new one. I ended up rewriting just about everything, including the unit tests and the docstring (which is not in rst--sorry, maybe I can fix that later). Note that the function is now in cbook. I hope the combination of the code, the docstring and the unit tests make the intended functionality clear, but it may all still be a bit confusing. In any case, I think the new version does what is needed for scatter, hexbin, and windbarb, and may turn out to be more generally useful. As a side note, I suspect the check for NotImplementedType is not robust; I can easily imagine isfinite being changed to raise an exception instead. Therefore I did not use that check. Eric
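As a rough illustration of what that cbook function is meant to do, here is a toy stand-in; it is not the real delete_masked_points, which handles more input types and the string/rgba cases discussed above.

```python
import numpy as np

def drop_bad_points(*arrays):
    """Keep only the indices at which every input is unmasked and finite."""
    keep = np.ones(len(arrays[0]), dtype=bool)
    for a in arrays:
        a = np.ma.asarray(a)
        keep &= ~np.ma.getmaskarray(a)
        if a.dtype.kind == 'f':          # isfinite is meaningless for strings etc.
            keep &= np.isfinite(a.filled(np.nan))
    return [np.asarray(a)[keep] for a in arrays]

x = np.ma.array([1., 2., 3., 4.], mask=[False, True, False, False])
y = np.array([10., 20., np.nan, 40.])
print(drop_bad_points(x, y))    # keeps indices 0 and 3 only
```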
Christopher Barker wrote: > Jeff Whitaker wrote: >> I checked >> Shewchuk's web page and unfortunately his code comes with this license: > > ... > > How I wish people would just pick a known Open Source License -- it's > not like there are a shortage of them! Might it be worth a note to > Shewchuk asking him if we can put it in MPL? -- though it doesn't look > promising. Because he (or his institution's technology transfer department) wants to forbid commercial use without paying them money. He doesn't want Triangle to be open source. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
Ryan May wrote: > Hi, > > As promised, here's a short patch to add get_offsets() and set_offsets() > to the Collections() base class. I tried to make it do the right thing > with regard to _offsets vs. _uniform_offsets, depending on whether > _uniform_offsets is None. I also had tried to make __init__ use > set_offsets, but that proved to be too much of a hassle for too little > code reuse. > > Comments? I have applied this patch along with entries in API_CHANGES and CHANGELOG. It looks correct in the context of the present code, and since it is completely new functionality it can't hurt. Overall, however, the handling of offsets is pretty confusing. I suspect this is my fault; it probably results from my wanting to make it easy to use the LineCollection for waterfall plots. I looked to see whether it could be simplified by using an appropriate offsetTransform, but this is at best not straightforward or obvious. Eric
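For reference, a quick usage sketch of the new accessors on a collection returned by scatter; the data are arbitrary, and get_offsets/set_offsets are the methods added by the patch under discussion.

```python
import numpy as np
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
coll = ax.scatter(np.random.rand(10), np.random.rand(10))

offsets = coll.get_offsets()              # Nx2 array of point positions
coll.set_offsets(offsets + [0.25, 0.0])   # shift every point to the right
fig.canvas.draw_idle()
```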
John Hunter wrote:
> On Mon, Jul 21, 2008 at 3:12 AM, Klaus Zimmermann <kla...@fm...> wrote:
>> Hello *,
>>
>> right now the NonUniformImage class in image.py uses numpy's asarray method. All similar classes instead use numpy.ma.asarray, thus allowing for masked images. I think this should be changed as in the attached patch.
>>
>> Otherwise thanks for matplotlib :)
>
> Eric, could you take a look at this? Although the patch is trivial, I just want to make sure that the image extension code will do the right thing if a masked array gets passed into it. I do not see any special handling in _image.pcolor, so I am not sure what happens when a masked array gets passed in.
>
> JDH

John, Klaus,

We were already using masked arrays for some image types even when we did not need to do so (and when it was inappropriate), in which case the mask was ignored. Masked arrays are handled automatically as needed by the ScalarMappable.to_rgba() method.

What we really wanted, and the change I made throughout image.py, is to keep masked input as masked, and to ensure that anything else is a plain ndarray. This is now committed. I considered using np.asanyarray(A) but rejected it because it could fail for matrix input if any code is expecting iteration or single-indexing to return a 1-D array.

We lack examples to test masking of various types of input in the various types of image, though. Maybe I will add that later. Klaus, if you have any nice, small examples you would like to add to the mpl examples directory, that illustrate features or use cases that are not exercised in any present examples, please submit them.

Eric
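In code form, the normalization Eric describes amounts to something like the following; this is a sketch of the policy, not the literal image.py change.

```python
import numpy as np

def normalize_image_array(A):
    """Leave masked arrays masked; coerce everything else to a plain ndarray."""
    if isinstance(A, np.ma.MaskedArray):
        return A            # the mask survives; to_rgba() knows what to do with it
    return np.asarray(A)    # lists, matrices, etc. become plain ndarrays

# np.asanyarray(A) would instead preserve np.matrix, whose rows index as 2-D --
# the failure mode Eric wanted to avoid.
```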