John,

It seems like a slightly different design and some refactoring of the code would help with this (of course that's WAY easier to say than it is to do). I'm thinking of something like this:

Public API layer: Very thin (i.e. a minimum amount of code). The goal of this layer might be to transform various input types into a fixed set of data. This would include:
- units to floats
- various input data representations to a standard format, like converting python lists to numpy arrays
- various input options to a standard format, like the various marker, color, and style input options into some standard representation

Private plotting layer: Fixed interface -- doesn't support all the various options (lists, arrays, units, color codes, etc.). Does the actual work of plotting.

The public layer would just do conversions and then pass through to the private layer. Any code in the public layer would have to concern itself with the possible different types (numpy vs. lists, units vs. floats, color names vs. rgb). Any code written in the private layer could be assured of having specific input types and would be much easier to write. It seems to have happened more than once that when some new chunk of code is added, a number of bugs appear because that code is written with a single data type in mind (no units, no numpy arrays, etc.). Having a clear layer where data types can be assured would help with that.

Of course this would be a lot of work and would require refactoring axis and maybe some of the collections. In theory it should be possible, but only you guys can decide if it's actually worthwhile or not.

Ted

> -----Original Message-----
> From: mat...@li... [mailto:mat...@li...] On Behalf Of John Hunter
> Sent: Monday, July 21, 2008 3:40 PM
> To: Michael Droettboom
> Cc: matplotlib development list; Eric Firing
> Subject: Re: [matplotlib-devel] units support
>
> On Mon, Jul 21, 2008 at 5:25 PM, Michael Droettboom <md...@st...> wrote:
> > I'll second being confused at times. In the transformation conversion, it was something I didn't know too much about up front, so it's quite possible that I broke some things in that regard. (I know of some already, but those were fixed shortly after things were merged into the trunk around 0.98.0). All that is to say that you may want to cross-reference against 0.91.x if things look fishy. But I think you're right, we need some sort of "best practices" guidance for supporting units, and then hopefully just track down all the places where "such-and-such" should be happening and isn't.
>
> I'll work on putting together a document for the developer's guide -- how to support units in a plotting function and artist. Supporting the units interface everywhere would definitely add to the memory footprint of mpl and would slow it down, in addition to increasing the coding burden. On the other hand, they are quite useful in some cases, most notably for me working with native datetimes and hopefully down the road a numpy datetime extension. Once I get the guide written and provide some clarity on the interface, we can decide in which functions it makes the most sense from a performance/usability perspective and clean those first. Perhaps there are additional abstractions that can ease the coding burden, e.g. encapsulating all the mask/unit/what-have-you logic into a single entity such as a location array which supports a minimum set of operations.
> JDH
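As a rough illustration of the split Ted describes -- a thin public layer that only normalizes input (units, sequences, color specs) and a private layer that can assume plain float arrays -- a sketch might look like the following. The names here (to_float_array, to_rgba_tuple, scatter_points, _draw_markers) are invented for the example and are not existing matplotlib API.

import numpy as np

# Hypothetical sketch only: the public layer normalizes heterogeneous
# inputs, the private layer assumes plain float ndarrays and an rgba tuple.

def to_float_array(data, converter=None):
    """Public-layer helper: unitized data or plain sequences -> float ndarray."""
    if converter is not None:            # e.g. a datetime -> float converter
        data = converter(data)
    return np.asarray(data, dtype=float)

def to_rgba_tuple(color):
    """Public-layer helper: normalize a color spec to an (r, g, b, a) tuple."""
    named = {"b": (0.0, 0.0, 1.0, 1.0), "r": (1.0, 0.0, 0.0, 1.0)}
    return named.get(color, tuple(color))

def scatter_points(x, y, color="b", converter=None):
    """Public layer: very thin -- convert, then delegate."""
    return _draw_markers(to_float_array(x, converter),
                         to_float_array(y, converter),
                         to_rgba_tuple(color))

def _draw_markers(x, y, rgba):
    """Private layer: may assume float ndarrays and a single rgba tuple."""
    assert x.dtype == float and y.dtype == float
    return list(zip(x, y, [rgba] * len(x)))   # stand-in for the real drawing code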
The solution is sufficiently obscure that I decided to just re-introduce offset_copy (r5804). It appears to work as before, and the example works without changes, though let me know if you run into any snags.

Cheers, Mike

Michael Droettboom wrote:
> I'll update the example. You may also find ScaledTranslation useful for what you're doing. It will allow you to avoid hardcoding the dpi.
>
> http://matplotlib.sourceforge.net/doc/html/devel/transformations.html#matplotlib.transforms.ScaledTranslation
>
> Cheers, Mike
>
> Andrew Straw wrote:
>> Ryan May wrote:
>>> Hi,
>>> I noticed that offset_copy() went away in the transforms rewrite and was replaced with a trans + transforms.Affine2D().translate(x, y). This works fine for x, y in pixels. However, offset_copy would also let you specify x, y in points. How can I get that to work with the new transforms? More importantly, can I do it without knowing the dpi?
>>
>> Also, it looks like examples/pylab_examples/transoffset.py is broken...
>>
>> (I think more and more we need to automatically run the examples and compare against "known good" images in svn as a form of automated testing.)
>>
>> -Andrew
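For anyone reading along, a minimal usage sketch of the re-introduced helper, assuming the r5804 offset_copy keeps the pre-rewrite signature (trans, fig, x, y, units):

import matplotlib.pyplot as plt
from matplotlib.transforms import offset_copy

# Offset a text label by (6, 6) points from its data-space anchor; the
# points-to-pixels scaling is handled internally, so no dpi appears here.
fig = plt.figure()
ax = fig.add_subplot(111)
ax.plot([0, 1], [0, 1], 'o')
trans = offset_copy(ax.transData, fig=fig, x=6, y=6, units='points')
ax.text(0.5, 0.5, 'shifted label', transform=trans)
plt.show()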
On Mon, Jul 21, 2008 at 5:25 PM, Michael Droettboom <md...@st...> wrote:
> I'll second being confused at times. In the transformation conversion, it was something I didn't know too much about up front, so it's quite possible that I broke some things in that regard. (I know of some already, but those were fixed shortly after things were merged into the trunk around 0.98.0). All that is to say that you may want to cross-reference against 0.91.x if things look fishy. But I think you're right, we need some sort of "best practices" guidance for supporting units, and then hopefully just track down all the places where "such-and-such" should be happening and isn't.

I'll work on putting together a document for the developer's guide -- how to support units in a plotting function and artist. Supporting the units interface everywhere would definitely add to the memory footprint of mpl and would slow it down, in addition to increasing the coding burden. On the other hand, units are quite useful in some cases, most notably for me when working with native datetimes and, hopefully down the road, a numpy datetime extension. Once I get the guide written and provide some clarity on the interface, we can decide in which functions it makes the most sense from a performance/usability perspective and clean those first. Perhaps there are additional abstractions that can ease the coding burden, e.g. encapsulating all the mask/unit/what-have-you logic into a single entity such as a location array which supports a minimum set of operations.

JDH
I'll second being confused at times. In the transformation conversion, it was something I didn't know too much about up front, so it's quite possible that I broke some things in that regard. (I know of some already, but those were fixed shortly after things were merged into the trunk around 0.98.0). All that is to say that you may want to cross-reference against 0.91.x if things look fishy. But I think you're right, we need some sort of "best practices" guidance for supporting units, and then hopefully just track down all the places where "such-and-such" should be happening and isn't.

Cheers, Mike

Eric Firing wrote:
> John,
>
> I am still struggling to understand exactly how units support works, and what is needed to make it work everywhere that it should. I see that it works in plot, for example; it is not even necessary to use plot_date. It does not work in scatter, and at first I thought that was because of delete_masked_points, so I changed the latter to make sure that an array of datetime instances would be handled correctly.
>
> Do Collections need something like the recache strategy used by Line2D?
>
> I get very confused as to when and where the convert method needs to be used to change from, e.g., datetime instances to floats. One of the things that happens in scatter is that this conversion is not done somewhere that it is needed, and then a calculation of a pad for the view limits fails.
>
> I get the impression that the conversion is being delayed until the last possible part of the code; it would seem simpler if instead conversion were done very near the front of the code, so that for all subsequent calculations one could rely on having plain numbers to deal with. Of course, that would restrict the use to the higher (user-level) parts of the API, which has some disadvantages.
>
> I know you have thought all this through very carefully, and come up with an implementation that is relatively unobtrusive. Is there a summary anywhere that would make it clear to me how to make scatter, for example, work with dates? Or quiver, or windbarb? (This is motivated not by any immediate personal use case but by a desire to see mpl "just work" with minimal surprises and exceptions.)
>
> Eric
I'll update the example. You may also find ScaledTranslation useful for what you're doing. It will allow you to avoid hardcoding the dpi.

http://matplotlib.sourceforge.net/doc/html/devel/transformations.html#matplotlib.transforms.ScaledTranslation

Cheers, Mike

Andrew Straw wrote:
> Ryan May wrote:
>> Hi,
>> I noticed that offset_copy() went away in the transforms rewrite and was replaced with a trans + transforms.Affine2D().translate(x, y). This works fine for x, y in pixels. However, offset_copy would also let you specify x, y in points. How can I get that to work with the new transforms? More importantly, can I do it without knowing the dpi?
>
> Also, it looks like examples/pylab_examples/transoffset.py is broken...
>
> (I think more and more we need to automatically run the examples and compare against "known good" images in svn as a form of automated testing.)
>
> -Andrew
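A small sketch of the ScaledTranslation approach Mike points to, which expresses a point offset in terms of the figure's dpi transform so it stays correct whatever dpi the output uses (the numbers and label here are just for illustration):

import matplotlib.pyplot as plt
from matplotlib.transforms import ScaledTranslation

# Shift by (5, 5) points: 1 point = 1/72 inch, and dpi_scale_trans converts
# inches to display units at whatever dpi the figure ends up using.
fig = plt.figure()
ax = fig.add_subplot(111)
ax.plot([0, 1], [0, 1], 'o')
offset = ScaledTranslation(5 / 72.0, 5 / 72.0, fig.dpi_scale_trans)
ax.text(0.5, 0.5, 'offset label', transform=ax.transData + offset)
plt.show()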
Ryan May wrote:
> 5) I added an empty circle marker for low wind speeds (vector magnitudes). Getting an unfilled circle while keeping the barbs filled involved a bit of an "elegant hack". Using the set of vertices that draws the CirclePolygon, I add an additional copy of these vertices, basically drawing the circle back the other way. This essentially tricks the drawing algorithm into drawing a really thin annulus with a very small gap, but it works perfectly as far as I can tell. It's also somewhat consistent with the way the lines on the barb are drawn. It is *far* simpler than any other solution, which would have required somehow mapping a color to each polygon *before* calling draw_path_collection(). None of the backends I tested had a problem, including PS, PDF, and SVG (tested with Evince, Firefox, and Acroread).

Having replied before reading all my e-mail, I see you arrived at a similar solution to the one I suggested. Great to hear that it worked.

Cheers, Mike
I don't know if this simplifies things (you're much deeper in the middle of doing what you need to do), but PolyCollection really is a path collection these days (the name is kept for historical reasons). And since paths can be compound, you could draw a hollow circle using an inner and an outer circle. See Path.unit_circle for a direct way to get a series of Bezier curves that approximate a circle. Of course, if you've found a simpler way in the mean time, go for it!

Cheers, Mike

Eric Firing wrote:
> Ryan May wrote:
>> Hi,
>> In trying to add a symbol for an empty wind barb, I ran into a problem. Traditionally, a smaller, non-filled circle is used for low wind speeds when doing a barb plot. I can draw the circle easily as a polygon, using CirclePolygon from patches, but unfortunately, the fill color for this polygon ends up being the same as the color of the flags on the barbs. Therefore, as currently implemented, the only way to have unfilled circles is to have unfilled flags.
>> The only technical solution I can think of here is to have separate collections for the circles and the polygons. Unfortunately, I have no idea how to begin to do that within a class that inherits from collections. Another option would be to somehow manually set the fill colors on the circles after the collection is initialized. Anyone have suggestions on how to make this work, or maybe a better technical solution?
>
> Ryan,
>
> Yes, this would require some bigger changes. Instead of inheriting from PolyCollection, it would have to contain two collections, or a collection and a Line2D with markers.
>
> An alternative would be to override the PolyCollection.draw() method with a near-copy, but with logic added right before the renderer call to set the alpha (column 3) of the rgba array to zero for the circles.
>
> A better way might be to modify the original draw method to use self.get_facecolors() instead of self._facecolors; then one could override the get_facecolors method, which would require much less code.
>
> Eric
>
>> Jeff, how aesthetically displeasing do you think it would be to have small (possibly colored) circles to represent the low winds instead of the traditional (somewhat larger) hollow circle? It would make it impossible to do sky cover in a traditional surface map, but maybe Z-ordering could be used to hack around it.
>>
>> Thoughts,
>>
>> Ryan
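A hedged sketch of the compound-path idea above: build a single Path containing an outer circle and an inner circle traced in the opposite direction, so the fill rule leaves the center hollow. It uses polygonal circles for simplicity rather than Path.unit_circle's Bezier segments; the helper name circle_verts is made up for the example.

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.path import Path
from matplotlib.patches import PathPatch

def circle_verts(radius, n=64):
    theta = np.linspace(0.0, 2.0 * np.pi, n)
    return np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

outer = circle_verts(1.0)
inner = circle_verts(0.6)[::-1]            # reversed winding punches the hole
codes = np.full(len(outer), Path.LINETO)
codes[0] = Path.MOVETO
ring = Path(np.concatenate([outer, inner]), np.concatenate([codes, codes]))

fig = plt.figure()
ax = fig.add_subplot(111)
ax.add_patch(PathPatch(ring, facecolor='lightblue', edgecolor='black'))
ax.set_xlim(-1.5, 1.5)
ax.set_ylim(-1.5, 1.5)
ax.set_aspect('equal')
plt.show()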
Eric Firing wrote:
> John Hunter wrote:
>> On Mon, Jul 21, 2008 at 3:12 AM, Klaus Zimmermann <kla...@fm...> wrote:
>>> Hello *,
>>> right now the NonUniformImage class in image.py uses numpy's asarray method. All similar classes instead use numpy.ma.asarray, thus allowing for masked images. [...]
>
> Masked arrays are handled automatically as needed by the ScalarMappable.to_rgba() method.
>
> What we really wanted, and the change I made throughout image.py, is to keep masked input as masked, and to ensure that anything else is a plain ndarray. This is now committed.

I just checked and I think your changes solve my problem well, obviously without introducing the potential problems you mentioned above. Thanks! I was just confused by the different semantics:

AxesImage: does masks, an NxM array expects N, M dimensions.
NonUniformImage: didn't do masks, an NxM array expects N, M dimensions.
PcolorImage: does masks, an NxM array expects N+1, M+1 dimensions.

Though I think the mask thingie in NonUniformImage was simply a bug, and I understand why PcolorImage is the way it is, it still stumped me at first sight. Also, I find it difficult to understand the difference between PcolorImage and NonUniformImage, since NonUniformImage does pseudocolor just as well. However, if you feel this is just a lack of RTFM on my part, please feel free to ignore.

> I considered using np.asanyarray(A) but rejected it because it could fail for matrix input if any code is expecting iteration or single-indexing to return a 1-D array.

Makes sense. But perhaps we should refactor that check into a (module) function of its own, so as to avoid redundancy? I can do that if you want, or, if you prefer, as a classmethod in AxesImage?

> We lack examples to test masking of various types of input in the various types of image, though. Maybe I will add that later. Klaus, if you have any nice, small examples you would like to add to the mpl examples directory, that illustrate features or use cases that are not exercised in any present examples, please submit them.

Will do.

Cheers, Klaus
Eric Firing wrote: > Ryan May wrote: >> Hi, >> >> As promised, here's a short patch to add get_offsets() and set_offsets() >> to the Collections() base class. I tried to make it do the right thing >> with regard to _offsets vs. _uniform_offsets, depending on whether >> _uniform_offsets is None. I also had tried to make __init__ use >> set_offsets, but that proved to be too much of a hassle for too little >> code reuse. >> >> Comments? > > I have applied this patch along with entries in API_CHANGES and > CHANGELOG. It looks correct in the context of the present code, and > since it is completely new functionality it can't hurt. Rats! I see I fouled up the commit message. I don't know how I came up with "Ryan Kraus"! Sorry... I should have just left the commit to you. Eric
John,

I am still struggling to understand exactly how units support works, and what is needed to make it work everywhere that it should. I see that it works in plot, for example; it is not even necessary to use plot_date. It does not work in scatter, and at first I thought that was because of delete_masked_points, so I changed the latter to make sure that an array of datetime instances would be handled correctly.

Do Collections need something like the recache strategy used by Line2D?

I get very confused as to when and where the convert method needs to be used to change from, e.g., datetime instances to floats. One of the things that happens in scatter is that this conversion is not done somewhere that it is needed, and then a calculation of a pad for the view limits fails.

I get the impression that the conversion is being delayed until the last possible part of the code; it would seem simpler if instead conversion were done very near the front of the code, so that for all subsequent calculations one could rely on having plain numbers to deal with. Of course, that would restrict the use to the higher (user-level) parts of the API, which has some disadvantages.

I know you have thought all this through very carefully, and come up with an implementation that is relatively unobtrusive. Is there a summary anywhere that would make it clear to me how to make scatter, for example, work with dates? Or quiver, or windbarb? (This is motivated not by any immediate personal use case but by a desire to see mpl "just work" with minimal surprises and exceptions.)

Eric
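As a concrete, if simplified, picture of the "convert near the front" approach Eric describes, the datetime case can be reduced to floats up front, after which limit and padding arithmetic involves only plain numbers. This sketch uses matplotlib.dates.date2num directly rather than the units-registry machinery.

import datetime
import numpy as np
import matplotlib.dates as mdates

dates = [datetime.datetime(2008, 7, d) for d in range(1, 6)]
x = np.asarray(mdates.date2num(dates), dtype=float)   # datetimes -> float days
y = np.arange(len(x), dtype=float)

pad = 0.05 * (x.max() - x.min())   # view-limit padding now sees plain floats
print(x.min() - pad, x.max() + pad)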
Andrew Straw wrote: >> may be the quickest and most general way to do it. I believe >> ~np.isfinite is both more general and significantly faster than np.isnan. > > Clever, but it won't work as-is. np.isfinite('b') returns a > NotImplementedType, and a default argument to scatter is c='b', which > gets passed to this function. Anyhow, I implemented your idea with a > check for NotImplementedType and some unit tests in r5791. > Andrew, I think there were at least two problems with the delete_masked_points function after you added the isfinite check, one of which was left over from my earlier implementation, and one new one. I ended up rewriting just about everything, including the unit tests and the docstring (which is not in rst--sorry, maybe I can fix that later). Note that the function is now in cbook. I hope the combination of the code, the docstring and the unit tests make the intended functionality clear, but it may all still be a bit confusing. In any case, I think the new version does what is needed for scatter, hexbin, and windbarb, and may turn out to be more generally useful. As a side note, I suspect the check for NotImplementedType is not robust; I can easily imagine isfinite being changed to raise an exception instead. Therefore I did not use that check. Eric
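A simplified illustration of the filtering behavior being discussed (this is not the actual cbook.delete_masked_points code): drop any index that is masked or non-finite in any of the coordinate arrays.

import numpy as np

def drop_bad_points(*arrays):
    # Mask NaN/inf in each array, then keep only indices good in all of them.
    arrays = [np.ma.masked_invalid(np.ma.asarray(a, dtype=float)) for a in arrays]
    good = ~np.ma.getmaskarray(arrays[0])
    for a in arrays[1:]:
        good &= ~np.ma.getmaskarray(a)
    return [np.asarray(a[good]) for a in arrays]

x = [1.0, 2.0, np.nan, 4.0]
y = np.ma.array([10.0, 20.0, 30.0, 40.0], mask=[0, 1, 0, 0])
print(drop_bad_points(x, y))   # keeps only indices 0 and 3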
Christopher Barker wrote: > Jeff Whitaker wrote: >> I checked >> Shewchuk's web page and unfortunately his code comes with this license: > > ... > > How I wish people would just pick a known Open Source License -- it's > not like there are a shortage of them! Might it be worth a note to > Shewchuk asking him if we can put it in MPL? -- though it doesn't look > promising. Because he (or his institution's technology transfer department) wants to forbid commercial use without paying them money. He doesn't want Triangle to be open source. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
Ryan May wrote: > Hi, > > As promised, here's a short patch to add get_offsets() and set_offsets() > to the Collections() base class. I tried to make it do the right thing > with regard to _offsets vs. _uniform_offsets, depending on whether > _uniform_offsets is None. I also had tried to make __init__ use > set_offsets, but that proved to be too much of a hassle for too little > code reuse. > > Comments? I have applied this patch along with entries in API_CHANGES and CHANGELOG. It looks correct in the context of the present code, and since it is completely new functionality it can't hurt. Overall, however, the handling of offsets is pretty confusing. I suspect this is my fault; it probably results from my wanting to make it easy to use the LineCollection for waterfall plots. I looked to see whether it could be simplified by using an appropriate offsetTransform, but this is at best not straightforward or obvious. Eric
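For context, a small usage sketch of the accessor pair being added, written against today's API (the scatter call just produces a collection to operate on):

import numpy as np
import matplotlib.pyplot as plt

# Move every marker in a scatter's collection after creation via set_offsets().
fig = plt.figure()
ax = fig.add_subplot(111)
coll = ax.scatter([1.0, 2.0, 3.0], [1.0, 4.0, 9.0])
offsets = coll.get_offsets()                       # (N, 2) array of x, y pairs
coll.set_offsets(offsets + np.array([0.5, 0.0]))   # shift all markers right
plt.show()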
John Hunter wrote:
> On Mon, Jul 21, 2008 at 3:12 AM, Klaus Zimmermann <kla...@fm...> wrote:
>> Hello *,
>> right now the NonUniformImage class in image.py uses numpy's asarray method. All similar classes instead use numpy.ma.asarray, thus allowing for masked images. I think this should be changed as in the attached patch.
>> Otherwise thanks for matplotlib :)
>
> Eric, could you take a look at this? Although the patch is trivial, I just want to make sure that the image extension code will do the right thing if a masked array gets passed into it. I do not see any special handling in _image.pcolor, so I am not sure what happens when a masked array gets passed in.
>
> JDH

John, Klaus,

We already were using masked arrays for some image types even when we did not need to do so (and when it was inappropriate), in which case the mask was ignored. Masked arrays are handled automatically as needed by the ScalarMappable.to_rgba() method.

What we really wanted, and the change I made throughout image.py, is to keep masked input as masked, and to ensure that anything else is a plain ndarray. This is now committed. I considered using np.asanyarray(A) but rejected it because it could fail for matrix input if any code is expecting iteration or single-indexing to return a 1-D array.

We lack examples to test masking of various types of input in the various types of image, though. Maybe I will add that later. Klaus, if you have any nice, small examples you would like to add to the mpl examples directory, that illustrate features or use cases that are not exercised in any present examples, please submit them.

Eric
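A one-function illustration of the policy described above (keep masked input masked so ScalarMappable.to_rgba() can honor the mask; coerce everything else to a plain base-class ndarray). The function name is made up; the real change lives inside image.py.

import numpy as np

def normalize_image_input(A):
    # Keep masked arrays as-is; to_rgba() will handle the mask later.
    if isinstance(A, np.ma.MaskedArray):
        return A
    # Lists, matrices, etc. become a plain base-class ndarray.
    return np.asarray(A)

print(type(normalize_image_input([[1, 2], [3, 4]])))
print(type(normalize_image_input(np.ma.array([[1, 2]], mask=[[0, 1]]))))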
Arrg! When am I going to learn not to click "send" until after I've read the entire thread!

Jeff Whitaker wrote:
> John: I just contacted NCAR again, and it seems that they have relicensed the software under an OSI-based license similar to the University of Illinois/NCSA: ...
> What do you think? If it's OK I say we use the natgrid package in matplotlib, since it's more bulletproof than the scikits package (it passes Robert's degenerate triangulation test, and has been pounded on by users of NCAR graphics since the 1980's).

That would be nice, but while it is a good solution to the re-gridding problem, it doesn't appear to provide a general-purpose Delaunay triangulation solution, which is too bad -- it would be nice to have that in MPL.

-Chris

-- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chr...@no...
On Mon, Jul 21, 2008 at 1:06 PM, Eric Firing <ef...@ha...> wrote:
> John Hunter wrote:
>> On Mon, Jul 21, 2008 at 7:51 AM, David Kaplan <Dav...@ir...> wrote:
>>> 2) Can someone explain to me why is_string_like in the cbook doesn't just do isinstance(obj, str)? Is there anything "string like" that won't be caught by this isinstance call?
>>
>> In [65]: s = u'jdh'
>>
>> In [66]: isinstance(s, str)
>> Out[66]: False
>>
>> In [67]: isinstance(s, unicode)
>> Out[67]: True
>>
>> So we could check for str or unicode, but a user may be using a custom string-like class from some c++ extension code that is part of a large in-house API. The point is that we don't care if it *is* a string, we just want it to act like a string.
>> http://en.wikipedia.org/wiki/Duck_typing
>
> Sometimes we have
>
>     if is_string_like(s) and s == 'some string': do_something()
>
> I think that as long as we know s is not None (which often is something that is being checked first) then it would be simpler, faster, and more readable to use
>
>     if str(s) == 'some string': do_something()
>
> John, do you see any problems with this? I think str(s) is guaranteed to return a string--that is, not to fail--for any python object, correct?

My guess is that your code will work, but I am disinclined to remove a check that is working and was probably added to satisfy some corner case we are forgetting about. str can fail, BTW:

In [35]: class Evil:
   ....:     def __str__(self):
   ....:         raise NotImplementedError
   ....:

In [36]: evil = Evil()

In [37]: print str(evil)
------------------------------------------------------------
Traceback (most recent call last):
  File "<ipython console>", line 1, in ?
  File "<ipython console>", line 3, in __str__
NotImplementedError
Jeff Whitaker wrote:
>> Basically, Fortune's sweepline algorithm for Delaunay triangulation simply needs to be replaced with an algorithm that can be formulated using Jonathan Shewchuk's robust predicates:
>> http://www.cs.cmu.edu/~quake/robust.html

Great idea.

> I checked Shewchuk's web page and unfortunately his code comes with this license: ...

How I wish people would just pick a known Open Source License -- it's not like there's a shortage of them! Might it be worth a note to Shewchuk asking him if we can put it in MPL? -- though it doesn't look promising. Fortunately, his Robust Predicate code is, I think, in the public domain, or under a more flexible license anyway.

We've got some robust code in-house that does it all in integers, though that does limit your options. It's based on Knuth's work in:

Axioms and Hulls by Donald E. Knuth (Heidelberg: Springer-Verlag, 1992), ix+109pp. (Lecture Notes in Computer Science, no. 606.) ISBN 3-540-55611-7
http://www-cs-faculty.stanford.edu/~knuth/aah.html

Anyone know of any other code based on that work?

Nice to see this moving forward, though -- thanks, everyone!

-Chris

-- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chr...@no...
Ryan May wrote:
> John Hunter wrote:
>> On Sat, Jul 19, 2008 at 11:09 PM, Ryan May <rm...@gm...> wrote:
>>> The only issue I've seen is that scaling with PS is way too big. I've attached ps and pdf files from the same run to show the problem.
>>
>> The only thing I can think of is since you are using an identity transform and drawing in pixels, you are seeing the effect of the savefig dpi in pdf and png but not in ps, which hardcodes the dpi to 72. If this is correct, you should not see the effect if you pass dpi=72 to savefig when saving the PS and PDF. You will probably want to modify the patch to make sure your barb scales are dpi independent. I have only looked briefly at the barbs code so I could be missing something obvious, but this is the first thing that comes to mind.
>
> I don't think an IdentityTransform() implies drawing in pixels. My length 9 barb is a lot longer than 9 pixels. It looks more like ~50. I really was looking for (and thought I found) a way to draw in a resolution independent fashion. Axes coordinates are close, but unfortunately, as you stretch a figure, this can distort things.
>
>>> It should apply fine to SVN, but there are commented lines that should be switched with the ones there when set/get_offsets() are added to Collections.
>>>
>>> Comments and Suggestions?
>>>
>>> How do you guys manage committing only parts of your working copy, especially when you want to commit part of a file? I figure there's got to be a better way than multiple SVN checkouts and manually editing diffs.
>>
>> svn should do this automagically; it only commits the diff from your current working version and the svn HEAD.
>>
>>> svn up
>> # do some work
>>> svn diff # these are the changes that will be committed, just preview them
>>> svn commit -m 'my log message' # the diff will be committed
>
> I'm more interested in how you guys handle having multiple lines of development going on in a single working copy, like working on multiple separate additions to axes.py. Trying to commit only a subset of those changes is difficult as far as I can tell. Or is the advice "don't do that" and use separate working copies? What if I'm working on something big and then have a small bug fix to do on the same file? Additional working copies wouldn't be a big deal, but it seems to take forever to do a fresh checkout from sourceforge.
>
> Ryan

I think you could have a master checkout, and then use a local rsync to make copies of it for hacking around on different parts. (This is the sort of thing that is made very fast and easy with mercurial, but the mercurial-svn interface mechanisms seem to be a bit clumsy, unfortunately. Mike recently mentioned doing this sort of thing with git. I haven't looked into git much; it has the reputation of being rather hard to understand, and I have been happily using mercurial for my local work for quite a long time, so I am not eager to start getting confused by an alternative.)

Eric
John Hunter wrote:
> On Mon, Jul 21, 2008 at 7:51 AM, David Kaplan <Dav...@ir...> wrote:
>> 2) Can someone explain to me why is_string_like in the cbook doesn't just do isinstance(obj, str)? Is there anything "string like" that won't be caught by this isinstance call?
>
> In [65]: s = u'jdh'
>
> In [66]: isinstance(s, str)
> Out[66]: False
>
> In [67]: isinstance(s, unicode)
> Out[67]: True
>
> So we could check for str or unicode, but a user may be using a custom string-like class from some c++ extension code that is part of a large in-house API. The point is that we don't care if it *is* a string, we just want it to act like a string.
> http://en.wikipedia.org/wiki/Duck_typing

Sometimes we have

    if is_string_like(s) and s == 'some string': do_something()

I think that as long as we know s is not None (which often is something that is being checked first) then it would be simpler, faster, and more readable to use

    if str(s) == 'some string': do_something()

John, do you see any problems with this? I think str(s) is guaranteed to return a string--that is, not to fail--for any python object, correct?

Eric
Ryan May wrote:
> Hi,
> I noticed that offset_copy() went away in the transforms rewrite and was replaced with a trans + transforms.Affine2D().translate(x, y). This works fine for x, y in pixels. However, offset_copy would also let you specify x, y in points. How can I get that to work with the new transforms? More importantly, can I do it without knowing the dpi?

Also, it looks like examples/pylab_examples/transoffset.py is broken...

(I think more and more we need to automatically run the examples and compare against "known good" images in svn as a form of automated testing.)

-Andrew
I've been trying to add more flexible control over the axis lines, ticks, tick labels, etc., but I think I'm in over my head on this project. Again and again, I've settled on one approach only to completely rewrite it the next time I look at the code. I'm looking for some major design advice/ideas that will make the code simpler (AND maintain compatibility with the current implementation of Axis and Tick objects).

For simplicity, I've only made changes to the x-axis. Since this involves some major changes, I wrote this as a module that subclasses XTicks and XAxis, as opposed to a patch on axis.py (which would be the ultimate, but at this point unlikely, goal). The attached module is quite long at the moment, but most of it is comments and methods copied (but slightly modified) from `axis.py`. I tried to document how/why the method has changed if I copied the method.

Sorry for the code dump.

-Tony
On Mon, Jul 21, 2008 at 05:30:00AM -0500, John Hunter wrote: > On Mon, Jul 21, 2008 at 4:07 AM, Gael Varoquaux > <gae...@no...> wrote: > > Could somebody review this patch and possibly check it in? It is not > > perfect but is, IMHO, a good start that works on everything I have thrown > OK, looks good; I've committed it to svn r5798. I have commented out > the tkinter import until we figure out what if anything we need to do > in that case. Thanks John. Sorry for leaving the tk part behind. It should indeed be commented out. Gaël
On Mon, Jul 21, 2008 at 10:17 AM, David Kaplan <Dav...@ir...> wrote: > Similar level of question: What is the policy on using scipy in > matplotlib? I want to use linear interpolation, and > simple_linear_interpolation in the cbook doesn't do what I want. I > imagine that we are trying to avoid dependence on scipy. Yes, to date we've avoided a scipy dependence, which is a nuisance for developers, but makes installation a bit easier. JDH
Hi,

Similar level of question: What is the policy on using scipy in matplotlib? I want to use linear interpolation, and simple_linear_interpolation in the cbook doesn't do what I want. I imagine that we are trying to avoid dependence on scipy.

Thanks, David

On Mon, 2008-07-21 at 08:42 -0500, John Hunter wrote:
> On Mon, Jul 21, 2008 at 7:51 AM, David Kaplan <Dav...@ir...> wrote:
> > 2) Can someone explain to me why is_string_like in the cbook doesn't just do isinstance(obj, str)? Is there anything "string like" that won't be caught by this isinstance call?
>
> In [65]: s = u'jdh'
>
> In [66]: isinstance(s, str)
> Out[66]: False
>
> In [67]: isinstance(s, unicode)
> Out[67]: True
>
> So we could check for str or unicode, but a user may be using a custom string-like class from some c++ extension code that is part of a large in-house API. The point is that we don't care if it *is* a string, we just want it to act like a string.
> http://en.wikipedia.org/wiki/Duck_typing

--
**********************************
David M. Kaplan
Charge de Recherche 1
Institut de Recherche pour le Developpement
Centre de Recherche Halieutique Mediterraneenne et Tropicale
av. Jean Monnet B.P. 171
34203 Sete cedex
France
Phone: +33 (0)4 99 57 32 27
Fax: +33 (0)4 99 57 32 95
http://www.ur097.ird.fr/team/dkaplan/index.html
**********************************
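For simple 1-D linear interpolation without a scipy dependency, numpy.interp may already be enough (whether it covers David's actual use case is a separate question):

import numpy as np

xp = np.array([0.0, 1.0, 2.0, 4.0])        # known sample positions (must be increasing)
fp = np.array([0.0, 10.0, 20.0, 40.0])     # known sample values
x_new = np.linspace(0.0, 4.0, 9)
print(np.interp(x_new, xp, fp))            # piecewise-linear values at x_new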
On Mon, Jul 21, 2008 at 7:51 AM, David Kaplan <Dav...@ir...> wrote:
> 2) Can someone explain to me why is_string_like in the cbook doesn't just do isinstance(obj, str)? Is there anything "string like" that won't be caught by this isinstance call?

In [65]: s = u'jdh'

In [66]: isinstance(s, str)
Out[66]: False

In [67]: isinstance(s, unicode)
Out[67]: True

So we could check for str or unicode, but a user may be using a custom string-like class from some c++ extension code that is part of a large in-house API. The point is that we don't care if it *is* a string, we just want it to act like a string.

http://en.wikipedia.org/wiki/Duck_typing
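To make the duck-typing point concrete, a check in this spirit can simply try string concatenation. This is an illustrative sketch, not necessarily the exact cbook.is_string_like implementation:

def is_string_like(obj):
    """Return True if obj behaves like a string (concatenates with one)."""
    try:
        obj + ''
    except (TypeError, ValueError):
        return False
    return True

print(is_string_like(u'jdh'))   # True, even where isinstance(obj, str) would fail
print(is_string_like(3.14))     # False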