hi all, i am trying to install the recent matplotlib (0.99.1.1) but i am getting an error about wxPython not being available. i thought wx is an optional backend? in any case i would not like to use it. is there a way to install matplotlib without? my command was:

    python setup.py install --prefix=/my/local/dir

which yields:

    ================
    BUILDING MATPLOTLIB
      matplotlib: 0.99.1.1
          python: 2.5.2 (r252:60911, Jul 22 2009, 15:33:10) [GCC 4.2.4 (Ubuntu 4.2.4-1ubuntu3)]
        platform: linux2

    REQUIRED DEPENDENCIES
           numpy: 1.3.0rc2
       freetype2: found, but unknown version (no pkg-config)

    OPTIONAL BACKEND DEPENDENCIES
          libpng: found, but unknown version (no pkg-config)
         Tkinter: no
                  * TKAgg requires Tkinter
        wxPython: no
                  * wxPython not found
    Traceback (most recent call last):
      File "setup.py", line 146, in <module>
        import wx
    ImportError: No module named wx

thanks.
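If the setup.cfg route applies to this release (an assumption on my part; matplotlib source trees of that era ship a setup.cfg.template with a [gui_support] section), you can tell the build to skip the wx probe by copying setup.cfg.template to setup.cfg next to setup.py and disabling the backend there:

```
; setup.cfg (copied from setup.cfg.template, if present in the source tree)
[gui_support]
; skip building the wxPython backend so setup.py never imports wx
wxagg = False
```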
Václav Šmilauer wrote:
> I checked the official terminology, it is a kernel average smoother
> (in the sense of [1]) with special weight function exp(-(x-x0)^2/const),
> operating on irregularly-spaced data in 2d.
>
> I am not sure if that is the same as what scipy.stats.kde.gaussian_kde does,
> the documentation is terse. Can I be enlightened here?

It looks like they are both part of a similar class of methods:

http://en.wikipedia.org/wiki/Kernel_%28statistics%29

As such, it makes sense to me to have them all in SciPy, sharing code and API.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer, Emergency Response Division, NOAA/NOS/OR&R
(206) 526-6959 voice, (206) 526-6329 fax, (206) 526-6317 main reception
7600 Sand Point Way NE, Seattle, WA 98115
Chr...@no...
On 2009-10-05 15:54, Václav Šmilauer wrote:
> I am not sure if that is the same as what scipy.stats.kde.gaussian_kde does,
> the documentation is terse. Can I be enlightened here?

gaussian_kde does kernel density estimation. While many of the intermediate computations are similar, they have entirely different purposes.

http://en.wikipedia.org/wiki/Kernel_density_estimation

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
>>> about a year ago I developed for my own purposes a routine for averaging
>>> irregularly-sampled data using gaussian average.
>>
>> is this similar to Kernel Density estimation?
>>
>> http://www.scipy.org/doc/api_docs/SciPy.stats.kde.gaussian_kde.html
>
> No. It is probably closer to radial basis function interpolation (in fact, it
> almost certainly is a form of RBFs):
>
> http://docs.scipy.org/doc/scipy/reference/tutorial/interpolate.html#id1

I checked the official terminology, it is a kernel average smoother (in the sense of [1]) with special weight function exp(-(x-x0)^2/const), operating on irregularly-spaced data in 2d. I am not sure if that is the same as what scipy.stats.kde.gaussian_kde does, the documentation is terse. Can I be enlightened here?

Cheers, Vaclav

[1] http://en.wikipedia.org/wiki/Kernel_smoothing
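Václav's smoother, as he describes it (a kernel average with weights exp(-(x-x0)^2/const) over irregularly spaced 2-d samples), can be sketched in a few lines of NumPy; this is my own sketch, not his code:

```python
import numpy as np

def gaussian_smooth(xy, values, eval_xy, sigma=1.0):
    """Kernel-average (Nadaraya-Watson) smoother for scattered 2-d data.

    xy      : (n, 2) irregularly spaced sample locations
    values  : (n,)   sample values
    eval_xy : (m, 2) points at which to evaluate the smoothed field
    """
    # squared distance from every evaluation point to every sample
    d2 = ((eval_xy[:, None, :] - xy[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian weights
    return (w @ values) / w.sum(axis=1)    # normalized weighted average

# sanity check: a constant field stays constant under a weighted average
xy = np.random.rand(50, 2)
out = gaussian_smooth(xy, np.full(50, 3.0), np.array([[0.5, 0.5]]))
print(float(out[0]))
```

Unlike gaussian_kde, nothing here estimates a density; the weights only average the supplied values.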
hi, is it possible to draw unfilled scatter points? i want to use an unfilled 'o' marker with a dashed border to represent the 'expected' data, as opposed to the observed data. i can make the border dashed but haven't figured out how to make the marker not filled. any idea? thanks! -- Ernest
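One way that seems to work in recent matplotlib versions (my sketch; the thread itself did not settle on an answer): pass facecolors='none' so only the marker edge is drawn, then set a dashed linestyle on the returned collection.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, just for this sketch
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
# hollow 'o' markers: no face, only a black edge
expected = ax.scatter([1, 2, 3], [1, 4, 9], s=200, marker='o',
                      facecolors='none', edgecolors='k', label='expected')
expected.set_linestyle('--')   # dashed marker border
ax.scatter([1, 2, 3], [1.1, 3.8, 9.2], marker='o', label='observed')
ax.legend()
fig.savefig("expected_vs_observed.png")
```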
Hi everybody, I try to move an instance of matplotlib.lines.Line2D from one subplot to another. How to do that? I have tried the following code, but the Line2D does not appear on the second subplot. Thanks in advance, Julien

    ###############
    from pylab import *
    ion()
    f = figure()
    s = f.add_subplot("211")
    line = matplotlib.lines.Line2D([0,1],[0,1],color='m')
    s.add_line( line )
    s2 = f.add_subplot("212")
    draw()
    raw_input( 'press a key to remove the line and try to add it to second subplot' )
    line.remove()
    s2.add_line( line )
    draw()
    raw_input('press a key to quit')
    #####################

--
python -c "print ''.join([chr(154 - ord(c)) for c in '*9(9&(18%.\9&1+,\'Z4(55l4('])"
"When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong." (first law of AC Clarke)
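One approach that appears to work (my sketch, not a confirmed fix from the thread): after re-adding the line, re-bind its transform to the destination Axes so it is drawn in that subplot's data coordinates.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt
from matplotlib.lines import Line2D

fig = plt.figure()
ax1 = fig.add_subplot(211)
ax2 = fig.add_subplot(212)

line = Line2D([0, 1], [0, 1], color='m')
ax1.add_line(line)

line.remove()                       # detach from the first subplot
ax2.add_line(line)                  # attach to the second
line.set_transform(ax2.transData)   # draw in the new Axes' data coords

print(line in ax2.lines, line in ax1.lines)  # → True False
```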
For some reason, my earlier reply didn't seem to make it to the mailing list. Here it is in its entirety:

"""
If you assign each figure to a new number, it will keep all of those figures around in memory (because pyplot thinks you may want to use it again). The best route is to call close('all') or close(fig) with each loop iteration. 40MB per image doesn't sound way out of reason to me. How big are your images?
"""

On 10/05/2009 03:46 AM, Leo Trottier wrote:
> Hi,
>
> I think I've figured out what's going on. It's a combination of things:
>
> 1) iPython is ignorant of the problems associated with caching massive data output
> 2) iPython doesn't seem to have a good way to clear data from memory reliably (https://bugs.launchpad.net/ipython/+bug/412350)

iPython is designed for interactive use, and stores a lot of values so they can be conveniently reused later. For long-running "batch" scripts, you can use "regular" Python, or run the code in iPython such that it isn't displayed at the console (by using "import" or "%run"). The fix for bug 2) looks like it would still require some manual intervention to be useful. You're still using a tool designed for fine-grained interactive use (e.g. a pen) where one designed for automation may be more appropriate (e.g. a laser printer) :)

> 3) matplotlib/Python seems to be insufficiently aggressive in its garbage collection (??)

Is that still true after forcibly closing the figures on each loop iteration as I suggested? Many hours have been spent squashing memory leaks in matplotlib, and I am not aware of any in at least 0.98 and later (other than some unavoidable small leaks in certain GUI backends). Do you have a standalone example that illustrates this on a recent version of matplotlib?

> 4) For obvious reasons, JPGs are much bigger when stored as arrays (though they still seem to take up more memory than they should)

It's pretty easy to estimate the memory requirements for an image.
If the image is true-color (by this, I mean not color-mapped), you'll need 4 bytes per pixel for the original image, plus a cached scaled copy (the size of which depends on the output dpi), again with 4 bytes per pixel. For color-mapped images, you'll have 4-byte floats for each pixel, 4-byte rgba for the color-mapped image, and again a cached scaled copy of that. Not knowing the size of your input images, it's impossible to say if 40MB per image is way too big or not, but it's not unheard of by any means.

> Problems 1-3 seem problematic enough that they will get fixed eventually.
>
> ... but (4) is a design issue. Assuming it's possible, it looks like there could be benefits to making an array-like wrapper around PIL image objects (perhaps similar in principle to a sparse matrix). Given PIL.ImageMath, ImagePath, etc., it seems actually fairly doable. Wouldn't something like this be of major benefit to people using SciPy for anything image-related?

Are you suggesting decompressing the JPEG on-the-fly with each redraw? I'm not certain that would be fast enough for interactive use. It may be worth experimenting with, but it would require a lot of changes to how matplotlib works. It's also very tricky to get right -- I'm not aware of any image processing applications that don't ultimately store a dense matrix of uncompressed image data in memory, except for something like compressed OpenGL textures on a graphics card. PIL certainly doesn't retain the compressed JPEG in memory.

So, I'm not sure the cost/benefit tradeoff is right here -- the problems it solves can be solved much more easily without sacrificing speed in other ways. That is, if the image data is simply too large, it can be scaled before feeding it to imshow(). And generating multiple figures in batch is not a problem if the figure is explicitly closed. Hope this helps.
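Mike's per-image estimate, worked through for a hypothetical 3000x2000 photo (the dimensions are my assumption):

```python
# true-color case: 4 bytes per pixel for the RGBA buffer, plus a cached
# scaled copy of roughly the same size
w, h = 3000, 2000                 # hypothetical JPEG dimensions
original = w * h * 4              # bytes for the uncompressed image
cached = original                 # assume the scaled copy is comparable
print((original + cached) / 2 ** 20)  # → ~45.8 MiB per figure
```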
I would like to get to the bottom of any memory leaks, so if you can provide a standalone script that leaks, despite calling close(fig) in each iteration, please let me know.

Cheers, Mike
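The batch pattern Mike recommends, sketched (note: pyplot's close(fig), since Figure objects themselves have no close() method; rendering to an in-memory buffer here stands in for writing image files):

```python
import io

import matplotlib
matplotlib.use("Agg")   # no GUI: render straight to bytes/files
import matplotlib.pyplot as plt
import numpy as np

for i in range(10):
    fig, ax = plt.subplots()
    ax.imshow(np.random.rand(64, 64))
    buf = io.BytesIO()
    fig.savefig(buf, format="png")  # stand-in for e.g. "frame_%03d.png" % i
    plt.close(fig)      # free the figure instead of caching it in pyplot

print(plt.get_fignums())  # → [] (no figures left open)
```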
Scott Sinclair wrote:
> 2009年10月1日 ringobelingo <rin...@gm...>:
>> I would like to add coastlines to a map but do not want interior 'coastlines'. At present, without them my continents are not distinct enough from the data I am plotting in the background. But, when I draw them using drawcoastlines(), I also get e.g. the great lakes showing up, and this just makes my world maps look messy and distracts from my data which should be the main focus. Does anyone know of a way to force this to happen, or is it something that might be added, i.e. splitting the function into drawcoastlines() and drawicoastlines() ... ?
>
> The easiest way to address this is to obtain (or make) a GIS shapefile that has the coastlines you'd like to see. Then use the readshapefile() method on your Basemap object instead of the drawcoastlines() method.
>
>     my_map = Basemap(...)
>     my_map.readshapefile(my_shpfile_name)
>
> Cheers,
> Scott

You can also use the 'area_thresh' keyword to control the minimum size of coastline features that are drawn. area_thresh=1000000 makes the Great Lakes disappear, at the expense of making all the islands except Greenland disappear too. If that's unacceptable, you'll have to do as Scott suggests and create a shapefile without the lakes. -Jeff
Hi, I think I've figured out what's going on. It's a combination of things:

1) iPython is ignorant of the problems associated with caching massive data output
2) iPython doesn't seem to have a good way to clear data from memory reliably (https://bugs.launchpad.net/ipython/+bug/412350)
3) matplotlib/Python seems to be insufficiently aggressive in its garbage collection (??)
4) For obvious reasons, JPGs are much bigger when stored as arrays (though they still seem to take up more memory than they should)

Problems 1-3 seem problematic enough that they will get fixed eventually. ... but (4) is a design issue. Assuming it's possible, it looks like there could be benefits to making an array-like wrapper around PIL image objects (perhaps similar in principle to a sparse matrix). Given PIL.ImageMath, ImagePath, etc., it seems actually fairly doable. Wouldn't something like this be of major benefit to people using SciPy for anything image-related?

Leo

On Fri, Oct 2, 2009 at 7:45 AM, Michael Droettboom <md...@st...> wrote:
> If you assign each figure to a new number, it will keep all of those figures around in memory (because pyplot thinks you may want to use it again.) The best route is to call close('all') or fig.close() with each loop iteration.
>
> 40MB per image doesn't sound way out of reason to me. How big are your images?
>
> Mike
>
> On 10/01/2009 10:25 PM, Leo Trottier wrote:
>> I have a friend who's having strange memory issues when opening and displaying images (using Matplotlib). Here's what he says:
>>
>> #######################################
>>
>> pylab seems really inefficient: Opening a few images and displaying them eats up tons of memory, and the memory doesn't get freed.
>>
>> Starting python, and run
>>
>>     In [5]: from glob import *;
>>     In [6]: from pylab import *
>>
>> python has 33MB of memory. Run
>>
>>     In [7]: i = 1
>>     In [8]: for imname in glob("*.JPG"):
>>        ...:     im = imread(imname)
>>        ...:     figure(i); i = i+1
>>        ...:     imshow(im)
>>        ...:
>>
>> This opens 10 figures and displays them. Python takes 480MB of memory. This is crazy, for 10 images -- 40+MB of memory for each!
>>
>>     In [14]: close("all")
>>     In [15]: i = 1
>>     In [16]: for imname in glob("*.JPG"):
>>         im = imread(imname)
>>         figure(i); i = i+1
>>         imshow(im)
>>        ....:
>>
>> This closes all figures and opens them again. Python takes up 837MB of memory. And so on... Something is really wrong with memory management.
>>
>> ##### System info: ##############
>>
>> (using macosx backend)
>> 2.4GHz MacBook Pro Intel Core 2 Duo
>> 4GB 667MHz DDR2 SDRAM
>>
>>     In [5]: sys.version
>>     Out[5]: '2.6.2 (r262:71600, Oct 1 2009, 16:44:23) \n[GCC 4.2.1 (Apple Inc. build 5646)]'
>>     In [6]: numpy.__version__
>>     Out[6]: '1.3.0'
>>     In [7]: matplotlib.__version__
>>     Out[7]: '0.99.1.1'
>>     In [8]: scipy.__version__
>>     Out[8]: '0.7.1'
2009年10月1日 ringobelingo <rin...@gm...>:
> I would like to add coastlines to a map but do not want interior 'coastlines'. At present, without them my continents are not distinct enough from the data I am plotting in the background. But, when I draw them using drawcoastlines(), I also get e.g. the great lakes showing up, and this just makes my world maps look messy and distracts from my data which should be the main focus. Does anyone know of a way to force this to happen, or is it something that might be added, i.e. splitting the function into drawcoastlines() and drawicoastlines() ... ?

The easiest way to address this is to obtain (or make) a GIS shapefile that has the coastlines you'd like to see. Then use the readshapefile() method on your Basemap object instead of the drawcoastlines() method.

    my_map = Basemap(...)
    my_map.readshapefile(my_shpfile_name)

Cheers,
Scott
Phillip M. Feldman wrote:
> Hello Eric-
>
> When I look at http://matplotlib.sourceforge.net/api/colors_api.html, the documentation for LinearSegmentedColormap indicates that there is a parameter called "N" but does not specify what this parameter does. Thanks for clarifying this. I will submit a patch to the documentation.
>
> My main problem with ListedColormap is that the documentation for that function does not specify whether the resulting colormap is discrete (piecewise-constant) or continuous (piecewise-linear). I strongly suspect that ListedColormap can only be used to generate discrete colormaps, and I need both types.

Yes, ListedColormap is for discrete maps, and LinearSegmentedColormap is for piecewise-linear maps (to within the resolution provided by the size of the look-up table). There is a new staticmethod of the latter, from_list(), which simplifies the colormap generation for the very common case you are concerned with. It may be that its API needs some tweaking; it probably needs an example in the examples directory (if there isn't one already); etc.

I don't agree with your comment about the ListedColormap documentation; I think it is reasonably clear, if you read the whole docstring, and especially the description of N.

Eric

> Phillip
>
> Eric Firing wrote:
>> Dr. Phillip M. Feldman wrote:
>>> The attached script creates a colormap containing five colors. At the end of the script, I print the value of cmap.N, and it is 256 rather than 5.
>>>
>>> http://www.nabble.com/file/p25740788/bugdemo.py bugdemo.py
>>
>> No, it is not a bug. You never told LinearSegmentedColormap that you wanted a map with other than 256 colors, which is the default. If you want a colormap with 3 colors, use a ListedColormap. In fact, as I wrote before, this, together with a BoundaryNorm, is the best way to handle the mapping of discrete colors.
>>
>> Eric
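Eric's two cases side by side (a sketch; 'mymap' and the color list are illustrative):

```python
from matplotlib.colors import (BoundaryNorm, LinearSegmentedColormap,
                               ListedColormap)

colors = ['red', 'orange', 'yellow', 'green', 'blue']

# discrete (piecewise-constant): one entry per color, with a BoundaryNorm
# mapping data intervals onto those entries
discrete = ListedColormap(colors)
norm = BoundaryNorm([0, 1, 2, 3, 4, 5], discrete.N)

# continuous (piecewise-linear) through the same colors, via the
# staticmethod Eric mentions; N defaults to a 256-entry look-up table
continuous = LinearSegmentedColormap.from_list('mymap', colors)

print(discrete.N, continuous.N)  # → 5 256
```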
Hello Eric- When I look at http://matplotlib.sourceforge.net/api/colors_api.html, the documentation for LinearSegmentedColormap indicates that there is a parameter called "N" but does not specify what this parameter does. Thanks for clarifying this. I will submit a patch to the documentation. My main problem with ListedColormap is that the documentation for that function does not specify whether the resulting colormap is discrete (piecewise-constant) or continuous (piecewise-linear). I strongly suspect that ListedColormap can only be used to generate discrete colormaps, and I need both types. Phillip Eric Firing wrote: > Dr. Phillip M. Feldman wrote: >> The attached script creates a colormap containing five colors. At the >> end of >> the script, I print the value of cmap.N, and it is 256 rather than 5. >> >> http://www.nabble.com/file/p25740788/bugdemo.py bugdemo.py > > No, it is not a bug. You never told LinearSegmentedColormap that you > wanted a map with other than 256 colors, which is the default. If you > want a colormap with 3 colors, use a ListedColormap. In fact, as I > wrote before, this, together with a BoundaryNorm, is the best way to > handle the mapping of discrete colors. > > Eric >
Jason Sewall wrote: > On Sun, Oct 4, 2009 at 11:45 PM, Eric Firing <ef...@ha...> wrote: >> Use the Axes.set_aspect() method for full control of the aspect ratio, and >> of what gets changed to preserve that aspect ratio. > > Thanks, that works great! Any ideas about Arc's effect on the 'tight' bounds? Sorry, I don't know anything at all about that offhand, and I can't look right now. Eric
On Sun, Oct 4, 2009 at 11:45 PM, Eric Firing <ef...@ha...> wrote: > Use the Axes.set_aspect() method for full control of the aspect ratio, and > of what gets changed to preserve that aspect ratio. Thanks, that works great! Any ideas about Arc's effect on the 'tight' bounds?
Jason Sewall wrote: > While I'm at it, I might as well as about image dimensions vs. axes limits. > > If I change ax.axis('equal') manually to the correct bounding box of > the visible stuff (i.e. ax.axis([1.25, 5, 0, 6])), I then get an image > that is distorted, presumably to fit some pre-set aspect ratio. Is > there a way to fix this, too? I guess what I'd like is a way to say: > "When you save or render this image, set the output image dimensions > to be appropriate to aspect ratio x (usually 1:1 for me) and the > current axis setting" Use the Axes.set_aspect() method for full control of the aspect ratio, and of what gets changed to preserve that aspect ratio. Eric > > I've used matplotlib for more than a few projects, but this aspect of > it has eluded me. > > Thanks for all your help, > Jason
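A minimal sketch of Eric's suggestion, using Jason's bounding box from earlier in the thread:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([1.25, 5], [0, 6])
ax.axis([1.25, 5, 0, 6])        # pin the data limits explicitly
# aspect=1: one data unit is the same length on both axes;
# adjustable='box' reshapes the Axes box instead of the data limits
ax.set_aspect(1, adjustable='box')
fig.savefig("fig.pdf", bbox_inches='tight', pad_inches=0)
```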
On Sun, Oct 4, 2009 at 10:57 PM, Jae-Joon Lee <lee...@gm...> wrote:
> MPL, by default, does not do any clipping. So, i'm not sure why the extent of the circle matters (I think even the bbox_inches option does not take care of that). Everything should be fine as far as you create a figure in an appropriate size. Changing the subplot parameters (or something similar) is not your option? If possible, can you post a simplified version of your code that demonstrates the problem?

Sure:

    -------- bounds.py ---------
    #!/usr/bin/python
    import matplotlib
    import pylab

    if __name__ == '__main__':
        pylab.clf()
        ax = pylab.axes([0,0,1,1])
        ax.add_line(matplotlib.lines.Line2D((2, 5), (0.75, 6)))
        ax.add_patch(matplotlib.patches.Arc((2, 0), 1.5, 1.5, 0.0, 0, 180.0, fill=False))
        ax.axis('equal')
        pylab.savefig("fig-a.pdf", bbox_inches='tight', pad_inches=0)
    --------------------------------

What this does is make a stupid little picture of a hemi-circle and a line going off somewhere. At any rate, the y dimensions of this image are something like [-0.75, 6], while I'd expect it to be [0, 6] - the 'imaginary' lower half of the circle is being used to compute the final axes size. The y dimensions do not change if I choose ax.axis('tight').

While I'm at it, I might as well ask about image dimensions vs. axes limits. If I change ax.axis('equal') manually to the correct bounding box of the visible stuff (i.e. ax.axis([1.25, 5, 0, 6])), I then get an image that is distorted, presumably to fit some pre-set aspect ratio. Is there a way to fix this, too? I guess what I'd like is a way to say: "When you save or render this image, set the output image dimensions to be appropriate to aspect ratio x (usually 1:1 for me) and the current axis setting."

I've used matplotlib for more than a few projects, but this aspect of it has eluded me.

Thanks for all your help,
Jason
use the "extent" argument:

http://matplotlib.sourceforge.net/api/pyplot_api.html#matplotlib.pyplot.imshow

-JJ

On Fri, Oct 2, 2009 at 7:21 PM, shamster <eri...@gm...> wrote:
> My current plot takes data to construct a 2d histogram. In gnuplot i would accomplish this by using splot, dgrid3d, and pulling in a 'matrix' data file. The code below has produced nearly what I need. However, the axes limits are set based on the indices of the incoming data (i.e. the number of rows or columns in my data matrix) instead of something meaningful. For example, my x-axis is set from 0 to 2000, but I'd like it to span from -1 to +1 because my x-data is a cosine function...
>
> I would imagine that some type of axis-scaling function would take a function to scale the labels... something akin to:
>
>     scale_axis(lambda x: x*.001-1.0, ax)
>
> would do the trick of scaling my (0, 2000) data to the (-1.0, 1.0) extents. Is there any such functionality to actually scale the values of the axis tick-labels?
>
> <code snippet>
>     import sys
>     from scipy import *
>     import scipy.io.array_import
>
>     # plotting libs
>     import matplotlib.pyplot as plt
>     from pylab import *
>
>     file = scipy.io.array_import.read_array(sys.argv[1])
>     data = []
>
>     for i in range(len(file[0])):
>         data.append(file[:,i])
>
>     # create a figure
>     fig = figure(1)
>
>     cmBase = cm.jet # choose a colormap to use for the plot
>
>     # This processes the data and adds the colored 2D histogram to the figure
>     im = imshow(data, interpolation='bilinear')
>     # A little magic to create a colorbar - legend for the plot
>     plt.colorbar(im)
>
>     # Turn on the X-Y grid
>     grid(True)
>
>     # Pass go - collect 200ドル
>     draw()
>     show()
> </code snippet>
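JJ's pointer, sketched for the cosine case from the question (the array shape here is illustrative): extent relabels the axes in data units without touching the array itself.

```python
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np

data = np.random.rand(100, 2000)        # stand-in for the 2d histogram
fig, ax = plt.subplots()
# extent=(left, right, bottom, top): columns 0..2000 become x in [-1, 1]
im = ax.imshow(data, interpolation='bilinear', aspect='auto',
               extent=(-1.0, 1.0, 0.0, 1.0))
fig.colorbar(im)
print(ax.get_xlim())  # → (-1.0, 1.0)
```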
MPL, by default, does not do any clipping. So, i'm not sure why the extent of the circle matters (I think even the bbox_inches option does not take care of that). Everything should be fine as far as you create a figure in an appropriate size. Changing the subplot parameters (or something similar) is not your option? If possible, can you post a simplified version of your code that demonstrates the problem?

Regards,
-JJ

On Sun, Oct 4, 2009 at 6:32 PM, Jason Sewall <jas...@gm...> wrote:
> I'm drawing a figure for inclusion into a LaTeX document, and I'm having trouble getting matplotlib to export a .pdf that 'tightly' encapsulates the drawing that is generated - I get a lot of whitespace at the bottom, in particular.
>
> After some experimentation, I've found that the culprit is an Arc that I am drawing near the bottom of the page. The angle range is something like pi/4 to 3*pi/4, so the actual line is located well above the center of the circle defining the arc, but it appears that the arc is reporting a bounding box that is the same as would be reported for a circle.
>
> Perhaps this isn't a bug and this behavior is useful to some, but I'd really like to be able to get the resulting image fit tightly around the drawn lines and not invisible bounding boxes.
>
> Can anyone help me out? Is there some way I can force matplotlib to 'clip' the output?
>
> Thanks,
> Jason
>
> P.S. I'm using matplotlib v. 0.98.5.2 on Python 2.6 running on x86_64 Linux
Ryan May wrote: > On Sun, Oct 4, 2009 at 4:21 PM, Robert Kern <rob...@gm...> wrote: >> On 2009年10月04日 15:27 PM, Christopher Barker wrote: >>> Václav Šmilauer wrote: >>> >>>> about a year ago I developed for my own purposes a routine for averaging >>>> irregularly-sampled data using gaussian average. >>> is this similar to Kernel Density estimation? >>> >>> http://www.scipy.org/doc/api_docs/SciPy.stats.kde.gaussian_kde.html >> No. It is probably closer to radial basis function interpolation (in fact, it >> almost certainly is a form of RBFs): >> >> http://docs.scipy.org/doc/scipy/reference/tutorial/interpolate.html#id1 > > Except in radial basis function interpolation, you solve for the > weights that give the original values at the original data points. > Here, it's just a inverse-distance weighted average, where the weights > are chosen using an exp(-x^2/A) relation. There's a huge difference > between the two when you're dealing with data with noise. Fair point. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
On Sun, Oct 4, 2009 at 4:21 PM, Robert Kern <rob...@gm...> wrote: > On 2009年10月04日 15:27 PM, Christopher Barker wrote: >> Václav Šmilauer wrote: >> >>> about a year ago I developed for my own purposes a routine for averaging >>> irregularly-sampled data using gaussian average. >> >> is this similar to Kernel Density estimation? >> >> http://www.scipy.org/doc/api_docs/SciPy.stats.kde.gaussian_kde.html > > No. It is probably closer to radial basis function interpolation (in fact, it > almost certainly is a form of RBFs): > > http://docs.scipy.org/doc/scipy/reference/tutorial/interpolate.html#id1 Except in radial basis function interpolation, you solve for the weights that give the original values at the original data points. Here, it's just a inverse-distance weighted average, where the weights are chosen using an exp(-x^2/A) relation. There's a huge difference between the two when you're dealing with data with noise. Ryan -- Ryan May Graduate Research Assistant School of Meteorology University of Oklahoma
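Ryan's distinction shows up numerically. A sketch with a made-up 1-d data set: solving for RBF weights reproduces the samples exactly, while the Gaussian-weighted average only smooths toward them.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.5, 4.0])   # made-up sample locations
y = np.array([0.0, 2.0, 1.0, 3.0])   # made-up (noisy) sample values
sigma = 1.0

def kernel(a, b):
    """Gaussian kernel matrix exp(-(a_i - b_j)^2 / (2 sigma^2))."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))

K = kernel(x, x)

# RBF interpolation: solve K w = y so the model passes through the data
w = np.linalg.solve(K, y)
rbf = K @ w

# kernel-weighted average (smoother): no solve, just normalized weights
smooth = (K @ y) / K.sum(axis=1)

print(np.allclose(rbf, y))     # → True  (interpolates the samples)
print(np.allclose(smooth, y))  # → False (pulls values toward neighbors)
```

With noisy data, the averaging behavior of the smoother is exactly why it differs from interpolation: it does not chase every sample.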