matplotlib-checkins — Commit notification. DO NOT POST to this list, just subscribe to it.

From: aishwarya s. <ais...@gm...> - 2018-05-16 14:21:28
Dear all,
I wanted to replicate the effect of:
 import matplotlib.pyplot as plt
 plt.imshow(image1)
 plt.pause(0.7)
 plt.imshow(image2)
in code making use of Tkinter and FigureCanvasTkAgg, which looks something like this:
f = Figure(figsize=(6,6))
 a = f.add_subplot(111)
 image = np.array(np.random.random((1024,1024))*100,dtype=int)
 a.imshow(image)
 canvas = FigureCanvasTkAgg(f, self)
 canvas.show()
 canvas.get_tk_widget().pack(side=tk.BOTTOM, fill=tk.BOTH, expand=True)
If I want to plot another image2 in the same graph, how could I do it?
I tried another method but was not able to obtain the desired output:
 self.i = 0
 def Plot():
     print self.i
     self.f = Figure(figsize=(6,6))
     self.a = self.f.add_subplot(111)
     image = np.array(np.random.random((1024,1024))*100, dtype=int)
     if self.i < 20:
         self.i = self.i + 1
         self.a.imshow(image)
         self.canvas = FigureCanvasTkAgg(self.f, self)
         self.canvas.show()
         self.parent.after(500, Plot)
 Plot()
 self.canvas.get_tk_widget().pack(side=tk.BOTTOM, fill=tk.BOTH, expand=True)
 toolbar = NavigationToolbar2TkAgg(self.canvas, self)
 toolbar.update()
 self.canvas._tkcanvas.pack(side=tk.TOP, fill=tk.BOTH, expand=True)
Awaiting your reply.
-- 
Regards,
Aishwarya Selvaraj
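One approach often suggested for this kind of task is to create the AxesImage once and then update it in place with set_data(), driving redraws from Tk's after() loop instead of plt.pause(). A hedged, self-contained sketch (the widget layout and the 0.7 s interval are assumptions, not from this thread):

import tkinter as tk   # Python 3; on Python 2 use `import Tkinter as tk`
import numpy as np
from matplotlib.figure import Figure
from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg

root = tk.Tk()
fig = Figure(figsize=(6, 6))
ax = fig.add_subplot(111)
# Create the image artist once; later frames only swap its data.
im = ax.imshow(np.zeros((1024, 1024)), vmin=0, vmax=100)

canvas = FigureCanvasTkAgg(fig, master=root)
canvas.get_tk_widget().pack(side=tk.BOTTOM, fill=tk.BOTH, expand=True)

def update():
    image = np.array(np.random.random((1024, 1024)) * 100, dtype=int)
    im.set_data(image)        # replace the pixel data in place
    canvas.draw()             # redraw the Tk canvas
    root.after(700, update)   # roughly the plt.pause(0.7) cadence

update()
root.mainloop()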
From: <fer...@us...> - 2011-12-17 21:49:43
Revision: 8989
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8989&view=rev
Author: fer_perez
Date: 2011-12-17 21:49:36 +0000 (2011-12-17)
Log Message:
-----------
Sync SVN repo with contents in current git repo
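The new files can be pulled with matplotlib's sample-data helper; a minimal sketch (assuming cbook.get_sample_data resolves files from this sample_data tree):

import matplotlib.cbook as cbook
import matplotlib.image as mpimg
import matplotlib.pyplot as plt

# Fetch one of the files added in this revision and display it.
fh = cbook.get_sample_data('moonlanding.png')
img = mpimg.imread(fh)
plt.imshow(img, cmap='gray')
plt.show()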
Modified Paths:
--------------
 trunk/sample_data/setup.py
Added Paths:
-----------
 trunk/sample_data/moonlanding.png
 trunk/sample_data/screenshots/
 trunk/sample_data/screenshots/500hgtdata.gz
 trunk/sample_data/screenshots/500hgtlats.gz
 trunk/sample_data/screenshots/500hgtlons.gz
 trunk/sample_data/screenshots/chandra.dat
 trunk/sample_data/screenshots/eeg.dat
 trunk/sample_data/screenshots/etopo20data.gz
 trunk/sample_data/screenshots/etopo20lats.gz
 trunk/sample_data/screenshots/etopo20lons.gz
 trunk/sample_data/screenshots/hst.zdat
 trunk/sample_data/screenshots/intc.csv
 trunk/sample_data/screenshots/msft.csv
 trunk/sample_data/screenshots/s1045.ima
 trunk/sample_data/stinkbug.png
Added: trunk/sample_data/moonlanding.png
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/moonlanding.png
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/500hgtdata.gz
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/500hgtdata.gz
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/500hgtlats.gz
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/500hgtlats.gz
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/500hgtlons.gz
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/500hgtlons.gz
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/chandra.dat
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/chandra.dat
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/eeg.dat
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/eeg.dat
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/etopo20data.gz
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/etopo20data.gz
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/etopo20lats.gz
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/etopo20lats.gz
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/etopo20lons.gz
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/etopo20lons.gz
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/hst.zdat
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/hst.zdat
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/sample_data/screenshots/intc.csv
===================================================================
--- trunk/sample_data/screenshots/intc.csv	 (rev 0)
+++ trunk/sample_data/screenshots/intc.csv	2011-12-17 21:49:36 UTC (rev 8989)
@@ -0,0 +1,66 @@
+Date,Open,High,Low,Close,Volume,Adj. Close*
+19-Sep-03,29.30,29.36,28.83,29.17,53550300,29.15
+18-Sep-03,28.69,29.28,28.46,29.16,46619000,29.14
+17-Sep-03,28.94,29.38,28.77,28.88,52827300,28.86
+16-Sep-03,28.08,28.99,28.02,28.91,48748000,28.89
+15-Sep-03,28.33,28.45,27.91,27.99,36350400,27.97
+12-Sep-03,27.81,28.39,27.55,28.34,51931600,28.32
+11-Sep-03,27.62,28.35,27.29,28.03,56654900,28.01
+10-Sep-03,28.44,28.49,27.63,27.66,57999300,27.64
+9-Sep-03,29.00,29.13,28.66,28.79,49792900,28.77
+8-Sep-03,28.88,29.20,28.80,29.18,52154000,29.16
+5-Sep-03,28.83,29.10,28.37,28.71,68429904,28.69
+4-Sep-03,28.32,28.74,28.04,28.60,63744700,28.58
+3-Sep-03,28.94,28.97,28.04,28.22,60714800,28.20
+2-Sep-03,28.77,28.84,28.17,28.74,58841200,28.72
+29-Aug-03,28.18,28.65,28.04,28.59,41986600,28.57
+28-Aug-03,28.10,28.35,27.85,28.30,48631600,28.28
+27-Aug-03,27.61,28.08,27.42,28.02,58217200,28.00
+26-Aug-03,26.96,27.74,26.68,27.71,65213400,27.69
+25-Aug-03,27.56,27.76,27.07,27.24,52037500,27.22
+22-Aug-03,28.16,29.04,27.32,27.39,120604096,27.37
+21-Aug-03,26.69,26.78,26.01,26.39,66434900,26.37
+20-Aug-03,26.14,26.74,26.13,26.36,47210300,26.34
+19-Aug-03,26.37,26.54,25.92,26.47,55966300,26.45
+18-Aug-03,25.10,26.23,25.05,26.19,59081000,26.17
+15-Aug-03,25.09,25.25,24.81,25.05,21622800,25.04
+14-Aug-03,24.80,25.17,24.55,25.14,51830000,25.13
+13-Aug-03,24.50,25.00,24.30,24.71,51882500,24.70
+12-Aug-03,24.09,24.40,23.82,24.37,48475100,24.36
+11-Aug-03,23.62,24.13,23.58,23.90,41624600,23.89
+8-Aug-03,24.15,24.22,23.33,23.58,57453600,23.57
+7-Aug-03,23.94,24.30,23.86,23.99,48517800,23.98
+6-Aug-03,24.10,24.55,23.81,24.14,57799000,24.13
+5-Aug-03,25.12,25.12,24.23,24.27,51979800,24.26
+4-Aug-03,24.91,25.23,24.39,25.13,53570000,25.10
+1-Aug-03,24.78,25.07,24.73,25.02,48494900,24.99
+31-Jul-03,24.80,25.35,24.68,24.89,68692096,24.86
+30-Jul-03,24.86,24.87,24.28,24.49,40786200,24.46
+29-Jul-03,24.81,25.23,24.70,24.90,68217600,24.87
+28-Jul-03,24.92,25.13,24.61,24.76,45462200,24.73
+25-Jul-03,24.17,24.94,23.73,24.91,52627700,24.88
+24-Jul-03,25.15,25.17,23.95,23.97,58119100,23.94
+23-Jul-03,24.37,24.90,24.26,24.81,49058200,24.78
+22-Jul-03,24.39,24.53,24.05,24.42,61074300,24.39
+21-Jul-03,24.60,24.63,23.95,24.06,52851200,24.03
+18-Jul-03,25.10,25.15,24.15,24.66,65296900,24.63
+17-Jul-03,24.69,25.10,24.60,24.93,71736800,24.90
+16-Jul-03,25.24,25.50,24.82,25.31,128925696,25.27
+15-Jul-03,24.44,24.77,23.67,24.10,102323696,24.07
+14-Jul-03,24.27,24.58,23.85,24.02,76909400,23.99
+11-Jul-03,23.29,23.50,23.07,23.34,46535400,23.31
+10-Jul-03,23.07,23.30,22.61,22.91,63261600,22.88
+9-Jul-03,23.30,23.99,23.25,23.48,78521904,23.45
+8-Jul-03,22.83,23.40,22.67,23.15,64980800,23.12
+7-Jul-03,22.24,22.98,22.17,22.91,56553100,22.88
+3-Jul-03,21.97,22.31,21.71,21.72,40502400,21.69
+2-Jul-03,21.66,22.32,21.47,22.21,74291504,22.18
+1-Jul-03,20.87,21.50,20.51,21.41,64496600,21.38
+30-Jun-03,21.14,21.30,20.59,20.81,51457500,20.78
+27-Jun-03,20.70,21.13,20.53,20.57,63348200,20.54
+26-Jun-03,20.30,20.76,20.15,20.63,52904900,20.60
+25-Jun-03,20.53,20.83,19.99,20.04,61250600,20.01
+24-Jun-03,20.11,20.74,20.04,20.45,63799700,20.42
+23-Jun-03,20.70,20.97,20.05,20.36,59628100,20.33
+20-Jun-03,21.34,21.42,20.64,20.67,78909400,20.64
+19-Jun-03,21.66,21.92,21.12,21.12,69563696,21.09
Added: trunk/sample_data/screenshots/msft.csv
===================================================================
--- trunk/sample_data/screenshots/msft.csv	 (rev 0)
+++ trunk/sample_data/screenshots/msft.csv	2011-12-17 21:49:36 UTC (rev 8989)
@@ -0,0 +1,66 @@
+Date,Open,High,Low,Close,Volume,Adj. Close*
+19-Sep-03,29.76,29.97,29.52,29.96,92433800,29.79
+18-Sep-03,28.49,29.51,28.42,29.50,67268096,29.34
+17-Sep-03,28.76,28.95,28.47,28.50,47221600,28.34
+16-Sep-03,28.41,28.95,28.32,28.90,52060600,28.74
+15-Sep-03,28.37,28.61,28.33,28.36,41432300,28.20
+12-Sep-03,27.48,28.40,27.45,28.34,55777200,28.18
+11-Sep-03,27.66,28.11,27.59,27.84,37813300,27.68
+10-Sep-03,28.03,28.18,27.48,27.55,54763500,27.40
+9-Sep-03,28.65,28.71,28.31,28.37,44315200,28.21
+8-Sep-03,28.39,28.92,28.34,28.84,46105300,28.68
+5-Sep-03,28.23,28.75,28.17,28.38,64024500,28.22
+4-Sep-03,28.10,28.47,27.99,28.43,59840800,28.27
+3-Sep-03,27.42,28.40,27.38,28.30,109437800,28.14
+2-Sep-03,26.70,27.30,26.47,27.26,74168896,27.11
+29-Aug-03,26.46,26.55,26.35,26.52,34503000,26.37
+28-Aug-03,26.50,26.58,26.24,26.51,46211200,26.36
+27-Aug-03,26.51,26.58,26.30,26.42,30633900,26.27
+26-Aug-03,26.31,26.67,25.96,26.57,47546000,26.42
+25-Aug-03,26.31,26.54,26.23,26.50,36132900,26.35
+22-Aug-03,26.78,26.95,26.21,26.22,65846300,26.07
+21-Aug-03,26.65,26.73,26.13,26.24,63802700,26.09
+20-Aug-03,26.30,26.53,26.00,26.45,56739300,26.30
+19-Aug-03,25.85,26.65,25.77,26.62,72952896,26.47
+18-Aug-03,25.56,25.83,25.46,25.70,45817400,25.56
+15-Aug-03,25.61,25.66,25.43,25.54,27607900,25.40
+14-Aug-03,25.66,25.71,25.52,25.63,37338300,25.49
+13-Aug-03,25.79,25.89,25.50,25.60,39636900,25.46
+12-Aug-03,25.71,25.77,25.45,25.73,38208400,25.59
+11-Aug-03,25.61,25.99,25.54,25.61,36433900,25.47
+8-Aug-03,25.88,25.98,25.50,25.58,33241400,25.44
+7-Aug-03,25.72,25.81,25.45,25.71,44258500,25.57
+6-Aug-03,25.54,26.19,25.43,25.65,56294900,25.51
+5-Aug-03,26.31,26.54,25.60,25.66,58825800,25.52
+4-Aug-03,26.15,26.41,25.75,26.18,51825600,26.03
+1-Aug-03,26.33,26.51,26.12,26.17,42649700,26.02
+31-Jul-03,26.60,26.99,26.31,26.41,64504800,26.26
+30-Jul-03,26.46,26.57,26.17,26.23,41240300,26.08
+29-Jul-03,26.88,26.90,26.24,26.47,62391100,26.32
+28-Jul-03,26.94,27.00,26.49,26.61,52658300,26.46
+25-Jul-03,26.28,26.95,26.07,26.89,54173000,26.74
+24-Jul-03,26.78,26.92,25.98,26.00,53556600,25.85
+23-Jul-03,26.42,26.65,26.14,26.45,49828200,26.30
+22-Jul-03,26.28,26.56,26.13,26.38,51791000,26.23
+21-Jul-03,26.87,26.91,26.00,26.04,48480800,25.89
+18-Jul-03,27.11,27.23,26.75,26.89,63388400,26.74
+17-Jul-03,27.14,27.27,26.54,26.69,72805000,26.54
+16-Jul-03,27.56,27.62,27.20,27.52,49838900,27.37
+15-Jul-03,27.47,27.53,27.10,27.27,53567600,27.12
+14-Jul-03,27.63,27.81,27.05,27.40,60464400,27.25
+11-Jul-03,26.95,27.45,26.89,27.31,50377300,27.16
+10-Jul-03,27.25,27.42,26.59,26.91,55350800,26.76
+9-Jul-03,27.56,27.70,27.25,27.47,62300700,27.32
+8-Jul-03,27.26,27.80,27.25,27.70,61896800,27.55
+7-Jul-03,27.02,27.55,26.95,27.42,88960800,27.27
+3-Jul-03,26.69,26.95,26.41,26.50,39440900,26.35
+2-Jul-03,26.50,26.93,26.45,26.88,94069296,26.73
+1-Jul-03,25.59,26.20,25.39,26.15,60926000,26.00
+30-Jun-03,25.94,26.12,25.50,25.64,48073100,25.50
+27-Jun-03,25.95,26.34,25.53,25.63,76040304,25.49
+26-Jun-03,25.39,26.51,25.21,25.75,51758100,25.61
+25-Jun-03,25.64,25.99,25.14,25.26,60483500,25.12
+24-Jun-03,25.65,26.04,25.52,25.70,51820300,25.56
+23-Jun-03,26.14,26.24,25.49,25.78,52584500,25.64
+20-Jun-03,26.34,26.38,26.01,26.33,86048896,26.18
+19-Jun-03,26.09,26.39,26.01,26.07,63626900,25.92
Added: trunk/sample_data/screenshots/s1045.ima
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/screenshots/s1045.ima
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Modified: trunk/sample_data/setup.py
===================================================================
--- trunk/sample_data/setup.py	2011-02-18 19:28:29 UTC (rev 8988)
+++ trunk/sample_data/setup.py	2011-12-17 21:49:36 UTC (rev 8989)
@@ -1,7 +1,7 @@
 from distutils.core import setup
 
 setup(name='mpl_sampledata',
- version='1.0.1',
+ version='1.1.0',
 description='matplotlib sample data',
 author='John Hunter',
 author_email='jd...@gm...',
Added: trunk/sample_data/stinkbug.png
===================================================================
(Binary files differ)
Property changes on: trunk/sample_data/stinkbug.png
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
From: <wea...@us...> - 2011-02-18 19:28:35
Revision: 8988
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8988&view=rev
Author: weathergod
Date: 2011-02-18 19:28:29 +0000 (2011-02-18)
Log Message:
-----------
scatter() now supports empty arrays as input (i.e., scatter([], []) is now valid).
Enabling this required fixes in both scatter() and in collections.py.
Collections now support empty versions of themselves more correctly due to these fixes.
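A quick illustration of the new behavior (a hedged sketch, not part of the commit; the figure setup is assumed):

import matplotlib.pyplot as plt

fig = plt.figure()
ax = fig.add_subplot(111)
ax.scatter([], [])                 # previously raised; now yields an empty collection
ax.scatter([1, 2, 3], [3, 1, 2])   # later calls add real data as usual
plt.show()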
Modified Paths:
--------------
 trunk/matplotlib/CHANGELOG
 trunk/matplotlib/lib/matplotlib/axes.py
 trunk/matplotlib/lib/matplotlib/collections.py
Modified: trunk/matplotlib/CHANGELOG
===================================================================
--- trunk/matplotlib/CHANGELOG	2011-02-17 16:44:28 UTC (rev 8987)
+++ trunk/matplotlib/CHANGELOG	2011-02-18 19:28:29 UTC (rev 8988)
@@ -1,3 +1,6 @@
+2011-02-18 scatter([], []) is now valid. Also fixed issues
+ with empty collections - BVR
+
 2011-02-07 Quick workaround for dviread bug #3175113 - JKS
 
 2011-02-05 Add cbook memory monitoring for Windows, using
Modified: trunk/matplotlib/lib/matplotlib/axes.py
===================================================================
--- trunk/matplotlib/lib/matplotlib/axes.py	2011-02-17 16:44:28 UTC (rev 8987)
+++ trunk/matplotlib/lib/matplotlib/axes.py	2011-02-18 19:28:29 UTC (rev 8988)
@@ -5832,24 +5832,17 @@
 else:
 collection.autoscale_None()
 
- temp_x = x
- temp_y = y
-
- minx = np.amin(temp_x)
- maxx = np.amax(temp_x)
- miny = np.amin(temp_y)
- maxy = np.amax(temp_y)
-
- w = maxx-minx
- h = maxy-miny
-
 # the pad is a little hack to deal with the fact that we don't
 # want to transform all the symbols whose scales are in points
 # to data coords to get the exact bounding box for efficiency
 # reasons. It can be done right if this is deemed important
- padx, pady = 0.05*w, 0.05*h
- corners = (minx-padx, miny-pady), (maxx+padx, maxy+pady)
- self.update_datalim( corners)
+ # Also, only bother with this padding if there is anything to draw.
+ if self._xmargin < 0.05 and x.size > 0 :
+ self.set_xmargin(0.05)
+
+ if self._ymargin < 0.05 and x.size > 0 :
+ self.set_ymargin(0.05)
+
 self.autoscale_view()
 
 # add the collection last
Modified: trunk/matplotlib/lib/matplotlib/collections.py
===================================================================
--- trunk/matplotlib/lib/matplotlib/collections.py	2011-02-17 16:44:28 UTC (rev 8987)
+++ trunk/matplotlib/lib/matplotlib/collections.py	2011-02-18 19:28:29 UTC (rev 8988)
@@ -59,6 +59,8 @@
 scalar mappable will be made to set the face colors.
 """
 _offsets = np.array([], np.float_)
+ # _offsets must be a Nx2 array!
+ _offsets.shape = (0, 2)
 _transOffset = transforms.IdentityTransform()
 _transforms = []
 
@@ -95,10 +97,11 @@
 
 self._uniform_offsets = None
 self._offsets = np.array([], np.float_)
+ # Force _offsets to be Nx2
+ self._offsets.shape = (0, 2)
 if offsets is not None:
 offsets = np.asarray(offsets)
- if len(offsets.shape) == 1:
- offsets = offsets[np.newaxis,:] # Make it Nx2.
+ offsets.shape = (-1, 2) # Make it Nx2
 if transOffset is not None:
 self._offsets = offsets
 self._transOffset = transOffset
@@ -148,13 +151,17 @@
 transOffset = self._transOffset
 offsets = self._offsets
 paths = self.get_paths()
+
+
 if not transform.is_affine:
 paths = [transform.transform_path_non_affine(p) for p in paths]
 transform = transform.get_affine()
 if not transOffset.is_affine:
 offsets = transOffset.transform_non_affine(offsets)
 transOffset = transOffset.get_affine()
+
 offsets = np.asarray(offsets, np.float_)
+ offsets.shape = (-1, 2) # Make it Nx2
 
 result = mpath.get_path_collection_extents(
 transform.frozen(), paths, self.get_transforms(),
@@ -176,6 +183,7 @@
 offsets = self._offsets
 paths = self.get_paths()
 
+
 if self.have_units():
 paths = []
 for path in self.get_paths():
@@ -184,17 +192,19 @@
 xs = self.convert_xunits(xs)
 ys = self.convert_yunits(ys)
 paths.append(mpath.Path(zip(xs, ys), path.codes))
- if len(self._offsets):
- xs = self.convert_xunits(self._offsets[:,0])
- ys = self.convert_yunits(self._offsets[:,1])
+
+ if offsets.size > 0:
+ xs = self.convert_xunits(offsets[:,0])
+ ys = self.convert_yunits(offsets[:,1])
 offsets = zip(xs, ys)
 
 offsets = np.asarray(offsets, np.float_)
+ offsets.shape = (-1, 2) # Make it Nx2
 
 if not transform.is_affine:
 paths = [transform.transform_path_non_affine(path) for path in paths]
 transform = transform.get_affine()
- if not transOffset.is_affine:
+ if not transOffset.is_affine :
 offsets = transOffset.transform_non_affine(offsets)
 transOffset = transOffset.get_affine()
 
@@ -258,8 +268,7 @@
 ACCEPTS: float or sequence of floats
 """
 offsets = np.asarray(offsets, np.float_)
- if len(offsets.shape) == 1:
- offsets = offsets[np.newaxis,:] # Make it Nx2.
+ offsets.shape = (-1, 2) # Make it Nx2
 #This decision is based on how they are initialized above
 if self._uniform_offsets is None:
 self._offsets = offsets
@@ -1221,6 +1230,7 @@
 offsets = zip(xs, ys)
 
 offsets = np.asarray(offsets, np.float_)
+ offsets.shape = (-1, 2) # Make it Nx2
 
 self.update_scalarmappable()
 
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
From: <js...@us...> - 2011-02-17 16:44:34
Revision: 8987
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8987&view=rev
Author: jswhit
Date: 2011-02-17 16:44:28 +0000 (2011-02-17)
Log Message:
-----------
update
Modified Paths:
--------------
 trunk/toolkits/basemap/Changelog
Modified: trunk/toolkits/basemap/Changelog
===================================================================
--- trunk/toolkits/basemap/Changelog	2011-02-17 15:51:17 UTC (rev 8986)
+++ trunk/toolkits/basemap/Changelog	2011-02-17 16:44:28 UTC (rev 8987)
@@ -2,6 +2,7 @@
 * added lic_demo.py to examples (line integral convolution,
 requires scikit.vectorplot).
 * removed deprecated NetCDFFile.
+ * added zorder keyword to drawmapscale.
 version 1.0.1 (svn revision 8967)
 * regenerated C source with cython 0.14.1.
 * added new "allsky" example from Tom Loredo.
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
From: <md...@us...> - 2011-02-17 15:51:23
Revision: 8986
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8986&view=rev
Author: mdboom
Date: 2011-02-17 15:51:17 +0000 (2011-02-17)
Log Message:
-----------
Fix long Unicode characters in SVG backend (fixes mathtext_stixsans test on narrow Python builds.)
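The diff below switches from passing characters to passing integer code points into the glyph-definition helpers; a hedged sketch of the idea (the helper here is illustrative, not the backend's actual method):

import urllib

def char_id(ps_name, code_point):
    # Keying glyph defs by the integer code point works even on narrow
    # Python builds, where a character outside the Basic Multilingual
    # Plane is split into a surrogate pair when iterating the string.
    return urllib.quote('%s-%d' % (ps_name, code_point))

cid = char_id('STIXSizeOneSym-Regular', 0x1D453)  # MATHEMATICAL ITALIC SMALL F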
Modified Paths:
--------------
 trunk/matplotlib/lib/matplotlib/backends/backend_svg.py
 trunk/matplotlib/lib/matplotlib/mathtext.py
Modified: trunk/matplotlib/lib/matplotlib/backends/backend_svg.py
===================================================================
--- trunk/matplotlib/lib/matplotlib/backends/backend_svg.py	2011-02-17 15:50:47 UTC (rev 8985)
+++ trunk/matplotlib/lib/matplotlib/backends/backend_svg.py	2011-02-17 15:51:17 UTC (rev 8986)
@@ -603,7 +603,7 @@
 if rcParams['svg.embed_char_paths']:
 new_chars = []
 for c in s:
- path = self._add_char_def(prop, c)
+ path = self._add_char_def(prop, ord(c))
 if path is not None:
 new_chars.append(path)
 if len(new_chars):
@@ -628,7 +628,7 @@
 lastgind = None
 currx = 0
 for c in s:
- charnum = self._get_char_def_id(prop, c)
+ charnum = self._get_char_def_id(prop, ord(c))
 ccode = ord(c)
 gind = cmap.get(ccode)
 if gind is None:
@@ -680,13 +680,13 @@
 font = prop
 font.set_size(self.FONT_SCALE, 72)
 ps_name = font.get_sfnt()[(1,0,0,6)]
- char_id = urllib.quote('%s-%d' % (ps_name, ord(char)))
+ char_id = urllib.quote('%s-%d' % (ps_name, char))
 char_num = self._char_defs.get(char_id, None)
 if char_num is not None:
 return None
 
 path_data = []
- glyph = font.load_char(ord(char), flags=LOAD_NO_HINTING)
+ glyph = font.load_char(char, flags=LOAD_NO_HINTING)
 currx, curry = 0.0, 0.0
 for step in glyph.path:
 if step[0] == 0: # MOVE_TO
@@ -724,7 +724,7 @@
 font = prop
 font.set_size(self.FONT_SCALE, 72)
 ps_name = font.get_sfnt()[(1,0,0,6)]
- char_id = urllib.quote('%s-%d' % (ps_name, ord(char)))
+ char_id = urllib.quote('%s-%d' % (ps_name, char))
 return self._char_defs[char_id]
 
 def _draw_mathtext(self, gc, x, y, s, prop, angle):
@@ -742,8 +742,8 @@
 
 if rcParams['svg.embed_char_paths']:
 new_chars = []
- for font, fontsize, thetext, new_x, new_y_mtc, metrics in svg_glyphs:
- path = self._add_char_def(font, thetext)
+ for font, fontsize, char, new_x, new_y_mtc, metrics in svg_glyphs:
+ path = self._add_char_def(font, char)
 if path is not None:
 new_chars.append(path)
 if len(new_chars):
@@ -760,8 +760,8 @@
 svg.append('translate(%f,%f)' % (x, y))
 svg.append('">\n')
 
- for font, fontsize, thetext, new_x, new_y_mtc, metrics in svg_glyphs:
- charid = self._get_char_def_id(font, thetext)
+ for font, fontsize, char, new_x, new_y_mtc, metrics in svg_glyphs:
+ charid = self._get_char_def_id(font, char)
 
 svg.append('<use xlink:href="#%s" transform="translate(%f,%f)scale(%f)"/>\n' %
 (charid, new_x, -new_y_mtc, fontsize / self.FONT_SCALE))
Modified: trunk/matplotlib/lib/matplotlib/mathtext.py
===================================================================
--- trunk/matplotlib/lib/matplotlib/mathtext.py	2011-02-17 15:50:47 UTC (rev 8985)
+++ trunk/matplotlib/lib/matplotlib/mathtext.py	2011-02-17 15:51:17 UTC (rev 8986)
@@ -329,10 +329,9 @@
 
 def render_glyph(self, ox, oy, info):
 oy = self.height - oy + info.offset
- thetext = unichr_safe(info.num)
 
 self.svg_glyphs.append(
- (info.font, info.fontsize, thetext, ox, oy, info.metrics))
+ (info.font, info.fontsize, info.num, ox, oy, info.metrics))
 
 def render_rect_filled(self, x1, y1, x2, y2):
 self.svg_rects.append(
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
Revision: 8985
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8985&view=rev
Author: mdboom
Date: 2011-02-17 15:50:47 +0000 (2011-02-17)
Log Message:
-----------
Update image_interps.pdf baseline image.
Modified Paths:
--------------
 trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_image/image_interps.pdf
Modified: trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_image/image_interps.pdf
===================================================================
(Binary files differ)
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
Revision: 8984
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8984&view=rev
Author: mdboom
Date: 2011-02-17 15:50:10 +0000 (2011-02-17)
Log Message:
-----------
Adding canonical baseline images
Added Paths:
-----------
 trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.pdf
 trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.png
 trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.svg
Added: trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.pdf
===================================================================
(Binary files differ)
Property changes on: trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.pdf
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.png
===================================================================
(Binary files differ)
Property changes on: trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.png
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.svg
===================================================================
--- trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.svg	 (rev 0)
+++ trunk/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.svg	2011-02-17 15:50:10 UTC (rev 8984)
@@ -0,0 +1,216 @@
+<?xml version="1.0" standalone="no"?>
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
+ "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
+<!-- Created with matplotlib (http://matplotlib.sourceforge.net/) -->
+<svg width="576pt" height="432pt" viewBox="0 0 576 432"
+ xmlns="http://www.w3.org/2000/svg"
+ xmlns:xlink="http://www.w3.org/1999/xlink"
+ version="1.1"
+ id="svg1">
+<filter id="colorAdd"><feComposite in="SourceGraphic" in2="BackgroundImage" operator="arithmetic" k2="1" k3="1"/></filter>
+<g id="figure1">
+<g id="patch1">
+<path style="fill: #ffffff; stroke: #ffffff; stroke-width: 1.000000; stroke-linejoin: round; stroke-linecap: square; opacity: 1.000000" d="M0.000000 432.000000L576.000000 432.000000L576.000000 0.000000
+L0.000000 0.000000z"/>
+</g>
+<g id="axes1">
+<g id="patch2">
+<path style="fill: #ffffff; opacity: 1.000000" d="M72.000000 388.800000L518.400000 388.800000L518.400000 43.200000
+L72.000000 43.200000z"/>
+</g>
+<g id="line2d1">
+<defs>
+ <clipPath id="p50431ccdcb28178602d99d9270004dde">
+<rect x="72.000000" y="43.200000" width="446.400000" height="345.600000"/>
+ </clipPath>
+</defs><path style="fill: none; stroke: #0000ff; stroke-width: 1.000000; stroke-linejoin: round; stroke-linecap: square; opacity: 1.000000" clip-path="url(#p50431ccdcb28178602d99d9270004dde)" d="M72.000000 388.800000L295.200000 216.000000L518.400000 43.200000"/>
+</g>
+<g id="matplotlib.axis1">
+<g id="xtick1">
+<g id="line2d2">
+<defs><path id="m30e32995789d870ad79a2e54c91cf9c6" d="M0.000000 0.000000L0.000000 -4.000000"/></defs>
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m30e32995789d870ad79a2e54c91cf9c6" x="72.000000" y="388.800000"/>
+</g></g>
+<g id="line2d3">
+<defs><path id="m9281cae24120827b11d5ea8a7ad3e96b" d="M0.000000 0.000000L0.000000 4.000000"/></defs>
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m9281cae24120827b11d5ea8a7ad3e96b" x="72.000000" y="43.200000"/>
+</g></g>
+<g id="text1">
+<defs>
+<path id="c_7a2040fe3b94fcd41d0a72c84e93b115" d="M31.781250 -66.406250q-7.609375 0.000000 -11.453125 7.500000q-3.828125 7.484375 -3.828125 22.531250q0.000000 14.984375 3.828125 22.484375q3.843750 7.500000 11.453125 7.500000q7.671875 0.000000 11.500000 -7.500000q3.843750 -7.500000 3.843750 -22.484375q0.000000 -15.046875 -3.843750 -22.531250q-3.828125 -7.500000 -11.500000 -7.500000M31.781250 -74.218750q12.265625 0.000000 18.734375 9.703125q6.468750 9.687500 6.468750 28.140625q0.000000 18.406250 -6.468750 28.109375q-6.468750 9.687500 -18.734375 9.687500q-12.250000 0.000000 -18.718750 -9.687500q-6.468750 -9.703125 -6.468750 -28.109375q0.000000 -18.453125 6.468750 -28.140625q6.468750 -9.703125 18.718750 -9.703125"/>
+<path id="c_ed3e21196fb739f392806f09ca0594ef" d="M10.687500 -12.406250l10.312500 0.000000l0.000000 12.406250l-10.312500 0.000000z"/>
+</defs>
+<g style="fill: #000000; opacity: 1.000000" transform="translate(63.250000,401.706250)scale(0.120000)">
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="xtick2">
+<g id="line2d4">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m30e32995789d870ad79a2e54c91cf9c6" x="183.600000" y="388.800000"/>
+</g></g>
+<g id="line2d5">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m9281cae24120827b11d5ea8a7ad3e96b" x="183.600000" y="43.200000"/>
+</g></g>
+<g id="text2">
+<defs>
+<path id="c_1260a2df50f305f3db244e29828f968e" d="M10.796875 -72.906250l38.718750 0.000000l0.000000 8.312500l-29.687500 0.000000l0.000000 17.859375q2.140625 -0.734375 4.281250 -1.093750q2.156250 -0.359375 4.312500 -0.359375q12.203125 0.000000 19.328125 6.687500q7.140625 6.687500 7.140625 18.109375q0.000000 11.765625 -7.328125 18.296875q-7.328125 6.515625 -20.656250 6.515625q-4.593750 0.000000 -9.359375 -0.781250q-4.750000 -0.781250 -9.828125 -2.343750l0.000000 -9.921875q4.390625 2.390625 9.078125 3.562500q4.687500 1.171875 9.906250 1.171875q8.453125 0.000000 13.375000 -4.437500q4.937500 -4.437500 4.937500 -12.062500q0.000000 -7.609375 -4.937500 -12.046875q-4.921875 -4.453125 -13.375000 -4.453125q-3.953125 0.000000 -7.890625 0.875000q-3.921875 0.875000 -8.015625 2.734375z"/>
+</defs>
+<g style="fill: #000000; opacity: 1.000000" transform="translate(174.975000,401.706250)scale(0.120000)">
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_1260a2df50f305f3db244e29828f968e" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="xtick3">
+<g id="line2d6">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m30e32995789d870ad79a2e54c91cf9c6" x="295.200000" y="388.800000"/>
+</g></g>
+<g id="line2d7">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m9281cae24120827b11d5ea8a7ad3e96b" x="295.200000" y="43.200000"/>
+</g></g>
+<g id="text3">
+<defs>
+<path id="c_42baa63129a918535c52adb20d687ea7" d="M12.406250 -8.296875l16.109375 0.000000l0.000000 -55.625000l-17.531250 3.515625l0.000000 -8.984375l17.437500 -3.515625l9.859375 0.000000l0.000000 64.609375l16.109375 0.000000l0.000000 8.296875l-41.984375 0.000000z"/>
+</defs>
+<g style="fill: #000000; opacity: 1.000000" transform="translate(286.707812,401.706250)scale(0.120000)">
+<use xlink:href="#c_42baa63129a918535c52adb20d687ea7"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="xtick4">
+<g id="line2d8">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m30e32995789d870ad79a2e54c91cf9c6" x="406.800000" y="388.800000"/>
+</g></g>
+<g id="line2d9">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m9281cae24120827b11d5ea8a7ad3e96b" x="406.800000" y="43.200000"/>
+</g></g>
+<g id="text4">
+<g style="fill: #000000; opacity: 1.000000" transform="translate(398.432812,401.550000)scale(0.120000)">
+<use xlink:href="#c_42baa63129a918535c52adb20d687ea7"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_1260a2df50f305f3db244e29828f968e" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="xtick5">
+<g id="line2d10">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m30e32995789d870ad79a2e54c91cf9c6" x="518.400000" y="388.800000"/>
+</g></g>
+<g id="line2d11">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m9281cae24120827b11d5ea8a7ad3e96b" x="518.400000" y="43.200000"/>
+</g></g>
+<g id="text5">
+<defs>
+<path id="c_ed3f3ed3ebfbd18bcb9c012009a68ad1" d="M19.187500 -8.296875l34.421875 0.000000l0.000000 8.296875l-46.281250 0.000000l0.000000 -8.296875q5.609375 -5.812500 15.296875 -15.593750q9.703125 -9.796875 12.187500 -12.640625q4.734375 -5.312500 6.609375 -9.000000q1.890625 -3.687500 1.890625 -7.250000q0.000000 -5.812500 -4.078125 -9.468750q-4.078125 -3.671875 -10.625000 -3.671875q-4.640625 0.000000 -9.796875 1.609375q-5.140625 1.609375 -11.000000 4.890625l0.000000 -9.968750q5.953125 -2.390625 11.125000 -3.609375q5.187500 -1.218750 9.484375 -1.218750q11.328125 0.000000 18.062500 5.671875q6.734375 5.656250 6.734375 15.125000q0.000000 4.500000 -1.687500 8.531250q-1.671875 4.015625 -6.125000 9.484375q-1.218750 1.421875 -7.765625 8.187500q-6.531250 6.765625 -18.453125 18.921875"/>
+</defs>
+<g style="fill: #000000; opacity: 1.000000" transform="translate(509.689062,401.706250)scale(0.120000)">
+<use xlink:href="#c_ed3f3ed3ebfbd18bcb9c012009a68ad1"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115" x="95.410156"/>
+</g>
+</g>
+</g>
+</g>
+<g id="matplotlib.axis2">
+<g id="ytick1">
+<g id="line2d12">
+<defs><path id="m3400efa6b1638b3fea9e19e898273957" d="M0.000000 0.000000L4.000000 0.000000"/></defs>
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m3400efa6b1638b3fea9e19e898273957" x="72.000000" y="388.800000"/>
+</g></g>
+<g id="line2d13">
+<defs><path id="m20b58b2501143cb5e0a5e8f1ef6f1643" d="M0.000000 0.000000L-4.000000 0.000000"/></defs>
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m20b58b2501143cb5e0a5e8f1ef6f1643" x="518.400000" y="388.800000"/>
+</g></g>
+<g id="text6">
+<g style="fill: #000000; opacity: 1.000000" transform="translate(51.015625,393.167188)scale(0.120000)">
+<use xlink:href="#c_42baa63129a918535c52adb20d687ea7"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="ytick2">
+<g id="line2d14">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m3400efa6b1638b3fea9e19e898273957" x="72.000000" y="302.400000"/>
+</g></g>
+<g id="line2d15">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m20b58b2501143cb5e0a5e8f1ef6f1643" x="518.400000" y="302.400000"/>
+</g></g>
+<g id="text7">
+<g style="fill: #000000; opacity: 1.000000" transform="translate(51.265625,306.689062)scale(0.120000)">
+<use xlink:href="#c_42baa63129a918535c52adb20d687ea7"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_1260a2df50f305f3db244e29828f968e" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="ytick3">
+<g id="line2d16">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m3400efa6b1638b3fea9e19e898273957" x="72.000000" y="216.000000"/>
+</g></g>
+<g id="line2d17">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m20b58b2501143cb5e0a5e8f1ef6f1643" x="518.400000" y="216.000000"/>
+</g></g>
+<g id="text8">
+<g style="fill: #000000; opacity: 1.000000" transform="translate(50.578125,220.367188)scale(0.120000)">
+<use xlink:href="#c_ed3f3ed3ebfbd18bcb9c012009a68ad1"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="ytick4">
+<g id="line2d18">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m3400efa6b1638b3fea9e19e898273957" x="72.000000" y="129.600000"/>
+</g></g>
+<g id="line2d19">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m20b58b2501143cb5e0a5e8f1ef6f1643" x="518.400000" y="129.600000"/>
+</g></g>
+<g id="text9">
+<g style="fill: #000000; opacity: 1.000000" transform="translate(50.828125,133.967188)scale(0.120000)">
+<use xlink:href="#c_ed3f3ed3ebfbd18bcb9c012009a68ad1"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_1260a2df50f305f3db244e29828f968e" x="95.410156"/>
+</g>
+</g>
+</g>
+<g id="ytick5">
+<g id="line2d20">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m3400efa6b1638b3fea9e19e898273957" x="72.000000" y="43.200000"/>
+</g></g>
+<g id="line2d21">
+<g ><use style="fill: none; stroke: #000000; stroke-width: 0.500000; stroke-linejoin: round; stroke-linecap: butt; opacity: 1.000000" xlink:href="#m20b58b2501143cb5e0a5e8f1ef6f1643" x="518.400000" y="43.200000"/>
+</g></g>
+<g id="text10">
+<defs>
+<path id="c_3dcfa38a02242cb63ec6726c6e70be7a" d="M40.578125 -39.312500q7.078125 1.515625 11.046875 6.312500q3.984375 4.781250 3.984375 11.812500q0.000000 10.781250 -7.421875 16.703125q-7.421875 5.906250 -21.093750 5.906250q-4.578125 0.000000 -9.437500 -0.906250q-4.859375 -0.906250 -10.031250 -2.718750l0.000000 -9.515625q4.093750 2.390625 8.968750 3.609375q4.890625 1.218750 10.218750 1.218750q9.265625 0.000000 14.125000 -3.656250q4.859375 -3.656250 4.859375 -10.640625q0.000000 -6.453125 -4.515625 -10.078125q-4.515625 -3.640625 -12.562500 -3.640625l-8.500000 0.000000l0.000000 -8.109375l8.890625 0.000000q7.265625 0.000000 11.125000 -2.906250q3.859375 -2.906250 3.859375 -8.375000q0.000000 -5.609375 -3.984375 -8.609375q-3.968750 -3.015625 -11.390625 -3.015625q-4.062500 0.000000 -8.703125 0.890625q-4.640625 0.875000 -10.203125 2.718750l0.000000 -8.781250q5.625000 -1.562500 10.531250 -2.343750q4.906250 -0.781250 9.250000 -0.781250q11.234375 0.000000 17.765625 5.109375q6.546875 5.093750 6.546875 13.781250q0.000000 6.062500 -3.468750 10.234375q-3.468750 4.171875 -9.859375 5.781250"/>
+</defs>
+<g style="fill: #000000; opacity: 1.000000" transform="translate(50.625000,47.567187)scale(0.120000)">
+<use xlink:href="#c_3dcfa38a02242cb63ec6726c6e70be7a"/>
+<use xlink:href="#c_ed3e21196fb739f392806f09ca0594ef" x="63.623047"/>
+<use xlink:href="#c_7a2040fe3b94fcd41d0a72c84e93b115" x="95.410156"/>
+</g>
+</g>
+</g>
+</g>
+<g id="patch3">
+<path style="fill: none; stroke: #000000; stroke-width: 1.000000; stroke-linejoin: round; stroke-linecap: square; opacity: 1.000000" d="M72.000000 43.200000L518.400000 43.200000"/>
+</g>
+<g id="patch4">
+<path style="fill: none; stroke: #000000; stroke-width: 1.000000; stroke-linejoin: round; stroke-linecap: square; opacity: 1.000000" d="M518.400000 388.800000L518.400000 43.200000"/>
+</g>
+<g id="patch5">
+<path style="fill: none; stroke: #000000; stroke-width: 1.000000; stroke-linejoin: round; stroke-linecap: square; opacity: 1.000000" d="M72.000000 388.800000L518.400000 388.800000"/>
+</g>
+<g id="patch6">
+<path style="fill: none; stroke: #000000; stroke-width: 1.000000; stroke-linejoin: round; stroke-linecap: square; opacity: 1.000000" d="M72.000000 388.800000L72.000000 43.200000"/>
+</g>
+</g>
+</g>
+</svg>
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
Revision: 8983
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8983&view=rev
Author: jswhit
Date: 2011-02-17 04:33:34 +0000 (2011-02-17)
Log Message:
-----------
include zorder keyword for drawmapscale.
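A hedged usage sketch of the new keyword (the projection, extents, and scale placement are illustrative assumptions, not from the commit):

from mpl_toolkits.basemap import Basemap
import matplotlib.pyplot as plt

m = Basemap(projection='merc', llcrnrlon=-100, llcrnrlat=15,
            urcrnrlon=-70, urcrnrlat=35, resolution='c')
m.drawcoastlines()
m.fillcontinents(color='0.8')
# Draw a 500 km scale bar and lift it above the filled continents.
m.drawmapscale(-95, 18, -85, 25, 500, barstyle='fancy', zorder=10)
plt.show()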
Modified Paths:
--------------
 trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py
Modified: trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py
===================================================================
--- trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py	2011-02-12 16:10:15 UTC (rev 8982)
+++ trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py	2011-02-17 04:33:34 UTC (rev 8983)
@@ -3454,7 +3454,7 @@
 def drawmapscale(self,lon,lat,lon0,lat0,length,barstyle='simple',\
 units='km',fontsize=9,yoffset=None,labelstyle='simple',\
 fontcolor='k',fillcolor1='w',fillcolor2='k',ax=None,\
- format='%d'):
+ format='%d',zorder=None):
 """
 Draw a map scale at ``lon,lat`` of length ``length``
 representing distance in the map
@@ -3487,6 +3487,7 @@
 fillcolor1(2) colors of the alternating filled regions
 (default white and black). Only relevant for
 'fancy' barstyle.
+ zorder sets the zorder for the map scale.
 ============== ====================================================
 
 Extra keyword ``ax`` can be used to override the default axis instance.
@@ -3625,6 +3626,8 @@
 fontsize=fontsize,color=fontcolor))
 else:
 raise KeyError("barstyle must be 'simple' or 'fancy'")
+ if zorder is not None:
+ rets = [ret.set_zorder(zorder) for ret in rets]
 return rets
 
 def nightshade(self,date,color="k",delta=0.25,alpha=0.5,ax=None,zorder=2):
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
From: <js...@us...> - 2011-02-12 16:10:21
Revision: 8982
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8982&view=rev
Author: jswhit
Date: 2011-02-12 16:10:15 +0000 (2011-02-12)
Log Message:
-----------
add comments.
Modified Paths:
--------------
 trunk/toolkits/basemap/examples/lic_demo.py
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011-02-12 16:08:44 UTC (rev 8981)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011-02-12 16:10:15 UTC (rev 8982)
@@ -8,7 +8,7 @@
 try:
 from scikits.vectorplot import lic_internal
 except ImportError:
- raise ImportError('need vectorplot scikit for this example')
+ raise ImportError('need vectorplot scikit for this example')
 
 # H*wind data from http://www.aoml.noaa.gov/hrd/data_sub/wind.html
 ncfile = NetCDFFile('rita.nc')
@@ -20,7 +20,7 @@
 lons, lats = np.meshgrid(lons1,lats1)
 ncfile.close()
 
-# downsample to finer grid.
+# downsample to finer grid for nicer looking plot.
 nlats = 2*udat.shape[0]; nlons = 2*udat.shape[1]
 lons = np.linspace(lons1[0],lons1[-1],nlons)
 lats = np.linspace(lats1[0],lats1[-1],nlats)
@@ -31,11 +31,13 @@
 
 fig = plt.figure(figsize=(8,8))
 m = Basemap(projection='cyl',llcrnrlat=lats1[0],llcrnrlon=lons1[0],urcrnrlat=lats1[-1],urcrnrlon=lons1[-1],resolution='i')
+# pass texture, kernel and data to LIC function from vectorplot.
 kernellen=31
 texture = np.random.rand(udat.shape[0],udat.shape[1]).astype(np.float32)
 kernel = np.sin(np.arange(kernellen)*np.pi/kernellen).astype(np.float32)
 image = lic_internal.line_integral_convolution(udat.astype(np.float32),\
 vdat.astype(np.float32), texture, kernel)
+# plot the resulting image.
 im = m.imshow(image,plt.cm.gist_stern)
 m.drawcoastlines()
 m.drawmeridians(np.arange(-120,-60,2),labels=[0,0,0,1])
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
From: <js...@us...> - 2011-02-12 16:08:50
Revision: 8981
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8981&view=rev
Author: jswhit
Date: 2011-02-12 16:08:44 +0000 (2011-02-12)
Log Message:
-----------
interpolate to finer grid for nicer looking plot
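The refinement step itself, isolated as a hedged sketch (the coarse field and grid below are synthetic stand-ins, not the rita.nc data; order=1 is used so scipy is not required):

import numpy as np
from mpl_toolkits.basemap import interp

# A coarse synthetic field on a regular lat/lon grid.
udat = np.random.rand(10, 20)
lons1 = np.linspace(-100.0, -80.0, udat.shape[1])
lats1 = np.linspace(20.0, 30.0, udat.shape[0])

# Double the resolution in each direction and interpolate onto the new grid.
nlats = 2 * udat.shape[0]; nlons = 2 * udat.shape[1]
lons = np.linspace(lons1[0], lons1[-1], nlons)
lats = np.linspace(lats1[0], lats1[-1], nlats)
lons2, lats2 = np.meshgrid(lons, lats)
udat_fine = interp(udat, lons1, lats1, lons2, lats2, order=1)
# udat_fine.shape is now (20, 40)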
Modified Paths:
--------------
 trunk/toolkits/basemap/examples/lic_demo.py
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011-02-12 16:01:55 UTC (rev 8980)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011-02-12 16:08:44 UTC (rev 8981)
@@ -2,7 +2,7 @@
 # flow field (from Hurricane Earl). Produces something akin to streamlines.
 # Requires vectorplot scikit (http://scikits.appspot.com/vectorplot).
 from netCDF4 import Dataset as NetCDFFile
-from mpl_toolkits.basemap import Basemap
+from mpl_toolkits.basemap import Basemap, interp
 import numpy as np
 import matplotlib.pyplot as plt
 try:
@@ -20,7 +20,15 @@
 lons, lats = np.meshgrid(lons1,lats1)
 ncfile.close()
 
+# downsample to finer grid.
+nlats = 2*udat.shape[0]; nlons = 2*udat.shape[1]
+lons = np.linspace(lons1[0],lons1[-1],nlons)
+lats = np.linspace(lats1[0],lats1[-1],nlats)
+lons, lats = np.meshgrid(lons, lats)
+udat = interp(udat,lons1,lats1,lons,lats,order=3)
+vdat = interp(vdat,lons1,lats1,lons,lats,order=3)
 
+
 fig = plt.figure(figsize=(8,8))
 m = Basemap(projection='cyl',llcrnrlat=lats1[0],llcrnrlon=lons1[0],urcrnrlat=lats1[-1],urcrnrlon=lons1[-1],resolution='i')
 kernellen=31
This was sent by the SourceForge.net collaborative development platform, the world's largest Open Source development site.
Revision: 8980
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8980&view=rev
Author: jswhit
Date: 2011-02-12 16:01:55 +0000 (2011-02-12)
Log Message:
-----------
obsolete file.
Removed Paths:
-------------
 trunk/toolkits/basemap/lib/mpl_toolkits/basemap/pupynere.py
Deleted: trunk/toolkits/basemap/lib/mpl_toolkits/basemap/pupynere.py
===================================================================
--- trunk/toolkits/basemap/lib/mpl_toolkits/basemap/pupynere.py	2011-02-12 14:01:34 UTC (rev 8979)
+++ trunk/toolkits/basemap/lib/mpl_toolkits/basemap/pupynere.py	2011-02-12 16:01:55 UTC (rev 8980)
@@ -1,688 +0,0 @@
-"""
-NetCDF reader/writer module.
-
-This module implements the Scientific.IO.NetCDF API to read and create
-NetCDF files. The same API is also used in the PyNIO and pynetcdf
-modules, allowing these modules to be used interchangebly when working
-with NetCDF files. The major advantage of ``scipy.io.netcdf`` over other
-modules is that it doesn't require the code to be linked to the NetCDF
-libraries as the other modules do.
-
-The code is based on the `NetCDF file format specification
-<http://www.unidata.ucar.edu/software/netcdf/guide_15.html>`_. A NetCDF
-file is a self-describing binary format, with a header followed by
-data. The header contains metadata describing dimensions, variables
-and the position of the data in the file, so access can be done in an
-efficient manner without loading unnecessary data into memory. We use
-the ``mmap`` module to create Numpy arrays mapped to the data on disk,
-for the same purpose.
-
-The structure of a NetCDF file is as follows:
-
- C D F <VERSION BYTE> <NUMBER OF RECORDS>
- <DIMENSIONS> <GLOBAL ATTRIBUTES> <VARIABLES METADATA>
- <NON-RECORD DATA> <RECORD DATA>
-
-Record data refers to data where the first axis can be expanded at
-will. All record variables share a same dimension at the first axis,
-and they are stored at the end of the file per record, ie
-
- A[0], B[0], ..., A[1], B[1], ..., etc,
-
-so that new data can be appended to the file without changing its original
-structure. Non-record data are padded to a 4n bytes boundary. Record data
-are also padded, unless there is exactly one record variable in the file,
-in which case the padding is dropped. All data is stored in big endian
-byte order.
-
-The Scientific.IO.NetCDF API allows attributes to be added directly to
-instances of ``netcdf_file`` and ``netcdf_variable``. To differentiate
-between user-set attributes and instance attributes, user-set attributes
-are automatically stored in the ``_attributes`` attribute by overloading
-``__setattr__``. This is the reason why the code sometimes uses
-``obj.__dict__['key'] = value``, instead of simply ``obj.key = value``;
-otherwise the key would be inserted into userspace attributes.
-
-To create a NetCDF file::
-
- >>> import time
- >>> f = netcdf_file('simple.nc', 'w')
- >>> f.history = 'Created for a test'
- >>> f.createDimension('time', 10)
- >>> time = f.createVariable('time', 'i', ('time',))
- >>> time[:] = range(10)
- >>> time.units = 'days since 2008-01-01'
- >>> f.close()
-
-To read the NetCDF file we just created::
-
- >>> f = netcdf_file('simple.nc', 'r')
- >>> print f.history
- Created for a test
- >>> time = f.variables['time']
- >>> print time.units
- days since 2008-01-01
- >>> print time.shape
- (10,)
- >>> print time[-1]
- 9
- >>> f.close()
-
-TODO: properly implement ``_FillValue``.
-"""
-
-__all__ = ['netcdf_file', 'netcdf_variable']
-
-
-from operator import mul
-from mmap import mmap, ACCESS_READ
-
-from numpy import fromstring, ndarray, dtype, empty, array, asarray, squeeze,\
- zeros, ma
-from numpy import little_endian as LITTLE_ENDIAN
-
-
-ABSENT = '\x00\x00\x00\x00\x00\x00\x00\x00'
-ZERO = '\x00\x00\x00\x00'
-NC_BYTE = '\x00\x00\x00\x01'
-NC_CHAR = '\x00\x00\x00\x02'
-NC_SHORT = '\x00\x00\x00\x03'
-NC_INT = '\x00\x00\x00\x04'
-NC_FLOAT = '\x00\x00\x00\x05'
-NC_DOUBLE = '\x00\x00\x00\x06'
-NC_DIMENSION = '\x00\x00\x00\n'
-NC_VARIABLE = '\x00\x00\x00\x0b'
-NC_ATTRIBUTE = '\x00\x00\x00\x0c'
-
-
-TYPEMAP = { NC_BYTE: ('b', 1),
- NC_CHAR: ('c', 1),
- NC_SHORT: ('h', 2),
- NC_INT: ('i', 4),
- NC_FLOAT: ('f', 4),
- NC_DOUBLE: ('d', 8) }
-
-REVERSE = { 'b': NC_BYTE,
- 'c': NC_CHAR,
- 'h': NC_SHORT,
- 'i': NC_INT,
- 'f': NC_FLOAT,
- 'd': NC_DOUBLE,
-
- # these come from asarray(1).dtype.char and asarray('foo').dtype.char,
- # used when getting the types from generic attributes.
- 'l': NC_INT,
- 'S': NC_CHAR }
-
-
-class netcdf_file(object):
- """
- A ``netcdf_file`` object has two standard attributes: ``dimensions`` and
- ``variables``. The values of both are dictionaries, mapping dimension
- names to their associated lengths and variable names to variables,
- respectively. Application programs should never modify these
- dictionaries.
-
- All other attributes correspond to global attributes defined in the
- NetCDF file. Global file attributes are created by assigning to an
- attribute of the ``netcdf_file`` object.
-
- """
- def __init__(self, filename, mode='r', mmap=True, version=1,\
- maskandscale=False):
- self.filename = filename
- self.use_mmap = mmap
- self.version_byte = version
- self._maskandscale = maskandscale
-
- assert mode in 'rw', "Mode must be either 'r' or 'w'."
- self.mode = mode
-
- self.dimensions = {}
- self.variables = {}
-
- self._dims = []
- self._recs = 0
- self._recsize = 0
-
- self.fp = open(self.filename, '%sb' % mode)
-
- self._attributes = {}
-
- if mode is 'r':
- self._read()
-
- def __setattr__(self, attr, value):
- # Store user defined attributes in a separate dict,
- # so we can save them to file later.
- try:
- self._attributes[attr] = value
- except AttributeError:
- pass
- self.__dict__[attr] = value
-
- def close(self):
- if not self.fp.closed:
- try:
- self.flush()
- finally:
- self.fp.close()
- __del__ = close
-
- def createDimension(self, name, length):
- self.dimensions[name] = length
- self._dims.append(name)
-
- def createVariable(self, name, type, dimensions):
- shape = tuple([self.dimensions[dim] for dim in dimensions])
- shape_ = tuple([dim or 0 for dim in shape]) # replace None with 0 for numpy
-
- if isinstance(type, basestring): type = dtype(type)
- typecode, size = type.char, type.itemsize
- dtype_ = '>%s' % typecode
- if size > 1: dtype_ += str(size)
-
- data = empty(shape_, dtype=dtype_)
- self.variables[name] = netcdf_variable(data, typecode, shape,\
- dimensions, maskandscale=self._maskandscale)
- return self.variables[name]
-
- def flush(self):
- if self.mode is 'w':
- self._write()
- sync = flush
-
- def _write(self):
- self.fp.write('CDF')
- self.fp.write(array(self.version_byte, '>b').tostring())
-
- # Write headers and data.
- self._write_numrecs()
- self._write_dim_array()
- self._write_gatt_array()
- self._write_var_array()
-
- def _write_numrecs(self):
- # Get highest record count from all record variables.
- for var in self.variables.values():
- if var.isrec and len(var.data) > self._recs:
- self.__dict__['_recs'] = len(var.data)
- self._pack_int(self._recs)
-
- def _write_dim_array(self):
- if self.dimensions:
- self.fp.write(NC_DIMENSION)
- self._pack_int(len(self.dimensions))
- for name in self._dims:
- self._pack_string(name)
- length = self.dimensions[name]
- self._pack_int(length or 0) # replace None with 0 for record dimension
- else:
- self.fp.write(ABSENT)
-
- def _write_gatt_array(self):
- self._write_att_array(self._attributes)
-
- def _write_att_array(self, attributes):
- if attributes:
- self.fp.write(NC_ATTRIBUTE)
- self._pack_int(len(attributes))
- for name, values in attributes.items():
- self._pack_string(name)
- self._write_values(values)
- else:
- self.fp.write(ABSENT)
-
- def _write_var_array(self):
- if self.variables:
- self.fp.write(NC_VARIABLE)
- self._pack_int(len(self.variables))
-
- # Sort variables non-recs first, then recs.
- variables = self.variables.items()
- variables.sort(key=lambda (k, v): v._shape and not v.isrec)
- variables.reverse()
- variables = [k for (k, v) in variables]
-
- # Set the metadata for all variables.
- for name in variables:
- self._write_var_metadata(name)
- # Now that we have the metadata, we know the vsize of
- # each record variable, so we can calculate recsize.
- self.__dict__['_recsize'] = sum([
- var._vsize for var in self.variables.values()
- if var.isrec])
- # Set the data for all variables.
- for name in variables:
- self._write_var_data(name)
- else:
- self.fp.write(ABSENT)
-
- def _write_var_metadata(self, name):
- var = self.variables[name]
-
- self._pack_string(name)
- self._pack_int(len(var.dimensions))
- for dimname in var.dimensions:
- dimid = self._dims.index(dimname)
- self._pack_int(dimid)
-
- self._write_att_array(var._attributes)
-
- nc_type = REVERSE[var.typecode()]
- self.fp.write(nc_type)
-
- if not var.isrec:
- vsize = var.data.size * var.data.itemsize
- vsize += -vsize % 4
- else: # record variable
- try:
- vsize = var.data[0].size * var.data.itemsize
- except IndexError:
- vsize = 0
- rec_vars = len([var for var in self.variables.values()
- if var.isrec])
- if rec_vars > 1:
- vsize += -vsize % 4
- self.variables[name].__dict__['_vsize'] = vsize
- self._pack_int(vsize)
-
- # Pack a bogus begin, and set the real value later.
- self.variables[name].__dict__['_begin'] = self.fp.tell()
- self._pack_begin(0)
-
- def _write_var_data(self, name):
- var = self.variables[name]
-
- # Set begin in file header.
- the_beguine = self.fp.tell()
- self.fp.seek(var._begin)
- self._pack_begin(the_beguine)
- self.fp.seek(the_beguine)
-
- # Write data.
- if not var.isrec:
- self.fp.write(var.data.tostring())
- count = var.data.size * var.data.itemsize
- self.fp.write('0' * (var._vsize - count))
- else: # record variable
- # Handle rec vars with shape[0] < nrecs.
- if self._recs > len(var.data):
- shape = (self._recs,) + var.data.shape[1:]
- var.data.resize(shape)
-
- pos0 = pos = self.fp.tell()
- for rec in var.data:
- # Apparently scalars cannot be converted to big endian. If we
- # try to convert a ``=i4`` scalar to, say, '>i4' the dtype
- # will remain as ``=i4``.
- if not rec.shape and (rec.dtype.byteorder == '<' or
- (rec.dtype.byteorder == '=' and LITTLE_ENDIAN)):
- rec = rec.byteswap()
- self.fp.write(rec.tostring())
- # Padding
- count = rec.size * rec.itemsize
- self.fp.write('0' * (var._vsize - count))
- pos += self._recsize
- self.fp.seek(pos)
- self.fp.seek(pos0 + var._vsize)
-
- def _write_values(self, values):
- values = asarray(values)
- values = values.astype(values.dtype.newbyteorder('>'))
-
- nc_type = REVERSE[values.dtype.char]
- self.fp.write(nc_type)
-
- if values.dtype.char == 'S':
- nelems = values.itemsize
- else:
- nelems = values.size
- self._pack_int(nelems)
-
- if not values.shape and (values.dtype.byteorder == '<' or
- (values.dtype.byteorder == '=' and LITTLE_ENDIAN)):
- values = values.byteswap()
- self.fp.write(values.tostring())
- count = values.size * values.itemsize
- self.fp.write('0' * (-count % 4)) # pad
-
- def _read(self):
- # Check magic bytes and version
- magic = self.fp.read(3)
- assert magic == 'CDF', "Error: %s is not a valid NetCDF 3 file" % self.filename
- self.__dict__['version_byte'] = fromstring(self.fp.read(1), '>b')[0]
-
- # Read file headers and set data.
- self._read_numrecs()
- self._read_dim_array()
- self._read_gatt_array()
- self._read_var_array()
-
- def _read_numrecs(self):
- self.__dict__['_recs'] = self._unpack_int()
-
- def _read_dim_array(self):
- header = self.fp.read(4)
- assert header in [ZERO, NC_DIMENSION]
- count = self._unpack_int()
-
- for dim in range(count):
- name = self._unpack_string()
- length = self._unpack_int() or None # None for record dimension
- self.dimensions[name] = length
- self._dims.append(name) # preserve order
-
- def _read_gatt_array(self):
- for k, v in self._read_att_array().items():
- self.__setattr__(k, v)
-
- def _read_att_array(self):
- header = self.fp.read(4)
- assert header in [ZERO, NC_ATTRIBUTE]
- count = self._unpack_int()
-
- attributes = {}
- for attr in range(count):
- name = self._unpack_string()
- attributes[name] = self._read_values()
- return attributes
-
- def _read_var_array(self):
- header = self.fp.read(4)
- assert header in [ZERO, NC_VARIABLE]
-
- begin = 0
- dtypes = {'names': [], 'formats': []}
- rec_vars = []
- count = self._unpack_int()
- for var in range(count):
- name, dimensions, shape, attributes, typecode, size, dtype_, begin_, vsize = self._read_var()
- if shape and shape[0] is None:
- rec_vars.append(name)
- self.__dict__['_recsize'] += vsize
- if begin == 0: begin = begin_
- dtypes['names'].append(name)
- dtypes['formats'].append(str(shape[1:]) + dtype_)
-
- # Handle padding with a virtual variable.
- if typecode in 'bch':
- actual_size = reduce(mul, (1,) + shape[1:]) * size
- padding = -actual_size % 4
- if padding:
- dtypes['names'].append('_padding_%d' % var)
- dtypes['formats'].append('(%d,)>b' % padding)
-
- # Data will be set later.
- data = None
- else:
- if self.use_mmap:
- mm = mmap(self.fp.fileno(), begin_+vsize, access=ACCESS_READ)
- data = ndarray.__new__(ndarray, shape, dtype=dtype_,
- buffer=mm, offset=begin_, order=0)
- else:
- pos = self.fp.tell()
- self.fp.seek(begin_)
- data = fromstring(self.fp.read(vsize), dtype=dtype_)
- data.shape = shape
- self.fp.seek(pos)
-
- # Add variable.
- self.variables[name] = netcdf_variable(
- data, typecode, shape, dimensions, attributes,
- maskandscale=self._maskandscale)
-
- if rec_vars:
- # Remove padding when only one record variable.
- if len(rec_vars) == 1:
- dtypes['names'] = dtypes['names'][:1]
- dtypes['formats'] = dtypes['formats'][:1]
-
- # Build rec array.
- if self.use_mmap:
- mm = mmap(self.fp.fileno(), begin+self._recs*self._recsize, access=ACCESS_READ)
- rec_array = ndarray.__new__(ndarray, (self._recs,), dtype=dtypes,
- buffer=mm, offset=begin, order=0)
- else:
- pos = self.fp.tell()
- self.fp.seek(begin)
- rec_array = fromstring(self.fp.read(self._recs*self._recsize), dtype=dtypes)
- rec_array.shape = (self._recs,)
- self.fp.seek(pos)
-
- for var in rec_vars:
- self.variables[var].__dict__['data'] = rec_array[var]
-
- def _read_var(self):
- name = self._unpack_string()
- dimensions = []
- shape = []
- dims = self._unpack_int()
-
- for i in range(dims):
- dimid = self._unpack_int()
- dimname = self._dims[dimid]
- dimensions.append(dimname)
- dim = self.dimensions[dimname]
- shape.append(dim)
- dimensions = tuple(dimensions)
- shape = tuple(shape)
-
- attributes = self._read_att_array()
- nc_type = self.fp.read(4)
- vsize = self._unpack_int()
- begin = [self._unpack_int, self._unpack_int64][self.version_byte-1]()
-
- typecode, size = TYPEMAP[nc_type]
- if typecode is 'c':
- dtype_ = '>c'
- else:
- dtype_ = '>%s' % typecode
- if size > 1: dtype_ += str(size)
-
- return name, dimensions, shape, attributes, typecode, size, dtype_, begin, vsize
-
- def _read_values(self):
- nc_type = self.fp.read(4)
- n = self._unpack_int()
-
- typecode, size = TYPEMAP[nc_type]
-
- count = n*size
- values = self.fp.read(count)
- self.fp.read(-count % 4) # read padding
-
- if typecode is not 'c':
- values = fromstring(values, dtype='>%s%d' % (typecode, size))
- if values.shape == (1,): values = values[0]
- else:
- values = values.rstrip('\x00')
- return values
-
- def _pack_begin(self, begin):
- if self.version_byte == 1:
- self._pack_int(begin)
- elif self.version_byte == 2:
- self._pack_int64(begin)
-
- def _pack_int(self, value):
- self.fp.write(array(value, '>i').tostring())
- _pack_int32 = _pack_int
-
- def _unpack_int(self):
- return fromstring(self.fp.read(4), '>i')[0]
- _unpack_int32 = _unpack_int
-
- def _pack_int64(self, value):
- self.fp.write(array(value, '>q').tostring())
-
- def _unpack_int64(self):
- return fromstring(self.fp.read(8), '>q')[0]
-
- def _pack_string(self, s):
- count = len(s)
- self._pack_int(count)
- self.fp.write(s)
- self.fp.write('0' * (-count % 4)) # pad
-
- def _unpack_string(self):
- count = self._unpack_int()
- s = self.fp.read(count).rstrip('\x00')
- self.fp.read(-count % 4) # read padding
- return s
-
-
-class netcdf_variable(object):
- """
- ``netcdf_variable`` objects are constructed by calling the method
- ``createVariable`` on the netcdf_file object.
-
- ``netcdf_variable`` objects behave much like array objects defined in
- Numpy, except that their data resides in a file. Data is read by
- indexing and written by assigning to an indexed subset; the entire
- array can be accessed by the index ``[:]`` or using the methods
- ``getValue`` and ``assignValue``. ``netcdf_variable`` objects also
- have attribute ``shape`` with the same meaning as for arrays, but
- the shape cannot be modified. There is another read-only attribute
- ``dimensions``, whose value is the tuple of dimension names.
-
- All other attributes correspond to variable attributes defined in
- the NetCDF file. Variable attributes are created by assigning to an
- attribute of the ``netcdf_variable`` object.
-
- """
- def __init__(self, data, typecode, shape, dimensions, attributes=None,\
- maskandscale=False):
- self.data = data
- self._typecode = typecode
- self._shape = shape
- self.dimensions = dimensions
- self._maskandscale = maskandscale
-
- self._attributes = attributes or {}
- for k, v in self._attributes.items():
- self.__dict__[k] = v
-
- def __setattr__(self, attr, value):
- # Store user defined attributes in a separate dict,
- # so we can save them to file later.
- try:
- self._attributes[attr] = value
- except AttributeError:
- pass
- self.__dict__[attr] = value
-
- @property
- def isrec(self):
- return self.data.shape and not self._shape[0]
-
- @property
- def shape(self):
- return self.data.shape
-
- def __len__(self):
- return self.data.shape[0]
-
- def getValue(self):
- return self.data.item()
-
- def assignValue(self, value):
- self.data.itemset(value)
-
- def typecode(self):
- return self._typecode
-
- def __getitem__(self, index):
- data = squeeze(self.data[index])
- if self._maskandscale:
- return _unmaskandscale(self,data)
- else:
- return data
-
- def __setitem__(self, index, data):
- if self._maskandscale:
- data = _maskandscale(self,data)
- # Expand data for record vars?
- if self.isrec:
- if isinstance(index, tuple):
- rec_index = index[0]
- else:
- rec_index = index
- if isinstance(rec_index, slice):
- recs = (rec_index.start or 0) + len(data)
- else:
- recs = rec_index + 1
- if recs > len(self.data):
- shape = (recs,) + self._shape[1:]
- self.data.resize(shape)
- self.data[index] = data
-
-
-NetCDFFile = netcdf_file
-NetCDFVariable = netcdf_variable
-
-# default _FillValue for netcdf types (apply also to corresponding
-# DAP types).
-_default_fillvals = {'c':'\0',
- 'S':"",
- 'b':-127,
- 'B':-127,
- 'h':-32767,
- 'H':65535,
- 'i':-2147483647L,
- 'L':4294967295L,
- 'q':-2147483647L,
- 'f':9.9692099683868690e+36,
- 'd':9.9692099683868690e+36}
-def _unmaskandscale(var,data):
- # if _maskandscale mode set to True, perform
- # automatic unpacking using scale_factor/add_offset
- # and automatic conversion to masked array using
- # missing_value/_Fill_Value.
- totalmask = zeros(data.shape, bool)
- fill_value = None
- if hasattr(var, 'missing_value') and (data == var.missing_value).any():
- mask=data==var.missing_value
- fill_value = var.missing_value
- totalmask += mask
- if hasattr(var, '_FillValue') and (data == var._FillValue).any():
- mask=data==var._FillValue
- if fill_value is None:
- fill_value = var._FillValue
- totalmask += mask
- else:
- fillval = _default_fillvals[var.typecode()]
- if (data == fillval).any():
- mask=data==fillval
- if fill_value is None:
- fill_value = fillval
- totalmask += mask
- # all values where data == missing_value or _FillValue are
- # masked. fill_value set to missing_value if it exists,
- # otherwise _FillValue.
- if fill_value is not None:
- data = ma.masked_array(data,mask=totalmask,fill_value=fill_value)
- # if variable has scale_factor and add_offset attributes, rescale.
- if hasattr(var, 'scale_factor') and hasattr(var, 'add_offset'):
- data = var.scale_factor*data + var.add_offset
- return data
-
-def _maskandscale(var,data):
- # if _maskandscale mode set to True, perform
- # automatic packing using scale_factor/add_offset
- # and automatic filling of masked arrays using
- # missing_value/_Fill_Value.
- # use missing_value as fill value.
- # if no missing value set, use _FillValue.
- if hasattr(data,'mask'):
- if hasattr(var, 'missing_value'):
- fillval = var.missing_value
- elif hasattr(var, '_FillValue'):
- fillval = var._FillValue
- else:
- fillval = _default_fillvals[var.typecode()]
- data = data.filled(fill_value=fillval)
- # pack using scale_factor and add_offset.
- if hasattr(var, 'scale_factor') and hasattr(var, 'add_offset'):
- data = (data - var.add_offset)/var.scale_factor
- return data
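For context, the writer side of the API deleted above was used roughly as follows. This is a minimal sketch based only on names visible in this diff and in the NetCDFFile_tst.py example removed in the next message; the file, dimension and variable names are illustrative.

    # Sketch, assuming a basemap release that still bundled this reader/writer.
    from mpl_toolkits.basemap import NetCDFFile
    import numpy as np

    f = NetCDFFile('example.nc', 'w')     # 'example.nc' is a made-up name
    f.createDimension('time', None)       # None marks the unlimited (record) dimension
    f.createDimension('x', 3)
    v = f.createVariable('temp', 'f8', ('time', 'x'))
    v.units = 'K'                         # plain attribute assignment becomes a NetCDF attribute
    v[0:2] = np.arange(6.).reshape(2, 3)  # record variables grow as records are assigned
    f.sync()                              # sync is an alias for flush (see above)
    f.close()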
From: <js...@us...> - 2011年02月12日 14:01:41
Revision: 8979
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8979&view=rev
Author: jswhit
Date: 2011年02月12日 14:01:34 +0000 (2011年2月12日)
Log Message:
-----------
use netcdf4-python instead of built-in NetCDFFile.
Modified Paths:
--------------
 trunk/toolkits/basemap/MANIFEST.in
 trunk/toolkits/basemap/examples/ccsm_popgrid.py
 trunk/toolkits/basemap/examples/fcstmaps.py
 trunk/toolkits/basemap/examples/fcstmaps_axesgrid.py
 trunk/toolkits/basemap/examples/lic_demo.py
 trunk/toolkits/basemap/examples/plothighsandlows.py
 trunk/toolkits/basemap/examples/ploticos.py
 trunk/toolkits/basemap/examples/plotprecip.py
 trunk/toolkits/basemap/examples/plotsst.py
 trunk/toolkits/basemap/examples/pnganim.py
Removed Paths:
-------------
 trunk/toolkits/basemap/examples/NetCDFFile_tst.py
Modified: trunk/toolkits/basemap/MANIFEST.in
===================================================================
--- trunk/toolkits/basemap/MANIFEST.in	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/MANIFEST.in	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -80,7 +80,6 @@
 include examples/C02562.orog.nc
 include examples/ccsm_popgrid.nc
 include examples/rita.nc
-include examples/NetCDFFile_tst.py
 include examples/maskoceans.py
 include examples/README
 include lib/mpl_toolkits/__init__.py
Deleted: trunk/toolkits/basemap/examples/NetCDFFile_tst.py
===================================================================
--- trunk/toolkits/basemap/examples/NetCDFFile_tst.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/NetCDFFile_tst.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -1,70 +0,0 @@
-import sys
-import unittest
-import os
-import tempfile
-from numpy import ma
-from numpy.testing import assert_array_equal, assert_array_almost_equal
-from numpy.random.mtrand import uniform 
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
-
-# test automatic conversion of masked arrays, and
-# packing/unpacking of short ints.
-
-FILE_NAME = tempfile.mktemp(".nc")
-ndim = 10
-ranarr = 100.*uniform(size=(ndim))
-packeddata = 10.*uniform(size=(ndim))
-missing_value = -9999.
-ranarr[::2] = missing_value
-maskedarr = ma.masked_values(ranarr,-9999.)
-scale_factor = (packeddata.max()-packeddata.min())/(2.*32766.)
-add_offset = 0.5*(packeddata.max()+packeddata.min())
-packeddata2 = ((packeddata-add_offset)/scale_factor).astype('i2')
-
-class TestCase(unittest.TestCase):
-
- def setUp(self):
- self.file = FILE_NAME
- file = NetCDFFile(self.file,'w')
- file.createDimension('n', None) # use unlimited dim.
- foo = file.createVariable('maskeddata', 'f8', ('n',))
- foo.missing_value = missing_value
- bar = file.createVariable('packeddata', 'i2', ('n',))
- bar.scale_factor = scale_factor
- bar.add_offset = add_offset
- foo[0:ndim] = maskedarr
- bar[0:ndim] = packeddata
- file.close()
-
- def tearDown(self):
- # Remove the temporary files
- os.remove(self.file)
-
- def runTest(self):
- """testing auto-conversion of masked arrays and packed integers""" 
- # no auto-conversion.
- file = NetCDFFile(self.file,maskandscale=False)
- datamasked = file.variables['maskeddata']
- datapacked = file.variables['packeddata']
- if hasattr(datapacked,'set_auto_maskandscale'):
- datapacked.set_auto_maskandscale(False)
- # check missing_value, scale_factor and add_offset attributes.
- assert datamasked.missing_value == missing_value
- assert datapacked.scale_factor == scale_factor
- assert datapacked.add_offset == add_offset
- assert_array_equal(datapacked[:],packeddata2)
- assert_array_almost_equal(datamasked[:],ranarr)
- file.close()
- # auto-conversion
- file = NetCDFFile(self.file)
- datamasked = file.variables['maskeddata']
- datapacked = file.variables['packeddata']
- assert_array_almost_equal(datamasked[:].filled(datamasked.missing_value),ranarr)
- assert_array_almost_equal(datapacked[:],packeddata,decimal=4)
- file.close()
-
-if __name__ == '__main__':
- unittest.main()
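The same round trip, written against the netcdf4-python API that these commits move to, looks roughly like the sketch below. set_auto_maskandscale is the netCDF4 method this test already probes for; the file name is illustrative, and default mask/scale behaviour has varied across netCDF4 versions, so treat this as a sketch rather than a drop-in replacement.

    from netCDF4 import Dataset
    import numpy as np

    nc = Dataset('masked_packed.nc', 'w')            # illustrative file name
    nc.createDimension('n', None)                    # unlimited dimension, as in setUp() above
    foo = nc.createVariable('maskeddata', 'f8', ('n',))
    foo.missing_value = -9999.
    foo[0:4] = np.array([1., -9999., 3., -9999.])    # sentinel stored as an ordinary value
    nc.close()

    nc = Dataset('masked_packed.nc')
    var = nc.variables['maskeddata']
    var.set_auto_maskandscale(True)                  # mask missing_value / apply scale_factor on read
    print(var[:])                                    # masked array, sentinel entries masked
    nc.close()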
Modified: trunk/toolkits/basemap/examples/ccsm_popgrid.py
===================================================================
--- trunk/toolkits/basemap/examples/ccsm_popgrid.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/ccsm_popgrid.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -25,10 +25,7 @@
 import numpy as np
 import matplotlib.pyplot as plt
 from mpl_toolkits.basemap import Basemap
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 
 # read in data from netCDF file.
 infile = 'ccsm_popgrid.nc'
Modified: trunk/toolkits/basemap/examples/fcstmaps.py
===================================================================
--- trunk/toolkits/basemap/examples/fcstmaps.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/fcstmaps.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -6,10 +6,7 @@
 import numpy.ma as ma
 import datetime
 from mpl_toolkits.basemap import Basemap, addcyclic, num2date
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 
 
 # today's date is default.
Modified: trunk/toolkits/basemap/examples/fcstmaps_axesgrid.py
===================================================================
--- trunk/toolkits/basemap/examples/fcstmaps_axesgrid.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/fcstmaps_axesgrid.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -8,10 +8,7 @@
 import datetime
 from mpl_toolkits.basemap import Basemap, addcyclic, num2date
 from mpl_toolkits.axes_grid1 import AxesGrid
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 
 
 # today's date is default.
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -1,10 +1,7 @@
 # example showing how to use Line Integral Convolution to visualize a vector
 # flow field (from Hurricane Earl). Produces something akin to streamlines.
 # Requires vectorplot scikit (http://scikits.appspot.com/vectorplot).
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 from mpl_toolkits.basemap import Basemap
 import numpy as np
 import matplotlib.pyplot as plt
Modified: trunk/toolkits/basemap/examples/plothighsandlows.py
===================================================================
--- trunk/toolkits/basemap/examples/plothighsandlows.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/plothighsandlows.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -7,10 +7,7 @@
 import sys
 from mpl_toolkits.basemap import Basemap, addcyclic
 from scipy.ndimage.filters import minimum_filter, maximum_filter
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 
 def extrema(mat,mode='wrap',window=10):
 """find the indices of local extrema (min and max)
Modified: trunk/toolkits/basemap/examples/ploticos.py
===================================================================
--- trunk/toolkits/basemap/examples/ploticos.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/ploticos.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -2,10 +2,7 @@
 import matplotlib.pyplot as plt
 import numpy as np
 from numpy import ma
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 # read in orography of icosahedral global grid.
 f = NetCDFFile('C02562.orog.nc')
 lons = (180./np.pi)*f.variables['grid_center_lon'][:]
Modified: trunk/toolkits/basemap/examples/plotprecip.py
===================================================================
--- trunk/toolkits/basemap/examples/plotprecip.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/plotprecip.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -1,8 +1,5 @@
 from mpl_toolkits.basemap import Basemap, cm
-try:
- from netCDF4 import Dataset as NetCDFFile
-except ImportError:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 import numpy as np
 import matplotlib.pyplot as plt
 import copy
Modified: trunk/toolkits/basemap/examples/plotsst.py
===================================================================
--- trunk/toolkits/basemap/examples/plotsst.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/plotsst.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -1,8 +1,5 @@
 from mpl_toolkits.basemap import Basemap, date2index, num2date
-try:
- from netCDF4 import Dataset as NetCDFFile
-except:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 import numpy as np
 import matplotlib.pyplot as plt
 import sys, datetime
Modified: trunk/toolkits/basemap/examples/pnganim.py
===================================================================
--- trunk/toolkits/basemap/examples/pnganim.py	2011年02月12日 13:58:04 UTC (rev 8978)
+++ trunk/toolkits/basemap/examples/pnganim.py	2011年02月12日 14:01:34 UTC (rev 8979)
@@ -10,10 +10,7 @@
 import numpy.ma as ma
 import datetime, sys, time, subprocess
 from mpl_toolkits.basemap import Basemap, shiftgrid, date2index, num2date
-try:
- from netCDF4 import Dataset as NetCDFFile
-except:
- from mpl_toolkits.basemap import NetCDFFile
+from netCDF4 import Dataset as NetCDFFile
 
 # times for March 1993 'storm of the century'
 date1 = datetime.datetime(1993,3,10,0)
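Every example script touched by this revision gets the same change: the ImportError fallback to basemap's bundled pure-Python reader is dropped in favour of a hard dependency on netcdf4-python.

    # before: kept working when netcdf4-python was not installed
    try:
        from netCDF4 import Dataset as NetCDFFile
    except ImportError:
        from mpl_toolkits.basemap import NetCDFFile

    # after: netcdf4-python is now required
    from netCDF4 import Dataset as NetCDFFile

With the fallback gone, a missing netCDF4 package now surfaces as an ImportError at startup instead of silently switching to the bundled reader.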
From: <js...@us...> - 2011年02月12日 13:58:10
Revision: 8978
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8978&view=rev
Author: jswhit
Date: 2011年02月12日 13:58:04 +0000 (2011年2月12日)
Log Message:
-----------
httplib2 and dap removed
Modified Paths:
--------------
 trunk/toolkits/basemap/setup.cfg
Modified: trunk/toolkits/basemap/setup.cfg
===================================================================
--- trunk/toolkits/basemap/setup.cfg	2011年02月12日 13:56:44 UTC (rev 8977)
+++ trunk/toolkits/basemap/setup.cfg	2011年02月12日 13:58:04 UTC (rev 8978)
@@ -6,6 +6,4 @@
 # False: do not install
 # auto: install only if the package is unavailable. This
 # is the default behavior
-pydap = auto
-httplib2 = auto
 pyshapelib = auto
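The surviving comment block documents the accepted values for each optional package (True, False, auto). For example, someone who also wanted to skip the one remaining optional dependency could set, in the same section of setup.cfg (the section header sits above this hunk and is not shown):

    pyshapelib = False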
Revision: 8977
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8977&view=rev
Author: jswhit
Date: 2011年02月12日 13:56:44 +0000 (2011年2月12日)
Log Message:
-----------
remove more vestiges of netcdf support
Modified Paths:
--------------
 trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py
Removed Paths:
-------------
 trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdf.py
 trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdftime.py
Modified: trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py
===================================================================
--- trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py	2011年02月12日 13:54:34 UTC (rev 8976)
+++ trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py	2011年02月12日 13:56:44 UTC (rev 8977)
@@ -11,12 +11,6 @@
 :func:`shiftgrid`: shifts global lat/lon grids east or west.
 
 :func:`addcyclic`: Add cyclic (wraparound) point in longitude.
-
-:func:`num2date`: convert from a numeric time value to a datetime object.
-
-:func:`date2num`: convert from a datetime object to a numeric time value.
-
-:func:`date2index`: compute a time variable index corresponding to a date.
 """
 from matplotlib import __version__ as _matplotlib_version
 from matplotlib.cbook import is_scalar, dedent
@@ -38,7 +32,7 @@
 import numpy as np
 import numpy.ma as ma
 from shapelib import ShapeFile
-import _geoslib, netcdftime
+import _geoslib
 
 # basemap data files now installed in lib/matplotlib/toolkits/basemap/data
 # check to see if environment variable BASEMAPDATA set to a directory,
@@ -3974,160 +3968,6 @@
 else:
 return corners
 
-def num2date(times,units='days since 0001-01-01 00:00:00',calendar='proleptic_gregorian'):
- """
- Return datetime objects given numeric time values. The units
- of the numeric time values are described by the ``units`` argument
- and the ``calendar`` keyword. The returned datetime objects represent
- UTC with no time-zone offset, even if the specified
- units contain a time-zone offset.
-
- Default behavior is the same as the matplotlib.dates.num2date function
- but the reference time and calendar can be changed via the
- ``units`` and ``calendar`` keywords.
-
- .. tabularcolumns:: |l|L|
-
- ============== ====================================================
- Arguments Description
- ============== ====================================================
- times numeric time values. Maximum resolution is 1 second.
- ============== ====================================================
-
- .. tabularcolumns:: |l|L|
-
- ============== ====================================================
- Keywords Description
- ============== ====================================================
- units a string of the form '<time units> since
- <reference time>' describing the units and
- origin of the time coordinate.
- <time units> can be days, hours, minutes
- or seconds. <reference time> is the time origin.
- Default is 'days since 0001-01-01 00:00:00'.
- calendar describes the calendar used in the time
- calculations. All the values currently defined in
- the CF metadata convention
- (http://cf-pcmdi.llnl.gov/documents/cf-conventions/)
- are supported.
- Valid calendars ``standard``, ``gregorian``,
- ``proleptic_gregorian``, ``noleap``, ``365_day``,
- ``julian``, ``all_leap``, ``366_day``.
- Default is ``proleptic_gregorian``.
- ============== ====================================================
-
- Returns a datetime instance, or an array of datetime instances.
-
- The datetime instances returned are 'real' python datetime
- objects if the date falls in the Gregorian calendar (i.e.
- calendar=``proleptic_gregorian``, or calendar = ``standard``
- or ``gregorian`` and the date is after 1582-10-15).
- Otherwise, they are 'phony' datetime
- objects which support some but not all the methods of 'real' python
- datetime objects. The datetime instances do not contain
- a time-zone offset, even if the specified units contains one.
- """
- cdftime = netcdftime.utime(units,calendar=calendar)
- return cdftime.num2date(times)
-
-def date2num(dates,units='days since 0001-01-01 00:00:00',calendar='proleptic_gregorian'):
- """
- Return numeric time values given datetime objects. The units
- of the numeric time values are described by the ``units`` argument
- and the ``calendar`` keyword. The datetime objects must
- be in UTC with no time-zone offset. If there is a
- time-zone offset in units, it will be applied to the
- returned numeric values.
-
- Default behavior is the same as the matplotlib.dates.date2num function
- but the reference time and calendar can be changed via the
- ``units`` and ``calendar`` keywords.
-
- .. tabularcolumns:: |l|L|
-
- ============== ====================================================
- Arguments Description
- ============== ====================================================
- dates A datetime object or a sequence of datetime objects.
- The datetime objects should not include a
- time-zone offset.
- ============== ====================================================
-
- .. tabularcolumns:: |l|L|
-
- ============== ====================================================
- Keywords Description
- ============== ====================================================
- units a string of the form '<time units> since
- <reference time>' describing the units and
- origin of the time coordinate.
- <time units> can be days, hours, minutes
- or seconds. <reference time> is the time origin.
- Default is 'days since 0001-01-01 00:00:00'.
- calendar describes the calendar used in the time
- calculations. All the values currently defined in
- the CF metadata convention
- (http://cf-pcmdi.llnl.gov/documents/cf-conventions/)
- are supported.
- Valid calendars ``standard``, ``gregorian``,
- ``proleptic_gregorian``, ``noleap``, ``365_day``,
- ``julian``, ``all_leap``, ``366_day``.
- Default is ``proleptic_gregorian``.
- ============== ====================================================
-
- Returns a numeric time value, or an array of numeric time values.
-
- The maximum resolution of the numeric time values is 1 second.
- """
- cdftime = netcdftime.utime(units,calendar=calendar)
- return cdftime.date2num(dates)
-
-def date2index(dates, nctime, calendar=None,select='exact'):
- """
- Return indices of a netCDF time variable corresponding to the given dates.
-
- .. tabularcolumns:: |l|L|
-
- ============== ====================================================
- Arguments Description
- ============== ====================================================
- dates A datetime object or a sequence of datetime objects.
- The datetime objects should not include a
- time-zone offset.
- nctime A netCDF time variable object. The nctime object
- must have a ``units`` attribute.
- ============== ====================================================
-
- .. tabularcolumns:: |l|L|
-
- ============== ====================================================
- Keywords Description
- ============== ====================================================
- calendar describes the calendar used in the time
- calculations. All the values currently defined in
- the CF metadata convention
- (http://cf-pcmdi.llnl.gov/documents/cf-conventions/)
- are supported.
- Valid calendars ``standard``, ``gregorian``,
- ``proleptic_gregorian``, ``noleap``, ``365_day``,
- ``julian``, ``all_leap``, ``366_day``.
- If ``calendar=None``, will use ``calendar`` attribute
- of ``nctime`` object, and if that attribute does
- not exist calendar is set to ``standard``.
- Default is ``None``.
- select The index selection method. ``exact`` will return the
- indices perfectly matching the dates given.
- ``before`` and ``after`` will return the indices
- corresponding to the dates just before or after
- the given dates if an exact match cannot be found.
- ``nearest`` will return the indices that
- correspond to the closest dates. Default ``exact``.
- ============== ====================================================
-
- Returns an index or a sequence of indices.
- """
- return netcdftime.date2index(dates, nctime, calendar=calendar, select=select)
-
 def maskoceans(lonsin,latsin,datain,inlands=False):
 """
 mask data (``datain``), defined on a grid with latitudes ``latsin``
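The three helpers removed above (num2date, date2num, date2index) were thin wrappers around the bundled netcdftime module deleted below; netcdf4-python exposes functions of the same names with essentially the same signatures, so code that used the basemap versions can switch along the lines of this sketch (argument values are illustrative):

    from netCDF4 import Dataset, num2date, date2num, date2index
    import datetime

    units = 'hours since 2001-01-01 00:00:00'
    d = datetime.datetime(2001, 1, 2, 6)
    t = date2num(d, units, calendar='standard')       # -> 30.0 hours after the origin
    print(num2date(t, units, calendar='standard'))    # -> 2001-01-02 06:00:00

    # date2index needs a netCDF time variable carrying a 'units' attribute, e.g.:
    # nc = Dataset('somefile.nc')                     # 'somefile.nc' is hypothetical
    # i = date2index(d, nc.variables['time'], select='nearest')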
Deleted: trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdf.py
===================================================================
--- trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdf.py	2011年02月12日 13:54:34 UTC (rev 8976)
+++ trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdf.py	2011年02月12日 13:56:44 UTC (rev 8977)
@@ -1,81 +0,0 @@
-from numpy import ma, squeeze
-from pupynere import netcdf_file, _unmaskandscale
-from dap.client import open as open_remote
-from dap.dtypes import ArrayType, GridType, typemap
-
-_typecodes = dict([[_v,_k] for _k,_v in typemap.items()])
-
-class _RemoteFile(object):
- """A NetCDF file reader. API is the same as Scientific.IO.NetCDF."""
-
- def __init__(self, file, maskandscale=False, cache=None,\
- username=None, password=None, verbose=False):
- self._buffer = open_remote(file,cache=cache,\
- username=username,password=password,verbose=verbose)
- self._maskandscale = maskandscale
- self._parse()
-
- def read(self, size=-1):
- """Alias for reading the file buffer."""
- return self._buffer.read(size)
-
- def _parse(self):
- """Initial parsing of the header."""
- # Read header info.
- self._dim_array()
- self._gatt_array()
- self._var_array()
-
- def _dim_array(self):
- """Read a dict with dimensions names and sizes."""
- self.dimensions = {}
- self._dims = []
- for k,d in self._buffer.iteritems():
- if (isinstance(d, ArrayType) or isinstance(d, GridType)) and len(d.shape) == 1 and k == d.dimensions[0]:
- name = k
- length = len(d)
- self.dimensions[name] = length
- self._dims.append(name) # preserve dim order
-
- def _gatt_array(self):
- """Read global attributes."""
- self.__dict__.update(self._buffer.attributes)
-
- def _var_array(self):
- """Read all variables."""
- # Read variables.
- self.variables = {}
- for k,d in self._buffer.iteritems():
- if isinstance(d, GridType) or isinstance(d, ArrayType):
- name = k
- self.variables[name] = _RemoteVariable(d,self._maskandscale)
-
- def close(self):
- # this is a no-op provided for compatibility
- pass
-
-class _RemoteVariable(object):
- def __init__(self, var, maskandscale):
- self._var = var
- self._maskandscale = maskandscale
- self.dtype = var.type
- self.shape = var.shape
- self.dimensions = var.dimensions
- self.__dict__.update(var.attributes)
-
- def __getitem__(self, index):
- datout = squeeze(self._var.__getitem__(index))
- # automatically
- # - remove singleton dimensions
- # - create a masked array using missing_value or _FillValue attribute
- # - apply scale_factor and add_offset to packed integer data
- if self._maskandscale:
- return _unmaskandscale(self,datout)
- else:
- return datout
-
- def __len__(self):
- return self.shape[0]
-
- def typecode(self):
- return _typecodes[self.dtype]
Deleted: trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdftime.py
===================================================================
--- trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdftime.py	2011年02月12日 13:54:34 UTC (rev 8976)
+++ trunk/toolkits/basemap/lib/mpl_toolkits/basemap/netcdftime.py	2011年02月12日 13:56:44 UTC (rev 8977)
@@ -1,1050 +0,0 @@
-"""
-Performs conversions of netCDF time coordinate data to/from datetime objects.
-"""
-import math, numpy, re, time
-from datetime import datetime as real_datetime
-
-_units = ['days','hours','minutes','seconds','day','hour','minute','second']
-_calendars = ['standard','gregorian','proleptic_gregorian','noleap','julian','all_leap','365_day','366_day','360_day']
-
-__version__ = '0.7.1'
-
-class datetime:
- """
-Phony datetime object which mimics the python datetime object,
-but allows for dates that don't exist in the proleptic gregorian calendar.
-Doesn't do timedelta operations, doesn't overload + and -.
-
-Has strftime, timetuple and __repr__ methods. The format
-of the string produced by __repr__ is controlled by self.format
-(default %Y-%m-%d %H:%M:%S).
-
-Instance variables are year,month,day,hour,minute,second,dayofwk,dayofyr
-and format.
- """
- def __init__(self,year,month,day,hour=0,minute=0,second=0,dayofwk=-1,dayofyr=1):
- """dayofyr set to 1 by default - otherwise time.strftime will complain"""
- self.year=year
- self.month=month
- self.day=day
- self.hour=hour
- self.minute=minute
- self.dayofwk=dayofwk
- self.dayofyr=dayofyr
- self.second=second
- self.format='%Y-%m-%d %H:%M:%S'
- def strftime(self,format=None):
- if format is None:
- format = self.format
- return _strftime(self,format)
- def timetuple(self):
- return (self.year,self.month,self.day,self.hour,self.minute,self.second,self.dayofwk,self.dayofyr,-1)
- def __repr__(self):
- return self.strftime(self.format)
-
-def JulianDayFromDate(date,calendar='standard'):
-
- """
-
-creates a Julian Day from a 'datetime-like' object. Returns the fractional
-Julian Day (resolution 1 second).
-
-if calendar='standard' or 'gregorian' (default), Julian day follows Julian
-Calendar on and before 1582-10-5, Gregorian calendar after 1582-10-15.
-
-if calendar='proleptic_gregorian', Julian Day follows gregorian calendar.
-
-if calendar='julian', Julian Day follows julian calendar.
-
-Algorithm:
-
-Meeus, Jean (1998) Astronomical Algorithms (2nd Edition). Willmann-Bell,
-Virginia. p. 63
-
- """
-
- # based on redate.py by David Finlayson.
-
- year=date.year; month=date.month; day=date.day
- hour=date.hour; minute=date.minute; second=date.second
- # Convert time to fractions of a day
- day = day + hour/24.0 + minute/1440.0 + second/86400.0
-
- # Start Meeus algorithm (variables are in his notation)
- if (month < 3):
- month = month + 12
- year = year - 1
-
- A = int(year/100)
-
- jd = int(365.25 * (year + 4716)) + int(30.6001 * (month + 1)) + \
- day - 1524.5
-
- # optionally adjust the jd for the switch from
- # the Julian to Gregorian Calendar
- # here assumed to have occurred the day after 1582 October 4
- if calendar in ['standard','gregorian']:
- if jd >= 2299170.5:
- # 1582 October 15 (Gregorian Calendar)
- B = 2 - A + int(A/4)
- elif jd < 2299160.5:
- # 1582 October 5 (Julian Calendar)
- B = 0
- else:
- raise ValueError, 'impossible date (falls in gap between end of Julian calendar and beginning of Gregorian calendar)'
- elif calendar == 'proleptic_gregorian':
- B = 2 - A + int(A/4)
- elif calendar == 'julian':
- B = 0
- else:
- raise ValueError, 'unknown calendar, must be one of julian,standard,gregorian,proleptic_gregorian, got %s' % calendar
-
- # adjust for Julian calendar if necessary
- jd = jd + B
-
- return jd
-
-def _NoLeapDayFromDate(date):
-
- """
-
-creates a Julian Day for a calendar with no leap years from a datetime
-instance. Returns the fractional Julian Day (resolution 1 second).
-
- """
-
- year=date.year; month=date.month; day=date.day
- hour=date.hour; minute=date.minute; second=date.second
- # Convert time to fractions of a day
- day = day + hour/24.0 + minute/1440.0 + second/86400.0
-
- # Start Meeus algorithm (variables are in his notation)
- if (month < 3):
- month = month + 12
- year = year - 1
-
- jd = int(365. * (year + 4716)) + int(30.6001 * (month + 1)) + \
- day - 1524.5
-
- return jd
-
-def _AllLeapFromDate(date):
-
- """
-
-creates a Julian Day for a calendar where all years have 366 days from
-a 'datetime-like' object.
-Returns the fractional Julian Day (resolution 1 second).
-
- """
-
- year=date.year; month=date.month; day=date.day
- hour=date.hour; minute=date.minute; second=date.second
- # Convert time to fractions of a day
- day = day + hour/24.0 + minute/1440.0 + second/86400.0
-
- # Start Meeus algorithm (variables are in his notation)
- if (month < 3):
- month = month + 12
- year = year - 1
-
- jd = int(366. * (year + 4716)) + int(30.6001 * (month + 1)) + \
- day - 1524.5
-
- return jd
-
-def _360DayFromDate(date):
-
- """
-
-creates a Julian Day for a calendar where all months have 30 days from
-a 'datetime-like' object.
-Returns the fractional Julian Day (resolution 1 second).
-
- """
-
- year=date.year; month=date.month; day=date.day
- hour=date.hour; minute=date.minute; second=date.second
- # Convert time to fractions of a day
- day = day + hour/24.0 + minute/1440.0 + second/86400.0
-
- jd = int(360. * (year + 4716)) + int(30. * (month - 1)) + day
-
- return jd
-
-def DateFromJulianDay(JD,calendar='standard'):
- """
-
-returns a 'datetime-like' object given Julian Day. Julian Day is a
-fractional day with a resolution of 1 second.
-
-if calendar='standard' or 'gregorian' (default), Julian day follows Julian
-Calendar on and before 1582-10-5, Gregorian calendar after 1582-10-15.
-
-if calendar='proleptic_gregorian', Julian Day follows gregorian calendar.
-
-if calendar='julian', Julian Day follows julian calendar.
-
-The datetime object is a 'real' datetime object if the date falls in
-the Gregorian calendar (i.e. calendar='proleptic_gregorian', or
-calendar = 'standard'/'gregorian' and the date is after 1582-10-15).
-Otherwise, it's a 'phony' datetime object which is actually an instance
-of netcdftime.datetime.
-
-
-Algorithm:
-
-Meeus, Jean (1998) Astronomical Algorithms (2nd Edition). Willmann-Bell,
-Virginia. p. 63
-
- """
-
- # based on redate.py by David Finlayson.
-
- if JD < 0:
- raise ValueError, 'Julian Day must be positive'
-
- dayofwk = int(math.fmod(int(JD + 1.5),7))
- (F, Z) = math.modf(JD + 0.5)
- Z = int(Z)
- if calendar in ['standard','gregorian']:
- if JD < 2299160.5:
- A = Z
- else:
- alpha = int((Z - 1867216.25)/36524.25)
- A = Z + 1 + alpha - int(alpha/4)
-
- elif calendar == 'proleptic_gregorian':
- alpha = int((Z - 1867216.25)/36524.25)
- A = Z + 1 + alpha - int(alpha/4)
- elif calendar == 'julian':
- A = Z
- else:
- raise ValueError, 'unknown calendar, must be one of julian,standard,gregorian,proleptic_gregorian, got %s' % calendar
-
- B = A + 1524
- C = int((B - 122.1)/365.25)
- D = int(365.25 * C)
- E = int((B - D)/30.6001)
-
- # Convert to date
- day = B - D - int(30.6001 * E) + F
- nday = B-D-123
- if nday <= 305:
- dayofyr = nday+60
- else:
- dayofyr = nday-305
- if E < 14:
- month = E - 1
- else:
- month = E - 13
-
- if month > 2:
- year = C - 4716
- else:
- year = C - 4715
-
- # a leap year?
- leap = 0
- if year % 4 == 0:
- leap = 1
- if calendar == 'proleptic_gregorian' or \
- (calendar in ['standard','gregorian'] and JD >= 2299160.5):
- if year % 100 == 0 and year % 400 != 0:
- print year % 100, year % 400
- leap = 0
- if leap and month > 2:
- dayofyr = dayofyr + leap
-
- # Convert fractions of a day to time
- (dfrac, days) = math.modf(day/1.0)
- (hfrac, hours) = math.modf(dfrac * 24.0)
- (mfrac, minutes) = math.modf(hfrac * 60.0)
- seconds = round(mfrac * 60.0) # seconds are rounded
-
- if seconds > 59:
- seconds = 0
- minutes = minutes + 1
- if minutes > 59:
- minutes = 0
- hours = hours + 1
- if hours > 23:
- hours = 0
- days = days + 1
-
- # return a 'real' datetime instance if calendar is gregorian.
- if calendar == 'proleptic_gregorian' or \
- (calendar in ['standard','gregorian'] and JD >= 2299160.5):
- return real_datetime(year,month,int(days),int(hours),int(minutes),int(seconds))
- else:
- # or else, return a 'datetime-like' instance.
- return datetime(year,month,int(days),int(hours),int(minutes),int(seconds),dayofwk,dayofyr)
-
-def _DateFromNoLeapDay(JD):
- """
-
-returns a 'datetime-like' object given Julian Day for a calendar with no leap
-days. Julian Day is a fractional day with a resolution of 1 second.
-
- """
-
- # based on redate.py by David Finlayson.
-
- if JD < 0:
- raise ValueError, 'Julian Day must be positive'
-
- dayofwk = int(math.fmod(int(JD + 1.5),7))
- (F, Z) = math.modf(JD + 0.5)
- Z = int(Z)
- A = Z
- B = A + 1524
- C = int((B - 122.1)/365.)
- D = int(365. * C)
- E = int((B - D)/30.6001)
-
- # Convert to date
- day = B - D - int(30.6001 * E) + F
- nday = B-D-123
- if nday <= 305:
- dayofyr = nday+60
- else:
- dayofyr = nday-305
- if E < 14:
- month = E - 1
- else:
- month = E - 13
-
- if month > 2:
- year = C - 4716
- else:
- year = C - 4715
-
- # Convert fractions of a day to time
- (dfrac, days) = math.modf(day/1.0)
- (hfrac, hours) = math.modf(dfrac * 24.0)
- (mfrac, minutes) = math.modf(hfrac * 60.0)
- seconds = round(mfrac * 60.0) # seconds are rounded
-
- if seconds > 59:
- seconds = 0
- minutes = minutes + 1
- if minutes > 59:
- minutes = 0
- hours = hours + 1
- if hours > 23:
- hours = 0
- days = days + 1
-
- return datetime(year,month,int(days),int(hours),int(minutes),int(seconds), dayofwk, dayofyr)
-
-def _DateFromAllLeap(JD):
- """
-
-returns a 'datetime-like' object given Julian Day for a calendar where all
-years have 366 days.
-Julian Day is a fractional day with a resolution of 1 second.
-
- """
-
- # based on redate.py by David Finlayson.
-
- if JD < 0:
- raise ValueError, 'Julian Day must be positive'
-
- dayofwk = int(math.fmod(int(JD + 1.5),7))
- (F, Z) = math.modf(JD + 0.5)
- Z = int(Z)
- A = Z
- B = A + 1524
- C = int((B - 122.1)/366.)
- D = int(366. * C)
- E = int((B - D)/30.6001)
-
- # Convert to date
- day = B - D - int(30.6001 * E) + F
- nday = B-D-123
- if nday <= 305:
- dayofyr = nday+60
- else:
- dayofyr = nday-305
- if E < 14:
- month = E - 1
- else:
- month = E - 13
- if month > 2:
- dayofyr = dayofyr+1
-
- if month > 2:
- year = C - 4716
- else:
- year = C - 4715
-
- # Convert fractions of a day to time
- (dfrac, days) = math.modf(day/1.0)
- (hfrac, hours) = math.modf(dfrac * 24.0)
- (mfrac, minutes) = math.modf(hfrac * 60.0)
- seconds = round(mfrac * 60.0) # seconds are rounded
-
- if seconds > 59:
- seconds = 0
- minutes = minutes + 1
- if minutes > 59:
- minutes = 0
- hours = hours + 1
- if hours > 23:
- hours = 0
- days = days + 1
-
- return datetime(year,month,int(days),int(hours),int(minutes),int(seconds), dayofwk, dayofyr)
-
-def _DateFrom360Day(JD):
- """
-
-returns a 'datetime-like' object given Julian Day for a calendar where all
-months have 30 days.
-Julian Day is a fractional day with a resolution of 1 second.
-
- """
-
- if JD < 0:
- raise ValueError, 'Julian Day must be positive'
-
- #jd = int(360. * (year + 4716)) + int(30. * (month - 1)) + day
- (F, Z) = math.modf(JD)
- year = int((Z-0.5)/360.) - 4716
- dayofyr = JD - (year+4716)*360
- month = int((dayofyr-0.5)/30)+1
- day = dayofyr - (month-1)*30 + F
-
- # Convert fractions of a day to time
- (dfrac, days) = math.modf(day/1.0)
- (hfrac, hours) = math.modf(dfrac * 24.0)
- (mfrac, minutes) = math.modf(hfrac * 60.0)
- seconds = round(mfrac * 60.0) # seconds are rounded
-
- if seconds > 59:
- seconds = 0
- minutes = minutes + 1
- if minutes > 59:
- minutes = 0
- hours = hours + 1
- if hours > 23:
- hours = 0
- days = days + 1
-
- return datetime(year,month,int(days),int(hours),int(minutes),int(seconds),-1, int(dayofyr))
-
-def _dateparse(timestr):
- """parse a string of the form time-units since yyyy-mm-dd hh:mm:ss
- return a tuple (units, datetimeinstance)"""
- timestr_split = timestr.split()
- units = timestr_split[0].lower()
- if units not in _units:
- raise ValueError,"units must be one of 'seconds', 'minutes', 'hours' or 'days' (or singular version of these), got '%s'" % units
- if timestr_split[1].lower() != 'since':
- raise ValueError,"no 'since' in unit_string"
- # parse the date string.
- n = timestr.find('since')+6
- year,month,day,hour,minute,second,utc_offset = _parse_date(timestr[n:])
- return units, utc_offset, datetime(year, month, day, hour, minute, second)
-
-class utime:
- """
-Performs conversions of netCDF time coordinate
-data to/from datetime objects.
-
-To initialize: C{t = utime(unit_string,calendar='standard')}
-
-where
-
-B{C{unit_string}} is a string of the form
-C{'time-units since <time-origin>'} defining the time units.
-
-Valid time-units are days, hours, minutes and seconds (the singular forms
-are also accepted). An example unit_string would be C{'hours
-since 0001-01-01 00:00:00'}.
-
-The B{C{calendar}} keyword describes the calendar used in the time calculations.
-All the values currently defined in the U{CF metadata convention
-<http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.1/cf-conventions.html#time-coordinate>}
-are accepted. The default is C{'standard'}, which corresponds to the mixed
-Gregorian/Julian calendar used by the C{udunits library}. Valid calendars
-are:
-
-C{'gregorian'} or C{'standard'} (default):
-
-Mixed Gregorian/Julian calendar as defined by udunits.
-
-C{'proleptic_gregorian'}:
-
-A Gregorian calendar extended to dates before 1582-10-15. That is, a year
-is a leap year if either (i) it is divisible by 4 but not by 100 or (ii)
-it is divisible by 400.
-
-C{'noleap'} or C{'365_day'}:
-
-Gregorian calendar without leap years, i.e., all years are 365 days long.
-
-C{'all_leap'} or C{'366_day'}:
-
-Gregorian calendar with every year being a leap year, i.e., all years are
-366 days long.
-
-C{'360_day'}:
-
-All years are 360 days divided into 30 day months.
-
-C{'julian'}:
-
-Proleptic Julian calendar, extended to dates after 1582-10-5. A year is a
-leap year if it is divisible by 4.
-
-The C{L{num2date}} and C{L{date2num}} class methods can used to convert datetime
-instances to/from the specified time units using the specified calendar.
-
-The datetime instances returned by C{num2date} are 'real' python datetime
-objects if the date falls in the Gregorian calendar (i.e.
-C{calendar='proleptic_gregorian', 'standard'} or C{'gregorian'} and
-the date is after 1582-10-15). Otherwise, they are 'phony' datetime
-objects which are actually instances of C{L{netcdftime.datetime}}. This is
-because the python datetime module cannot handle the weird dates in some
-calendars (such as C{'360_day'} and C{'all_leap'}) which don't exist in any real
-world calendar.
-
-
-Example usage:
-
->>> from netcdftime import utime
->>> from datetime import datetime
->>> cdftime = utime('hours since 0001-01-01 00:00:00')
->>> date = datetime.now()
->>> print date
-2006-03-17 16:04:02.561678
->>>
->>> t = cdftime.date2num(date)
->>> print t
-17577328.0672
->>>
->>> date = cdftime.num2date(t)
->>> print date
-2006-03-17 16:04:02
->>>
-
-The resolution of the transformation operation is 1 second.
-
-Warning: Dates between 1582-10-5 and 1582-10-15 do not exist in the
-C{'standard'} or C{'gregorian'} calendars. An exception will be raised if you pass
-a 'datetime-like' object in that range to the C{L{date2num}} class method.
-
-Words of Wisdom from the British MetOffice concerning reference dates:
-
-"udunits implements the mixed Gregorian/Julian calendar system, as
-followed in England, in which dates prior to 1582-10-15 are assumed to use
-the Julian calendar. Other software cannot be relied upon to handle the
-change of calendar in the same way, so for robustness it is recommended
-that the reference date be later than 1582. If earlier dates must be used,
-it should be noted that udunits treats 0 AD as identical to 1 AD."
-
-@ivar origin: datetime instance defining the origin of the netCDF time variable.
-@ivar calendar: the calendar used (as specified by the C{calendar} keyword).
-@ivar unit_string: a string defining the units of the netCDF time variable.
-@ivar units: the units part of C{unit_string} (i.e. 'days', 'hours', 'seconds').
- """
- def __init__(self,unit_string,calendar='standard'):
- """
-@param unit_string: a string of the form
-C{'time-units since <time-origin>'} defining the time units.
-
-Valid time-units are days, hours, minutes and seconds (the singular forms
-are also accepted). An example unit_string would be C{'hours
-since 0001-01-01 00:00:00'}.
-
-@keyword calendar: describes the calendar used in the time calculations.
-All the values currently defined in the U{CF metadata convention
-<http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.1/cf-conventions.html#time-coordinate>}
-are accepted. The default is C{'standard'}, which corresponds to the mixed
-Gregorian/Julian calendar used by the C{udunits library}. Valid calendars
-are:
- - C{'gregorian'} or C{'standard'} (default):
- Mixed Gregorian/Julian calendar as defined by udunits.
- - C{'proleptic_gregorian'}:
- A Gregorian calendar extended to dates before 1582-10-15. That is, a year
- is a leap year if either (i) it is divisible by 4 but not by 100 or (ii)
- it is divisible by 400.
- - C{'noleap'} or C{'365_day'}:
- Gregorian calendar without leap years, i.e., all years are 365 days long.
- - C{'all_leap'} or C{'366_day'}:
- Gregorian calendar with every year being a leap year, i.e.,
- all years are 366 days long.
- -C{'360_day'}:
- All years are 360 days divided into 30 day months.
- -C{'julian'}:
- Proleptic Julian calendar, extended to dates after 1582-10-5. A year is a
- leap year if it is divisible by 4.
-
-@returns: A class instance which may be used for converting times from netCDF
-units to datetime objects.
- """
- if calendar in _calendars:
- self.calendar = calendar
- else:
- raise ValueError, "calendar must be one of %s, got '%s'" % (str(_calendars),calendar)
- units, tzoffset, self.origin = _dateparse(unit_string)
- self.tzoffset = tzoffset # time zone offset in minutes
- self.units = units
- self.unit_string = unit_string
- if self.calendar in ['noleap','365_day'] and self.origin.month == 2 and self.origin.day == 29:
- raise ValueError, 'cannot specify a leap day as the reference time with the noleap calendar'
- if self.calendar == '360_day' and self.origin.day > 30:
- raise ValueError, 'there are only 30 days in every month with the 360_day calendar'
- if self.calendar in ['noleap','365_day']:
- self._jd0 = _NoLeapDayFromDate(self.origin)
- elif self.calendar in ['all_leap','366_day']:
- self._jd0 = _AllLeapFromDate(self.origin)
- elif self.calendar == '360_day':
- self._jd0 = _360DayFromDate(self.origin)
- else:
- self._jd0 = JulianDayFromDate(self.origin,calendar=self.calendar)
-
- def date2num(self,date):
- """
-Returns C{time_value} in units described by L{unit_string}, using
-the specified L{calendar}, given a 'datetime-like' object.
-
-The datetime object must represent UTC with no time-zone offset.
-If there is a time-zone offset implied by L{unit_string}, it will
-be applied to the returned numeric values.
-
-Resolution is 1 second.
-
-If C{calendar = 'standard'} or C{'gregorian'} (indicating
-that the mixed Julian/Gregorian calendar is to be used), an
-exception will be raised if the 'datetime-like' object describes
-a date between 1582-10-5 and 1582-10-15.
-
-Works for scalars, sequences and numpy arrays.
-Returns a scalar if input is a scalar, else returns a numpy array.
- """
- isscalar = False
- try:
- date[0]
- except:
- isscalar = True
- if not isscalar:
- date = numpy.array(date)
- shape = date.shape
- if self.calendar in ['julian','standard','gregorian','proleptic_gregorian']:
- if isscalar:
- jdelta = JulianDayFromDate(date,self.calendar)-self._jd0
- else:
- jdelta = [JulianDayFromDate(d,self.calendar)-self._jd0 for d in date.flat]
- elif self.calendar in ['noleap','365_day']:
- if date.month == 2 and date.day == 29:
- raise ValueError, 'there is no leap day in the noleap calendar'
- if isscalar:
- jdelta = _NoLeapDayFromDate(date) - self._jd0
- else:
- jdelta = [_NoLeapDayFromDate(d)-self._jd0 for d in date.flat]
- elif self.calendar in ['all_leap','366_day']:
- if isscalar:
- jdelta = _AllLeapFromDate(date) - self._jd0
- else:
- jdelta = [_AllLeapFromDate(d)-self._jd0 for d in date.flat]
- elif self.calendar == '360_day':
- if self.calendar == '360_day' and date.day > 30:
- raise ValueError, 'there are only 30 days in every month with the 360_day calendar'
- if isscalar:
- jdelta = _360DayFromDate(date) - self._jd0
- else:
- jdelta = [_360DayFromDate(d)-self._jd0 for d in date.flat]
- if not isscalar:
- jdelta = numpy.array(jdelta)
- # convert to desired units, add time zone offset.
- if self.units in ['second','seconds']:
- jdelta = jdelta*86400. + self.tzoffset*60.
- elif self.units in ['minute','minutes']:
- jdelta = jdelta*1440. + self.tzoffset
- elif self.units in ['hour','hours']:
- jdelta = jdelta*24. + self.tzoffset/60.
- elif self.units in ['day','days']:
- jdelta = jdelta + self.tzoffset/1440.
- if isscalar:
- return jdelta
- else:
- return numpy.reshape(jdelta,shape)
-
- def num2date(self,time_value):
- """
-Return a 'datetime-like' object given a C{time_value} in units
-described by L{unit_string}, using L{calendar}.
-
-dates are in UTC with no offset, even if L{unit_string} contains
-a time zone offset from UTC.
-
-Resolution is 1 second.
-
-Works for scalars, sequences and numpy arrays.
-Returns a scalar if input is a scalar, else returns a numpy array.
-
-The datetime instances returned by C{num2date} are 'real' python datetime
-objects if the date falls in the Gregorian calendar (i.e.
-C{calendar='proleptic_gregorian'}, or C{calendar = 'standard'/'gregorian'} and
-the date is after 1582-10-15). Otherwise, they are 'phony' datetime
-objects which are actually instances of netcdftime.datetime. This is
-because the python datetime module cannot handle the weird dates in some
-calendars (such as C{'360_day'} and C{'all_leap'}) which
-do not exist in any real world calendar.
- """
- isscalar = False
- try:
- time_value[0]
- except:
- isscalar = True
- if not isscalar:
- time_value = numpy.array(time_value, dtype='d')
- shape = time_value.shape
- # convert to desired units, remove time zone offset.
- if self.units in ['second','seconds']:
- jdelta = time_value/86400. - self.tzoffset/1440.
- elif self.units in ['minute','minutes']:
- jdelta = time_value/1440. - self.tzoffset/1440.
- elif self.units in ['hour','hours']:
- jdelta = time_value/24. - self.tzoffset/1440.
- elif self.units in ['day','days']:
- jdelta = time_value - self.tzoffset/1440.
- jd = self._jd0 + jdelta
- if self.calendar in ['julian','standard','gregorian','proleptic_gregorian']:
- if not isscalar:
- date = [DateFromJulianDay(j,self.calendar) for j in jd.flat]
- else:
- date = DateFromJulianDay(jd,self.calendar)
- elif self.calendar in ['noleap','365_day']:
- if not isscalar:
- date = [_DateFromNoLeapDay(j) for j in jd.flat]
- else:
- date = _DateFromNoLeapDay(jd)
- elif self.calendar in ['all_leap','366_day']:
- if not isscalar:
- date = [_DateFromAllLeap(j) for j in jd.flat]
- else:
- date = _DateFromAllLeap(jd)
- elif self.calendar == '360_day':
- if not isscalar:
- date = [_DateFrom360Day(j) for j in jd.flat]
- else:
- date = _DateFrom360Day(jd)
- if isscalar:
- return date
- else:
- return numpy.reshape(numpy.array(date),shape)
-
-def _parse_date(origin):
- """Parses a date string and returns a tuple
- (year,month,day,hour,minute,second,utc_offset).
- utc_offset is in minutes.
-
- This function parses the 'origin' part of the time unit. It should be
- something like::
-
- 2004-11-03 14:42:27.0 +2:00
-
- Lots of things are optional; just the date is mandatory.
-
- by Roberto De Almeida
-
- excerpted from coards.py - http://cheeseshop.python.org/pypi/coards/
- """
- # yyyy-mm-dd [hh:mm:ss[.s][ [+-]hh[:][mm]]]
- p = re.compile( r'''(?P<year>\d{1,4}) # yyyy
- - #
- (?P<month>\d{1,2}) # mm or m
- - #
- (?P<day>\d{1,2}) # dd or d
- #
- (?: # [optional time and timezone]
- \s #
- (?P<hour>\d{1,2}) # hh or h
- : #
- (?P<min>\d{1,2}) # mm or m
- (?:
- \:
- (?P<sec>\d{1,2}) # ss or s (optional)
- )?
- #
- (?: # [optional decisecond]
- \. # .
- (?P<dsec>\d) # s
- )? #
- (?: # [optional timezone]
- \s #
- (?P<ho>[+-]? # [+ or -]
- \d{1,2}) # hh or h
- :? # [:]
- (?P<mo>\d{2})? # [mm]
- )? #
- )? #
- $ # EOL
- ''', re.VERBOSE)
-
- m = p.match(origin.strip())
- if m:
- c = m.groupdict(0)
- # UTC offset.
- offset = int(c['ho'])*60 + int(c['mo'])
- return int(c['year']),int(c['month']),int(c['day']),int(c['hour']),int(c['min']),int(c['sec']),offset
-
- raise Exception('Invalid date origin: %s' % origin)
-
-# remove the unsupported "%s" command. But don't
-# do it if there's an even number of %s before the s
-# because those are all escaped. Can't simply
-# remove the s because the result of
-# %sY
-# should be %Y if %s isn't supported, not the
-# 4 digit year.
-_illegal_s = re.compile(r"((^|[^%])(%%)*%s)")
-
-def _findall(text, substr):
- # Also finds overlaps
- sites = []
- i = 0
- while 1:
- j = text.find(substr, i)
- if j == -1:
- break
- sites.append(j)
- i=j+1
- return sites
-
-# Every 28 years the calendar repeats, except through century leap
-# years where it's 6 years. But only if you're using the Gregorian
-# calendar. ;)
-
-def _strftime(dt, fmt):
- if _illegal_s.search(fmt):
- raise TypeError("This strftime implementation does not handle %s")
- # don't use strftime method at all.
- #if dt.year > 1900:
- # return dt.strftime(fmt)
-
- year = dt.year
- # For every non-leap year century, advance by
- # 6 years to get into the 28-year repeat cycle
- delta = 2000 - year
- off = 6*(delta // 100 + delta // 400)
- year = year + off
-
- # Move to around the year 2000
- year = year + ((2000 - year)//28)*28
- timetuple = dt.timetuple()
- s1 = time.strftime(fmt, (year,) + timetuple[1:])
- sites1 = _findall(s1, str(year))
-
- s2 = time.strftime(fmt, (year+28,) + timetuple[1:])
- sites2 = _findall(s2, str(year+28))
-
- sites = []
- for site in sites1:
- if site in sites2:
- sites.append(site)
-
- s = s1
- syear = "%4d" % (dt.year,)
- for site in sites:
- s = s[:site] + syear + s[site+4:]
- return s
-
-def date2num(dates,units,calendar='standard'):
- """
-date2num(dates,units,calendar='standard')
-
-Return numeric time values given datetime objects. The units
-of the numeric time values are described by the L{units} argument
-and the L{calendar} keyword. The datetime objects must
-be in UTC with no time-zone offset. If there is a
-time-zone offset in C{units}, it will be applied to the
-returned numeric values.
-
-Like the matplotlib C{date2num} function, except that it allows
-for different units and calendars. Behaves the same if
-C{units = 'days since 0001年01月01日 00:00:00'} and
-C{calendar = 'proleptic_gregorian'}.
-
-@param dates: A datetime object or a sequence of datetime objects.
- The datetime objects should not include a time-zone offset.
-
-@param units: a string of the form C{'B{time units} since B{reference time}}'
- describing the time units. B{C{time units}} can be days, hours, minutes
- or seconds. B{C{reference time}} is the time origin. A valid choice
- would be units=C{'hours since 1800年01月01日 00:00:00 -6:00'}.
-
-@param calendar: describes the calendar used in the time calculations.
- All the values currently defined in the U{CF metadata convention
- <http://cf-pcmdi.llnl.gov/documents/cf-conventions/>} are supported.
- Valid calendars are C{'standard', 'gregorian', 'proleptic_gregorian',
- 'noleap', '365_day', '360_day', 'julian', 'all_leap', '366_day'}.
- Default is C{'standard'}, which is a mixed Julian/Gregorian calendar.
-
-@return: a numeric time value, or an array of numeric time values.
-
-The maximum resolution of the numeric time values is 1 second.
- """
- cdftime = utime(units,calendar=calendar)
- return cdftime.date2num(dates)
-
-def num2date(times,units,calendar='standard'):
- """
-num2date(times,units,calendar='standard')
-
-Return datetime objects given numeric time values. The units
-of the numeric time values are described by the C{units} argument
-and the C{calendar} keyword. The returned datetime objects represent
-UTC with no time-zone offset, even if the specified
-C{units} contain a time-zone offset.
-
-Like the matplotlib C{num2date} function, except that it allows
-for different units and calendars. Behaves the same if
-C{units = 'days since 0001年01月01日 00:00:00'} and
-C{calendar = 'proleptic_gregorian'}.
-
-@param times: numeric time values. Maximum resolution is 1 second.
-
-@param units: a string of the form C{'B{time units} since B{reference time}}'
-describing the time units. B{C{time units}} can be days, hours, minutes
-or seconds. B{C{reference time}} is the time origin. A valid choice
-would be units=C{'hours since 1800年01月01日 00:00:00 -6:00'}.
-
-@param calendar: describes the calendar used in the time calculations.
-All the values currently defined in the U{CF metadata convention
-<http://cf-pcmdi.llnl.gov/documents/cf-conventions/>} are supported.
-Valid calendars are C{'standard', 'gregorian', 'proleptic_gregorian',
-'noleap', '365_day', '360_day', 'julian', 'all_leap', '366_day'}.
-Default is C{'standard'}, which is a mixed Julian/Gregorian calendar.
-
-@return: a datetime instance, or an array of datetime instances.
-
-The datetime instances returned are 'real' python datetime
-objects if the date falls in the Gregorian calendar (i.e.
-C{calendar='proleptic_gregorian'}, or C{calendar = 'standard'} or C{'gregorian'}
-and the date is after 1582年10月15日). Otherwise, they are 'phony' datetime
-objects which support some but not all the methods of 'real' python
-datetime objects. This is because the python datetime module uses
-the C{'proleptic_gregorian'} calendar, even before the switch from
-the Julian calendar occurred in 1582. The datetime instances
-do not contain a time-zone offset, even if the specified C{units}
-contains one.
- """
- cdftime = utime(units,calendar=calendar)
- return cdftime.num2date(times)
-
-def _check_index(indices, dates, nctime, calendar):
- """Return True if the time indices given correspond to the given dates,
- False otherwise.
-
- Parameters:
-
- indices : sequence of integers
- Positive integers indexing the time variable.
-
- dates : sequence of datetime objects
- Reference dates.
-
- nctime : netCDF Variable object
- NetCDF time object.
-
- calendar : string
- Calendar of nctime.
- """
- if (indices <0).any():
- return False
-
- if (indices >= nctime.shape[0]).any():
- return False
-
-# t = nctime[indices]
-# fancy indexing not available, fall back on this.
- t=[]
- for ind in indices:
- t.append(nctime[ind])
- return numpy.all( num2date(t, nctime.units, calendar) == dates)
-
-
-def date2index(dates, nctime, calendar=None, select='exact'):
- """
- date2index(dates, nctime, calendar=None, select='exact')
-
- Return indices of a netCDF time variable corresponding to the given dates.
-
- @param dates: A datetime object or a sequence of datetime objects.
- The datetime objects should not include a time-zone offset.
-
- @param nctime: A netCDF time variable object. The nctime object must have a
- C{units} attribute. The entries are assumed to be stored in increasing
- order.
-
- @param calendar: Describes the calendar used in the time calculation.
- Valid calendars are C{'standard', 'gregorian', 'proleptic_gregorian',
- 'noleap', '365_day', '360_day', 'julian', 'all_leap', '366_day'}.
- Default is C{'standard'}, which is a mixed Julian/Gregorian calendar
- If C{calendar} is None, its value is given by C{nctime.calendar} or
- C{standard} if no such attribute exists.
-
- @param select: C{'exact', 'before', 'after', 'nearest'}
- The index selection method. C{exact} will return the indices perfectly
- matching the dates given. C{before} and C{after} will return the indices
- corresponding to the dates just before or just after the given dates if
- an exact match cannot be found. C{nearest} will return the indices that
- correspond to the closest dates.
- """
- # Setting the calendar.
-
- if calendar == None:
- calendar = getattr(nctime, 'calendar', 'standard')
-
- num = numpy.atleast_1d(date2num(dates, nctime.units, calendar))
-
- # Trying to infer the correct index from the starting time and the stride.
- # This assumes that the times are increasing uniformly.
- t0, t1 = nctime[:2]
- dt = t1 - t0
- index = numpy.array((num-t0)/dt, int)
-
- # Checking that the index really corresponds to the given date.
- # If the times do not correspond, then it means that the times
- # are not increasing uniformly and we try the bisection method.
- if not _check_index(index, dates, nctime, calendar):
-
- # Use the bisection method. Assumes the dates are ordered.
- import bisect
-
- index = numpy.array([bisect.bisect_left(nctime, n) for n in num], int)
-
- # Find the dates for which the match is not perfect.
- # Use list comprehension instead of the simpler `nctime[index]` since
- # not all time objects support numpy integer indexing (eg dap).
- ncnum = numpy.squeeze([nctime[i] for i in index])
- mismatch = numpy.nonzero(ncnum != num)[0]
-
- if select == 'exact':
- if len(mismatch) > 0:
- raise ValueError, 'Some dates not found.'
-
- elif select == 'before':
- index[mismatch] -= 1
-
- elif select == 'after':
- pass
-
- elif select == 'nearest':
- nearest_to_left = num[mismatch] < numpy.array( [nctime[i-1] + nctime[i] for i in index[mismatch]]) / 2.
- index[mismatch] = index[mismatch] - 1 * nearest_to_left
-
- else:
- raise ValueError("%s is not an option for the `select` argument."%select)
-
- # convert numpy scalars or single element arrays to python ints.
- return _toscalar(index)
-
-
-def _toscalar(a):
- if a.shape in [(),(1,)]:
- return a.item()
- else:
- return a
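For the 'proleptic_gregorian' case, the units/offset arithmetic described in the date2num docstring above can be sketched with the standard library alone, since Python's datetime already uses that calendar. This is only an illustration of the semantics, not the removed implementation; the units string and names below are made up for the example.

 from datetime import datetime, timedelta

 def sketch_date2num(date, units='hours since 1800年01月01日 00:00:00 -6:00'):
 # numeric value = elapsed time since the reference instant, in the
 # requested units, with any time-zone offset in the units string
 # shifting the reference to UTC first.
 unit, _, origin = units.partition(' since ')
 fields = origin.split()
 ref = datetime.strptime(' '.join(fields[:2]), '%Y-%m-%d %H:%M:%S')
 if len(fields) == 3: # optional UTC offset such as -6:00
 sign = -1 if fields[2].startswith('-') else 1
 hours, minutes = fields[2].lstrip('+-').split(':')
 ref -= sign * timedelta(hours=int(hours), minutes=int(minutes))
 seconds = (date - ref).total_seconds()
 per_unit = {'seconds': 1., 'minutes': 60., 'hours': 3600., 'days': 86400.}
 return seconds / per_unit[unit]

 print(sketch_date2num(datetime(1800, 1, 2))) # 18.0 hours after the UTC origin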
From: <js...@us...> - 2011年02月12日 13:54:40
Revision: 8976
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8976&view=rev
Author: jswhit
Date: 2011年02月12日 13:54:34 +0000 (2011年2月12日)
Log Message:
-----------
no longer needed since NetCDFFile has been removed
Removed Paths:
-------------
 trunk/toolkits/basemap/lib/dap/
 trunk/toolkits/basemap/lib/httplib2/
From: <js...@us...> - 2011年02月12日 13:53:36
Revision: 8975
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8975&view=rev
Author: jswhit
Date: 2011年02月12日 13:53:30 +0000 (2011年2月12日)
Log Message:
-----------
remove deprecated NetCDFFile
Modified Paths:
--------------
 trunk/toolkits/basemap/Changelog
 trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py
 trunk/toolkits/basemap/setup.py
Modified: trunk/toolkits/basemap/Changelog
===================================================================
--- trunk/toolkits/basemap/Changelog	2011年02月12日 04:35:09 UTC (rev 8974)
+++ trunk/toolkits/basemap/Changelog	2011年02月12日 13:53:30 UTC (rev 8975)
@@ -1,3 +1,7 @@
+version 1.0.2 (not yet released)
+ * added lic_demo.py to examples (line integral convolution,
+ requires scikit.vectorplot).
+ * removed deprecated NetCDFFile.
 version 1.0.1 (svn revision 8967)
 * regenerated C source with cython 0.14.1.
 * added new "allsky" example from Tom Loredo.
Modified: trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py
===================================================================
--- trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py	2011年02月12日 04:35:09 UTC (rev 8974)
+++ trunk/toolkits/basemap/lib/mpl_toolkits/basemap/__init__.py	2011年02月12日 13:53:30 UTC (rev 8975)
@@ -4,10 +4,6 @@
 Contains the :class:`Basemap` class (which does most of the
 heavy lifting), and the following functions:
 
-:func:`NetCDFFile`: Read local and remote NetCDF datasets. Deprecated
-as of version 1.0.1 (will be removed in 1.0.2). Use netcdf4-python
-(http://netcdf4-python.googlecode.com) module instead.
-
 :func:`interp`: bilinear interpolation between rectilinear grids.
 
 :func:`maskoceans`: mask 'wet' points of an input array.
@@ -54,7 +50,7 @@
 else:
 basemap_datadir = os.sep.join([os.path.dirname(__file__), 'data'])
 
-__version__ = '1.0.1'
+__version__ = '1.0.2'
 
 # supported map projections.
 _projnames = {'cyl' : 'Cylindrical Equidistant',
@@ -3978,52 +3974,6 @@
 else:
 return corners
 
-def NetCDFFile(file, mode='r', maskandscale=True, cache=None, mmap=True,\
- username=None, password=None, verbose=False):
- """NetCDF File reader/writer. API is the same as Scientific.IO.NetCDF.
-
- If ``file`` is a URL that starts with `http`, it is assumed
- to be a remote OPenDAP dataset, and pydap is used
- to retrieve the data. Only the OPenDAP Array and Grid data
- types are recognized. If file does not start with `http`, it
- is assumed to be a local netCDF file and is read
- with scipy.io.netcdf. Both pydap and scipy.io.netcdf are written
- by Roberto De Almeida.
-
- Data will
- automatically be converted to and from masked arrays if the variable
- has either a ``missing_value`` or ``_FillValue`` attribute, and
- some data points are equal to the value specified by that attribute.
- In addition, variables that have the ``scale_factor`` and ``add_offset``
- attribute will automatically be converted to and from short integers.
- To suppress these automatic conversions, set the ``maskandscale``
- keyword to False.
-
- The keywords ``cache``, ``username``, ``password`` and ``verbose`` are only
- valid for remote OPenDAP datasets. ``username`` and ``password`` are used
- to access OPenDAP datasets that require authentication. ``verbose=True``
- will make the pydap client print out the URLs being accessed.
- ``cache`` is a location (a directory) for caching data, so that repeated
- accesses to the same URL avoid the network.
-
- The keyword ``mmap`` is only valid for local netCDF files. When
- ``mmap=True`` (default), the mmap module is used to access the data.
- This may be slow for very large netCDF variables.
- """
- import netcdf
- import warnings
- msg=dedent("""
- 
- NetCDFFile will be removed in 1.0.2, please use netcdf4-python 
- (http://netcdf4-python.googlecode.com) instead
- """)
- warnings.warn(msg,DeprecationWarning)
- if file.startswith('http'):
- return netcdf._RemoteFile(file,maskandscale=maskandscale,\
- cache=cache,username=username,password=password,verbose=verbose)
- else:
- return netcdf.netcdf_file(file,mode=mode,mmap=mmap,maskandscale=maskandscale)
-
 def num2date(times,units='days since 0001年01月01日 00:00:00',calendar='proleptic_gregorian'):
 """
 Return datetime objects given numeric time values. The units
Modified: trunk/toolkits/basemap/setup.py
===================================================================
--- trunk/toolkits/basemap/setup.py	2011年02月12日 04:35:09 UTC (rev 8974)
+++ trunk/toolkits/basemap/setup.py	2011年02月12日 13:53:30 UTC (rev 8975)
@@ -241,7 +241,7 @@
 
 setup(
 name = "basemap",
- version = "1.0.1",
+ version = "1.0.2",
 description = "Plot data on map projections with matplotlib",
 long_description = """
 An add-on toolkit for matplotlib that lets you plot data
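With the deprecated NetCDFFile wrapper gone, the replacement is netcdf4-python, as the removed docstring itself advised. A minimal sketch of the migration, following the same pattern the lic_demo example elsewhere in this archive already uses (the file and variable names are placeholders taken from that example):

 from netCDF4 import Dataset

 nc = Dataset('rita.nc') # local netCDF file, read-only by default
 u = nc.variables['sfc_u'][0, :, :] # slicing returns a numpy (possibly masked) array
 nc.close()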
From: <js...@us...> - 2011年02月12日 04:35:17
Revision: 8974
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8974&view=rev
Author: jswhit
Date: 2011年02月12日 04:35:09 +0000 (2011年2月12日)
Log Message:
-----------
fix import
Modified Paths:
--------------
 trunk/toolkits/basemap/examples/lic_demo.py
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 21:04:16 UTC (rev 8973)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011年02月12日 04:35:09 UTC (rev 8974)
@@ -9,7 +9,7 @@
 import numpy as np
 import matplotlib.pyplot as plt
 try:
- from vectorplot import lic_internal
+ from scikits.vectorplot import lic_internal
 except ImportError:
 raise ImportError('need vectorplot scikit for this example')
 
From: <js...@us...> - 2011年02月11日 21:04:22
Revision: 8973
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8973&view=rev
Author: jswhit
Date: 2011年02月11日 21:04:16 +0000 (2011年2月11日)
Log Message:
-----------
no regridding
Modified Paths:
--------------
 trunk/toolkits/basemap/examples/lic_demo.py
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 20:52:28 UTC (rev 8972)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 21:04:16 UTC (rev 8973)
@@ -15,27 +15,26 @@
 
 # H*wind data from http://www.aoml.noaa.gov/hrd/data_sub/wind.html
 ncfile = NetCDFFile('rita.nc')
-u = ncfile.variables['sfc_u'][0,:,:]
-v = ncfile.variables['sfc_v'][0,:,:]
+udat = ncfile.variables['sfc_u'][0,:,:]
+vdat = ncfile.variables['sfc_v'][0,:,:]
 lons1 = ncfile.variables['longitude'][:]
 lats1 = ncfile.variables['latitude'][:]
 lat0 = lats1[len(lats1)/2]; lon0 = lons1[len(lons1)/2]
-print lat0,lon0
 lons, lats = np.meshgrid(lons1,lats1)
 ncfile.close()
 
+
 fig = plt.figure(figsize=(8,8))
-m = Basemap(projection='stere',lat_0=lat0,lon_0=lon0,width=1.e6,height=1.e6,resolution='i')
-nxv = 301; nyv = 301; kernellen=41
-udat, vdat, xv, yv = m.transform_vector(u,v,lons1,lats1,nxv,nyv,returnxy=True)
+m = Basemap(projection='cyl',llcrnrlat=lats1[0],llcrnrlon=lons1[0],urcrnrlat=lats1[-1],urcrnrlon=lons1[-1],resolution='i')
+kernellen=31
 texture = np.random.rand(udat.shape[0],udat.shape[1]).astype(np.float32)
 kernel = np.sin(np.arange(kernellen)*np.pi/kernellen).astype(np.float32)
 image = lic_internal.line_integral_convolution(udat.astype(np.float32),\
 vdat.astype(np.float32), texture, kernel)
 im = m.imshow(image,plt.cm.gist_stern)
 m.drawcoastlines()
-m.drawmeridians(np.arange(0,360,2),labels=[0,0,0,1])
-m.drawparallels(np.arange(-30,30,2),labels=[1,0,0,0])
+m.drawmeridians(np.arange(-120,-60,2),labels=[0,0,0,1])
+m.drawparallels(np.arange(0,30,2),labels=[1,0,0,0])
 plt.title('Hurricane Rita flow field visualized with Line Integral Convolution',\
 fontsize=13)
 plt.show()
From: <js...@us...> - 2011年02月11日 20:52:34
Revision: 8972
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8972&view=rev
Author: jswhit
Date: 2011年02月11日 20:52:28 +0000 (2011年2月11日)
Log Message:
-----------
use rita data from AOML for lic example.
Modified Paths:
--------------
 trunk/toolkits/basemap/MANIFEST.in
 trunk/toolkits/basemap/examples/lic_demo.py
Added Paths:
-----------
 trunk/toolkits/basemap/examples/rita.nc
Removed Paths:
-------------
 trunk/toolkits/basemap/examples/hurrearl.nc
Modified: trunk/toolkits/basemap/MANIFEST.in
===================================================================
--- trunk/toolkits/basemap/MANIFEST.in	2011年02月11日 20:29:37 UTC (rev 8971)
+++ trunk/toolkits/basemap/MANIFEST.in	2011年02月11日 20:52:28 UTC (rev 8972)
@@ -79,7 +79,7 @@
 include examples/nws_precip_conus_20061222.nc
 include examples/C02562.orog.nc
 include examples/ccsm_popgrid.nc
-include examples/hurrearl.nc
+include examples/rita.nc
 include examples/NetCDFFile_tst.py
 include examples/maskoceans.py
 include examples/README
Deleted: trunk/toolkits/basemap/examples/hurrearl.nc
===================================================================
(Binary files differ)
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 20:29:37 UTC (rev 8971)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 20:52:28 UTC (rev 8972)
@@ -13,14 +13,14 @@
 except ImportError:
 raise ImportError('need vectorplot scikit for this example')
 
-date = '2010090100'
-lat0=22.6; lon0=-69.2
-
-ncfile = NetCDFFile('hurrearl.nc')
-u = ncfile.variables['u10m'][:,:]
-v = ncfile.variables['v10m'][:,:]
-lons1 = ncfile.variables['lon'][:]
-lats1 = ncfile.variables['lat'][:]
+# H*wind data from http://www.aoml.noaa.gov/hrd/data_sub/wind.html
+ncfile = NetCDFFile('rita.nc')
+u = ncfile.variables['sfc_u'][0,:,:]
+v = ncfile.variables['sfc_v'][0,:,:]
+lons1 = ncfile.variables['longitude'][:]
+lats1 = ncfile.variables['latitude'][:]
+lat0 = lats1[len(lats1)/2]; lon0 = lons1[len(lons1)/2]
+print lat0,lon0
 lons, lats = np.meshgrid(lons1,lats1)
 ncfile.close()
 
@@ -36,6 +36,6 @@
 m.drawcoastlines()
 m.drawmeridians(np.arange(0,360,2),labels=[0,0,0,1])
 m.drawparallels(np.arange(-30,30,2),labels=[1,0,0,0])
-plt.title('Hurricane Earl flow field visualized with Line Integral Convolution',\
+plt.title('Hurricane Rita flow field visualized with Line Integral Convolution',\
 fontsize=13)
 plt.show()
Added: trunk/toolkits/basemap/examples/rita.nc
===================================================================
(Binary files differ)
Property changes on: trunk/toolkits/basemap/examples/rita.nc
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
From: <js...@us...> - 2011年02月11日 20:29:43
Revision: 8971
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8971&view=rev
Author: jswhit
Date: 2011年02月11日 20:29:37 +0000 (2011年2月11日)
Log Message:
-----------
remove un-needed imports
Modified Paths:
--------------
 trunk/toolkits/basemap/examples/lic_demo.py
Modified: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 19:30:12 UTC (rev 8970)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 20:29:37 UTC (rev 8971)
@@ -5,7 +5,7 @@
 from netCDF4 import Dataset as NetCDFFile
 except ImportError:
 from mpl_toolkits.basemap import NetCDFFile
-from mpl_toolkits.basemap import Basemap, cm, shiftgrid
+from mpl_toolkits.basemap import Basemap
 import numpy as np
 import matplotlib.pyplot as plt
 try:
@@ -26,10 +26,9 @@
 
 fig = plt.figure(figsize=(8,8))
 m = Basemap(projection='stere',lat_0=lat0,lon_0=lon0,width=1.e6,height=1.e6,resolution='i')
-nxv = 501; nyv = 501
+nxv = 301; nyv = 301; kernellen=41
 udat, vdat, xv, yv = m.transform_vector(u,v,lons1,lats1,nxv,nyv,returnxy=True)
 texture = np.random.rand(udat.shape[0],udat.shape[1]).astype(np.float32)
-kernellen=51
 kernel = np.sin(np.arange(kernellen)*np.pi/kernellen).astype(np.float32)
 image = lic_internal.line_integral_convolution(udat.astype(np.float32),\
 vdat.astype(np.float32), texture, kernel)
From: <js...@us...> - 2011年02月11日 19:30:20
Revision: 8970
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8970&view=rev
Author: jswhit
Date: 2011年02月11日 19:30:12 +0000 (2011年2月11日)
Log Message:
-----------
add new Line Integral Convolution (LIC) example (requires vectorplot scikit)
Modified Paths:
--------------
 trunk/toolkits/basemap/MANIFEST.in
 trunk/toolkits/basemap/examples/README
 trunk/toolkits/basemap/examples/run_all.py
Added Paths:
-----------
 trunk/toolkits/basemap/examples/hurrearl.nc
 trunk/toolkits/basemap/examples/lic_demo.py
Modified: trunk/toolkits/basemap/MANIFEST.in
===================================================================
--- trunk/toolkits/basemap/MANIFEST.in	2011年02月10日 07:37:05 UTC (rev 8969)
+++ trunk/toolkits/basemap/MANIFEST.in	2011年02月11日 19:30:12 UTC (rev 8970)
@@ -75,9 +75,11 @@
 include examples/cities.shx
 include examples/show_colormaps.py
 include examples/plotprecip.py
+include examples/lic_demo.py
 include examples/nws_precip_conus_20061222.nc
 include examples/C02562.orog.nc
 include examples/ccsm_popgrid.nc
+include examples/hurrearl.nc
 include examples/NetCDFFile_tst.py
 include examples/maskoceans.py
 include examples/README
Modified: trunk/toolkits/basemap/examples/README
===================================================================
--- trunk/toolkits/basemap/examples/README	2011年02月10日 07:37:05 UTC (rev 8969)
+++ trunk/toolkits/basemap/examples/README	2011年02月11日 19:30:12 UTC (rev 8970)
@@ -141,3 +141,6 @@
 daynight.py shows how to shade the regions of a map where the sun has set.
 
 ploticos.py demonstrates plotting on unstructured grids.
+
+lic_demo.py shows how to use vectorplot scikit to visualize vector fields with
+Line Integral Convolutions (LIC).
Added: trunk/toolkits/basemap/examples/hurrearl.nc
===================================================================
(Binary files differ)
Property changes on: trunk/toolkits/basemap/examples/hurrearl.nc
___________________________________________________________________
Added: svn:mime-type
 + application/octet-stream
Added: trunk/toolkits/basemap/examples/lic_demo.py
===================================================================
--- trunk/toolkits/basemap/examples/lic_demo.py	 (rev 0)
+++ trunk/toolkits/basemap/examples/lic_demo.py	2011年02月11日 19:30:12 UTC (rev 8970)
@@ -0,0 +1,42 @@
+# example showing how to use Line Integral Convolution to visualize a vector
+# flow field (from Hurricane Earl). Produces something akin to streamlines.
+# Requires vectorplot scikit (http://scikits.appspot.com/vectorplot).
+try:
+ from netCDF4 import Dataset as NetCDFFile
+except ImportError:
+ from mpl_toolkits.basemap import NetCDFFile
+from mpl_toolkits.basemap import Basemap, cm, shiftgrid
+import numpy as np
+import matplotlib.pyplot as plt
+try:
+ from vectorplot import lic_internal
+except ImportError:
+ raise ImportError('need vectorplot scikit for this example')
+
+date = '2010090100'
+lat0=22.6; lon0=-69.2
+
+ncfile = NetCDFFile('hurrearl.nc')
+u = ncfile.variables['u10m'][:,:]
+v = ncfile.variables['v10m'][:,:]
+lons1 = ncfile.variables['lon'][:]
+lats1 = ncfile.variables['lat'][:]
+lons, lats = np.meshgrid(lons1,lats1)
+ncfile.close()
+
+fig = plt.figure(figsize=(8,8))
+m = Basemap(projection='stere',lat_0=lat0,lon_0=lon0,width=1.e6,height=1.e6,resolution='i')
+nxv = 501; nyv = 501
+udat, vdat, xv, yv = m.transform_vector(u,v,lons1,lats1,nxv,nyv,returnxy=True)
+texture = np.random.rand(udat.shape[0],udat.shape[1]).astype(np.float32)
+kernellen=51
+kernel = np.sin(np.arange(kernellen)*np.pi/kernellen).astype(np.float32)
+image = lic_internal.line_integral_convolution(udat.astype(np.float32),\
+ vdat.astype(np.float32), texture, kernel)
+im = m.imshow(image,plt.cm.gist_stern)
+m.drawcoastlines()
+m.drawmeridians(np.arange(0,360,2),labels=[0,0,0,1])
+m.drawparallels(np.arange(-30,30,2),labels=[1,0,0,0])
+plt.title('Hurricane Earl flow field visualized with Line Integral Convolution',\
+ fontsize=13)
+plt.show()
Property changes on: trunk/toolkits/basemap/examples/lic_demo.py
___________________________________________________________________
Added: svn:executable
 + *
Modified: trunk/toolkits/basemap/examples/run_all.py
===================================================================
--- trunk/toolkits/basemap/examples/run_all.py	2011年02月10日 07:37:05 UTC (rev 8969)
+++ trunk/toolkits/basemap/examples/run_all.py	2011年02月11日 19:30:12 UTC (rev 8970)
@@ -10,6 +10,7 @@
 test_files.remove('plotsst.py')
 test_files.remove('embedding_map_in_wx.py') # requires wx
 test_files.remove('plothighsandlows.py') # requires scipy
+test_files.remove('lic_demo.py')
 print test_files
 py_path = os.environ.get('PYTHONPATH')
 if py_path is None:
From: <ef...@us...> - 2011年02月10日 07:37:11
Revision: 8969
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8969&view=rev
Author: efiring
Date: 2011年02月10日 07:37:05 +0000 (2011年2月10日)
Log Message:
-----------
LogNorm: speed-up by dealing with mask explicitly
Modified Paths:
--------------
 trunk/matplotlib/lib/matplotlib/colors.py
Modified: trunk/matplotlib/lib/matplotlib/colors.py
===================================================================
--- trunk/matplotlib/lib/matplotlib/colors.py	2011年02月10日 01:21:27 UTC (rev 8968)
+++ trunk/matplotlib/lib/matplotlib/colors.py	2011年02月10日 07:37:05 UTC (rev 8969)
@@ -921,9 +921,17 @@
 mask=mask)
 #result = (ma.log(result)-np.log(vmin))/(np.log(vmax)-np.log(vmin))
 # in-place equivalent of above can be much faster
- np.ma.log(result, result)
- result -= np.log(vmin)
- result /= (np.log(vmax) - np.log(vmin))
+ resdat = result.data
+ mask = result.mask
+ if mask is np.ma.nomask:
+ mask = (resdat <= 0)
+ else:
+ mask |= resdat <= 0
+ np.putmask(resdat, mask, 1)
+ np.log(resdat, resdat)
+ resdat -= np.log(vmin)
+ resdat /= (np.log(vmax) - np.log(vmin))
+ result = np.ma.array(resdat, mask=mask, copy=False)
 if is_scalar:
 result = result[0]
 return result
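The replacement above works on the raw data buffer and an explicit mask instead of calling ma.log on the whole masked array. A standalone sketch of that approach (plain NumPy, toy input values):

 import numpy as np

 def lognorm_sketch(values, vmin, vmax):
 resdat = np.array(values, dtype=float)
 mask = resdat <= 0 # mark non-positive entries invalid
 np.putmask(resdat, mask, 1) # dummy value keeps log() finite
 np.log(resdat, resdat) # in-place log on the raw buffer
 resdat -= np.log(vmin)
 resdat /= (np.log(vmax) - np.log(vmin))
 return np.ma.array(resdat, mask=mask, copy=False)

 print(lognorm_sketch([0.0, 1.0, 10.0, 100.0], vmin=1.0, vmax=100.0))
 # [-- 0.0 0.5 1.0]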
From: <ef...@us...> - 2011年02月10日 01:21:34
Revision: 8968
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8968&view=rev
Author: efiring
Date: 2011年02月10日 01:21:27 +0000 (2011年2月10日)
Log Message:
-----------
bugfix: handle alpha correctly in colormapping with uint8
Modified Paths:
--------------
 trunk/matplotlib/lib/matplotlib/colors.py
Modified: trunk/matplotlib/lib/matplotlib/colors.py
===================================================================
--- trunk/matplotlib/lib/matplotlib/colors.py	2011年02月09日 17:57:14 UTC (rev 8967)
+++ trunk/matplotlib/lib/matplotlib/colors.py	2011年02月10日 01:21:27 UTC (rev 8968)
@@ -547,6 +547,8 @@
 if alpha is not None:
 alpha = min(alpha, 1.0) # alpha must be between 0 and 1
 alpha = max(alpha, 0.0)
+ if bytes:
+ alpha = int(alpha * 255)
 if (lut[-1] == 0).all():
 lut[:-1, -1] = alpha
 # All zeros is taken as a flag for the default bad
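The added lines matter because with bytes=True the lookup table is uint8, so a float alpha in [0, 1] has to be rescaled to 0-255 before it is written into the table's alpha column. A toy illustration (not matplotlib's actual LUT handling):

 import numpy as np

 alpha = 0.5 # float alpha in [0, 1]
 lut = np.zeros((4, 4), np.uint8) # toy RGBA lookup table in byte form
 lut[:, -1] = int(alpha * 255) # store as 0-255; storing 0.5 would truncate to 0
 print(lut[0]) # [ 0 0 0 127]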
From: <js...@us...> - 2011年02月09日 17:57:20
Revision: 8967
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8967&view=rev
Author: jswhit
Date: 2011年02月09日 17:57:14 +0000 (2011年2月09日)
Log Message:
-----------
add missing nc file to MANIFEST
Modified Paths:
--------------
 trunk/toolkits/basemap/Changelog
 trunk/toolkits/basemap/MANIFEST.in
Modified: trunk/toolkits/basemap/Changelog
===================================================================
--- trunk/toolkits/basemap/Changelog	2011年02月09日 07:45:18 UTC (rev 8966)
+++ trunk/toolkits/basemap/Changelog	2011年02月09日 17:57:14 UTC (rev 8967)
@@ -1,4 +1,4 @@
-version 1.0.1 (svn revision 8963)
+version 1.0.1 (svn revision 8967)
 * regenerated C source with cython 0.14.1.
 * added new "allsky" example from Tom Loredo.
 * added 'celestial' keyword - if True astronomical convention for
Modified: trunk/toolkits/basemap/MANIFEST.in
===================================================================
--- trunk/toolkits/basemap/MANIFEST.in	2011年02月09日 07:45:18 UTC (rev 8966)
+++ trunk/toolkits/basemap/MANIFEST.in	2011年02月09日 17:57:14 UTC (rev 8967)
@@ -69,7 +69,6 @@
 include examples/garp.py
 include examples/setwh.py
 include examples/ccsm_popgrid.py
-include examples/ccsm_popgrid.nc
 include examples/plot_tissot.py
 include examples/cities.dbf
 include examples/cities.shp
@@ -77,6 +76,8 @@
 include examples/show_colormaps.py
 include examples/plotprecip.py
 include examples/nws_precip_conus_20061222.nc
+include examples/C02562.orog.nc
+include examples/ccsm_popgrid.nc
 include examples/NetCDFFile_tst.py
 include examples/maskoceans.py
 include examples/README
From: <ef...@us...> - 2011年02月09日 07:45:24
Revision: 8966
 http://matplotlib.svn.sourceforge.net/matplotlib/?rev=8966&view=rev
Author: efiring
Date: 2011年02月09日 07:45:18 +0000 (2011年2月09日)
Log Message:
-----------
_image.cpp: speed up image_frombyte; patch by C. Gohlke
Non-contiguous arrays, such as those resulting from pan and zoom,
are handled efficiently without additional copying. This
noticeably speeds up zoom and pan on large images.
Modified Paths:
--------------
 trunk/matplotlib/src/_image.cpp
Modified: trunk/matplotlib/src/_image.cpp
===================================================================
--- trunk/matplotlib/src/_image.cpp	2011年02月09日 04:16:08 UTC (rev 8965)
+++ trunk/matplotlib/src/_image.cpp	2011年02月09日 07:45:18 UTC (rev 8966)
@@ -1085,7 +1085,7 @@
 Py::Object x = args[0];
 int isoutput = Py::Int(args[1]);
 
- PyArrayObject *A = (PyArrayObject *) PyArray_ContiguousFromObject(x.ptr(), PyArray_UBYTE, 3, 3);
+ PyArrayObject *A = (PyArrayObject *) PyArray_FromObject(x.ptr(), PyArray_UBYTE, 3, 3);
 if (A == NULL)
 {
 throw Py::ValueError("Array must have 3 dimensions");
@@ -1104,35 +1104,86 @@
 
 agg::int8u *arrbuf;
 agg::int8u *buffer;
+ agg::int8u *dstbuf;
 
 arrbuf = reinterpret_cast<agg::int8u *>(A->data);
 
 size_t NUMBYTES(imo->colsIn * imo->rowsIn * imo->BPP);
- buffer = new agg::int8u[NUMBYTES];
+ buffer = dstbuf = new agg::int8u[NUMBYTES];
 
 if (buffer == NULL) //todo: also handle allocation throw
 {
 throw Py::MemoryError("_image_module::frombyte could not allocate memory");
 }
 
- const size_t N = imo->rowsIn * imo->colsIn * imo->BPP;
- size_t i = 0;
- if (A->dimensions[2] == 4)
+ if PyArray_ISCONTIGUOUS(A)
 {
- memmove(buffer, arrbuf, N);
+ if (A->dimensions[2] == 4)
+ {
+ memmove(dstbuf, arrbuf, imo->rowsIn * imo->colsIn * 4);
+ }
+ else
+ {
+ size_t i = imo->rowsIn * imo->colsIn;
+ while (i--)
+ {
+ *dstbuf++ = *arrbuf++;
+ *dstbuf++ = *arrbuf++;
+ *dstbuf++ = *arrbuf++;
+ *dstbuf++ = 255;
+ }
+ }
 }
+ else if ((A->strides[1] == 4) && (A->strides[2] == 1))
+ {
+ const size_t N = imo->colsIn * 4;
+ const size_t stride = A->strides[0];
+ for (size_t rownum = 0; rownum < imo->rowsIn; rownum++)
+ {
+ memmove(dstbuf, arrbuf, N);
+ arrbuf += stride;
+ dstbuf += N;
+ }
+ }
+ else if ((A->strides[1] == 3) && (A->strides[2] == 1))
+ {
+ const size_t stride = A->strides[0] - imo->colsIn * 3;
+ for (size_t rownum = 0; rownum < imo->rowsIn; rownum++)
+ {
+ for (size_t colnum = 0; colnum < imo->colsIn; colnum++)
+ {
+ *dstbuf++ = *arrbuf++;
+ *dstbuf++ = *arrbuf++;
+ *dstbuf++ = *arrbuf++;
+ *dstbuf++ = 255;
+ }
+ arrbuf += stride;
+ }
+ }
 else
 {
- while (i < N)
+ PyArrayIterObject *iter;
+ iter = (PyArrayIterObject *)PyArray_IterNew((PyObject *)A);
+ if (A->dimensions[2] == 4)
 {
- memmove(buffer, arrbuf, 3);
- buffer += 3;
- arrbuf += 3;
- *buffer++ = 255;
- i += 4;
+ while (iter->index < iter->size) {
+ *dstbuf++ = *((unsigned char *)iter->dataptr);
+ PyArray_ITER_NEXT(iter);
+ }
 }
- buffer -= N;
- arrbuf -= imo->rowsIn * imo->colsIn;
+ else
+ {
+ while (iter->index < iter->size) {
+ *dstbuf++ = *((unsigned char *)iter->dataptr);
+ PyArray_ITER_NEXT(iter);
+ *dstbuf++ = *((unsigned char *)iter->dataptr);
+ PyArray_ITER_NEXT(iter);
+ *dstbuf++ = *((unsigned char *)iter->dataptr);
+ PyArray_ITER_NEXT(iter);
+ *dstbuf++ = 255;
+ }
+ }
+ Py_DECREF(iter);
 }
 
 if (isoutput)
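The patch above adds fast paths for contiguous and simply-strided input and falls back to an array iterator otherwise, so the sliced views produced by pan and zoom no longer force a full contiguous copy before the RGB-to-RGBA expansion. The effect, sketched in NumPy terms with a made-up image array:

 import numpy as np

 rgb = np.zeros((1024, 1024, 3), np.uint8) # made-up full-resolution RGB image
 view = rgb[100:600, 200:700] # pan/zoom-style slice: a strided view
 print(view.flags['C_CONTIGUOUS']) # False -- no copy was made to get here

 # Expand RGB -> RGBA by writing straight from the strided view into one
 # preallocated output buffer, instead of materialising a contiguous copy first.
 rgba = np.empty(view.shape[:2] + (4,), np.uint8)
 rgba[..., :3] = view
 rgba[..., 3] = 255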
