matplotlib-devel — matplotlib developers

From: Eric F. <ef...@ha...> - 2006年06月22日 20:05:19
I have committed a set of changes to _transforms, collections, quiver,
contour, and numerix as part of a move toward taking advantage of the 
efficiency of numerix arrays in place of sequences of tuples. The 
changes are outlined very briefly in CHANGELOG and API_CHANGES.
Changes in clabel are hacks, and may make it less efficient rather than 
more, but I expect this to be temporary; I needed to simply make it work 
with the other changes.
Where breakage will occur is any place in user code that expects the 
collection segments or vertices to be lists of tuples and tries to 
append to the list, for example. I don't know of any way to make the 
move towards use of arrays without this problem cropping up; I hope it 
is considered tolerable.
Eric
From: John H. <jdh...@ac...> - 2006年06月22日 14:43:32
>>>>> "Edin" == Edin Salković <edi...@gm...> writes:
 Edin> I finally solved the problem of automatically generating the
 Edin> dicts for unicode <-> TeX conversion. This is the first step
 Edin> in enabling unicode support in mathtext.
Excellent.
 Edin> The STIX project is useful after all ;) They keep a nice
 Edin> table of Unicode symbols at:
 Edin> http://www.ams.org/STIX/bnb/stix-tbl.ascii-2005-09-24
 Edin> Any comments about the script are appreciated :). Now I'll
Since you asked :-)
I may not have mentioned this but the style conventions for mpl code
are
 functions : lower or lower_score_separated
 variables and attributes : lower or lowerUpper
 classes : Upper or MixedUpper
Also, I am not too fond of the dict of dicts -- why not use variable
names? Here is my version

 import pickle

 fname = 'stix-tbl.ascii-2005-09-24'
 uni2type1 = dict()
 type12uni = dict()
 uni2tex = dict()
 tex2uni = dict()

 for line in file(fname):
     if line[:2] != ' 0': continue  # using continue avoids unnecessary indent
     uninum = line[2:6].strip().lower()
     type1name = line[12:37].strip()
     texname = line[83:110].strip()
     uninum = int(uninum, 16)
     if type1name:
         uni2type1[uninum] = type1name
         type12uni[type1name] = uninum
     if texname:
         uni2tex[uninum] = texname
         tex2uni[texname] = uninum

 pickle.dump((uni2type1, type12uni, uni2tex, tex2uni), file('unitex.pcl','w'))

 # An example
 unichar = int('00d7', 16)
 print uni2tex.get(unichar)
 print uni2type1.get(unichar)

Also, I am a little hesitant to use pickle files for the final
mapping. I suggest you write a script that generates the python code
containing the dictionaries you need (that is how much of _mathtext_data
was generated).
Thanks,
JDH
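A minimal sketch of the generate-a-Python-module approach suggested above; the module name, file name, and the tiny stand-in dictionaries are hypothetical, and pprint is used only for readable output:

import pprint

def write_mapping_module(filename, **mappings):
    # Write each mapping as a plain Python assignment so the result can
    # simply be imported at run time instead of unpickled.
    out = open(filename, 'w')
    out.write('# Auto-generated file; do not edit by hand.\n\n')
    for name, mapping in sorted(mappings.items()):
        out.write('%s = \\\n%s\n\n' % (name, pprint.pformat(mapping)))
    out.close()

# Hypothetical usage with a tiny stand-in for the real tables:
write_mapping_module('unitex_data.py',
                     tex2uni={r'\Gamma': 0x0393, r'\Delta': 0x0394},
                     uni2tex={0x0393: r'\Gamma', 0x0394: r'\Delta'})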
From: <edi...@gm...> - 2006年06月22日 13:51:43
I finally solved the problem of automatically generating the dicts for
unicode <-> TeX conversion. This is the first step in enabling unicode
support in mathtext.
The STIX project is useful after all ;) They keep a nice table of
Unicode symbols at:
http://www.ams.org/STIX/bnb/stix-tbl.ascii-2005-09-24
Any comments about the script are appreciated :). Now I'll dig a bit
deeper into the font classes to fix them to support unicode.

'''A script for seamlessly copying the data from the stix-tbl.ascii*
file to a set of python dicts. Dicts are then pickled to corresponding
files, for later retrieval.

Currently used table file:
http://www.ams.org/STIX/bnb/stix-tbl.ascii-2005-09-24
'''
import pickle

table_filename = 'stix-tbl.ascii-2005-09-24'
dict_names = ['uni2type1', 'type12uni', 'uni2tex', 'tex2uni']

dicts = {}
# initialize the dicts
for name in dict_names:
    dicts[name] = {}

for line in file(table_filename):
    if line[:2]==' 0':
        uni_num = eval("u'\\u"+line[2:6].strip().lower()+"'")
        type1_name = line[12:37].strip()
        tex_name = line[83:110].strip()
        if type1_name:
            dicts['uni2type1'][uni_num] = type1_name
            dicts['type12uni'][type1_name] = uni_num
        if tex_name:
            dicts['uni2tex'][uni_num] = tex_name
            dicts['tex2uni'][tex_name] = uni_num

for name in dict_names:
    pickle.dump(dicts[name], open(name + '.pcl','w'))

# An example
uni_char = u'\u00d7'
print dicts['uni2tex'][uni_char]
print dicts['uni2type1'][uni_char]

# Testing of the results; feel free to uncomment
# _mathtext_data.py can be found in the matplotlib dir
#~ from _mathtext_data import latex_to_bakoma
#~ supported = 0
#~ unsupported = 0
#~ for tex_symbol in latex_to_bakoma:
    #~ try:
        #~ print tex_symbol, dicts['tex2uni'][tex_symbol]
        #~ supported += 1
    #~ except KeyError:
        #~ unsupported += 1
        #~ pass
#~ print supported, unsupported
>>>>> "Martin" == Martin Spacek <sc...@ms...> writes:
Hey martin, thanks for all these changes.
 Martin> to inconsistent behaviour: barh() draws bars vertically
 Martin> centered on the y values (ala matlab 6.0), while bar()
 Martin> draws bars aligned according to their left edge (not ala
 Martin> matlab). I prefer the edge aligning behaviour. It's easy
 Martin> to convert from one behaviour to the other, but I had to
 Martin> duplicate all the error checking code before conversion,
 Martin> which bloated it back up.
Most people prefer the center aligning behavior, at least those who
complained on the list about bar, so when I wrote barh I adopted
this. I tried to fix bar in the process, but ended up running into
some bugs when I tested John Gill's table demo, and so left it as edge
aligned and haven't revisited it since. So my weak preference would
be to have the two functions consistent and center aligned, but he who
does the work usually gets the biggest vote. Maybe others can chime
in.
 Martin> And lastly... I find it odd that barh() has the width and
 Martin> bottom args (formerly x and y) in that order: barh(width,
 Martin> bottom). The general matlab convention is that the first
 Martin> argument is the positions, and the second arg is the
 Martin> values. So it would make more sense to me to have
 Martin> barh(bottom, width). That way, you could switch back and
 Martin> forth between bar() and barh() and get the expected
 Martin> behaviour without having to switch around the
 Martin> arguments. In fact, that's exactly how barh() in matlab 6
 Martin> interprets the first two arguments: arg1 is the vertical
 Martin> positions, and arg2 is the lengths of the bars at those
 Martin> positions. Same goes for matlab's bar() function. As it is
 Martin> now in matplotlib, the first and second arguments are
 Martin> interpreted differently for bar() and barh()
I was following the convention that the x arg goes first and the y
second, but I'm not wed to this. I don't mind breaking existing code
if this order seems more natural, and since we are mostly emulating
the matlab conventions in bar and barh, it makes some sense to strive
for consistency. Perhaps you could patch the CHANGELOG and
API_CHANGES file along with the rest which explains the changes.
JDH
Well, I seem to have really dove into this.
Here are 4 different patches against the latest svn of axes.py (rev 
2495). Note that the rest of my install is the 0.87.3 release (I had to 
copy over quiver.py to get the latest axes.py to work).
patch1 has the following changes to bar() and barh():
- fixed ignoring the rcParams['patch.facecolor'] for bar color: the 
default value for the color arg is now None, and the Patch class is left 
to handle fetching the rcparams['patch.facecolor']
- set default error bar color to None, so that errorbar() can handle 
fetching the rcParams['lines.color']
- added an edgecolor keyword arg
- left and height can now both be scalars in bar(), same goes for x and 
y in barh(). Previously, this raised a TypeError upon testing their 
lengths. Code that preventively checked for this in barh() (but not in 
bar()) has been removed.
- fixed a bug where patches would be cleared when error bars were 
plotted if rcParams['axes.hold'] was False
- it looks like the code for barh() was copied from bar(), with some of 
the args renamed. There was an error in the color checking code in 
barh() where len(left) from bar() hadn't been properly renamed to len(x)
- found one or two changes that had been made to bar() that hadn't been 
propagated to barh(), or vice versa
- rearranged the order of some code segments so that they follow the 
order of the arguments
- updated the docstrings
Hopefully I haven't introduced any new bugs.
patch2 has everything in patch1, except it removes some code duplication 
by calling bar() from within barh(). I thought this would be a good 
idea, since it's easy to make a change in bar() and forget to do the 
same in barh(). It turns out that this takes up almost as many lines of 
code as having two independent functions, but this is only due to 
inconsistent behaviour: barh() draws bars vertically centered on the y 
values (ala matlab 6.0), while bar() draws bars aligned according to 
their left edge (not ala matlab). I prefer the edge aligning behaviour. 
It's easy to convert from one behaviour to the other, but I had to 
duplicate all the error checking code before conversion, which bloated 
it back up.
So... patch3 has everything in patch2, but renames the x and y args in 
barh() to width and bottom respectively. This makes barh() draw bars 
vertically aligned to their bottom edge, consistent with bar()'s 
behaviour. Also, this makes hist(orientation='horizontal') do the same, 
which makes it consistent with hist(orientation='vertical'). Finally, it 
removes the code bloat mentioned above. However, it'll break any 
existing code that relies on x or y as named args in barh(), or code 
that expects barh() bars to be vertically centered on their y values.
And lastly... I find it odd that barh() has the width and bottom args 
(formerly x and y) in that order: barh(width, bottom). The general 
matlab convention is that the first argument is the positions, and the 
second arg is the values. So it would make more sense to me to have 
barh(bottom, width). That way, you could switch back and forth between 
bar() and barh() and get the expected behaviour without having to switch 
around the arguments. In fact, that's exactly how barh() in matlab 6 
interprets the first two arguments: arg1 is the vertical positions, and 
arg2 is the lengths of the bars at those positions. Same goes for 
matlab's bar() function. As it is now in matplotlib, the first and 
second arguments are interpreted differently for bar() and barh()
I don't know if anyone agrees with this change, but patch4 has all of 
the changes in patch3, plus the order of the width and bottom args are 
switched in barh(). This of course will break existing code that depends 
on this order. I had to modify the barh() call in 
hist(orientation='horizontal') to reflect this. I couldn't find any 
other barh() call in matplotlib. For consistency, I also switched the 
order of the yerr and xerr args, but these have default values and are 
usually passed as keyword args, so this shouldn't break (much) code.
The patches are numbered in increasing order of preference. They look 
rather big (and I'm not sure if my file compare util is bug-free). If 
there seem to be problems with them, I can provide the full axes.py file 
that corresponds to each patch.
Cheers,
Martin
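For reference, the centre/edge conversion Martin describes is just a half-width shift of the left edges; a tiny sketch with made-up numbers (numpy assumed):

import numpy

x = numpy.arange(5)        # positions each bar should be centred on
width = 0.8
left = x - width / 2.0     # left edges to hand to an edge-aligned bar()
# each bar then spans [left[i], left[i] + width], i.e. is centred on x[i]
print(left)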
From: Eric F. <ef...@ha...> - 2006年06月19日 18:40:38
John Hunter wrote:
>>>>>>"Eric" == Eric Firing <ef...@ha...> writes:
> 
> 
> Eric> Based on a quick look, I think it would be easy to make
> Eric> LineCollection and PolyCollection accept a numerix array in
> Eric> place of [(x,y), (x,y), ...] for each line segment or
> Eric> polygon; specifically, this could replaced by an N x 2
> Eric> array, where the first column would be x and the second
> Eric> would be y. Backwards compatibility could be maintained
> Eric> easily. This would eliminate quite a bit of useless
> Eric> conversion back and forth among lists, tuples, and arrays.
> Eric> As it is, each sequence of sequences is converted to a pair
> Eric> of arrays in backend_bases, and typically it started out as
> Eric> either a 2-D numerix array or a pair of 1-D arrays in the
> Eric> code that is calling the collection constructor.
> 
> I think this is a useful enhancement. I would think that representing
> each segment as (x,y) where x and y are 1D arrays, might be slightly
> more natural than using an Nx2 but others may disagree.
John,
I have been working on this and I can probably commit something in the 
next few days. I have been pursuing the Nx2 representation for the 
following reasons:
1) It is highly compatible with the present sequence of tuples, so that 
the two representations can coexist peacefully:
a = [(1,2), (3,4), (5,6)] # present style
aa = numerix.array(a) # new style
In most places, a and aa work the same with no change to the code. The 
exception is where code does something like "a.append(b)". This occurs 
in the contour labelling code. I haven't fixed it yet, but I don't see 
any fundamental problem in doing so.
2) The Nx2 representation streamlines code because it involves one 2-D 
object, "XY", in place of two 1-D objects, X and Y. This also 
eliminates the need to check that the lengths of X and Y match. 
Logically, X and Y must go together, so why not keep them glued together 
in a single array?
Because of the compatibility, there is very little code that actually 
has to be changed to support the numerix array. There is a potential 
for breakage of user code, however. This is a concern. I don't know of 
any way of eliminating it entirely while retaining the efficiency 
benefits of using numerix arrays when possible. One thing that might 
help is to have the transform seq_xy_tups method handle both input 
forms, and return the form corresponding to the input. I can do this; I 
now have a transform method that handles both "a" and "aa", but 
presently it returns a numerix array in either case.
The optimization you describe below sounds good, but I want to finish 
stage 1, above, first.
Eric
> 
> How often does it come up that we want a homogeneous line collection,
> ie a bunch of lines segments with the same properties (color,
> linewidth...)? The most expensive part of the agg line collection
> renderer is probably the multiple calls to render_scanlines, which is
> necessary every time we change the linewidth or color. 
> 
> If all of the lines in a collection shared the same properties, we
> could draw the entire path with a combination of lineto/moveto, and
> just stroke and render it once (agg has an upper limit on path length
> though, since at some point I added the following to draw_lines
> 
> if ((i%10000)==0) {
> //draw the path in chunks
> _render_lines_path(path, gc);
> path.remove_all();
> path.move_to(thisx, thisy);
> }
> 
> Ie I render it every 10000 points. 
> 
> Actually, as I type this I realize the case of homogeneous lines (and
> polys) can be handled by the backend method "draw_path". One
> possibility is for the LineCollection to detect the homogeneous case
> len(linewidths)==1 and len(colors)==1 and call out to draw_path
> instead of draw_line_collection (the same could be done for a regular
> poly collection). Some extra extension code would probably be
> necessary to build the path efficiently from numerix arrays, and to
> handle the "chunking" problem to avoid extra long paths, but for
> certain special cases (scatters and quiver w/o color mapping) it would
> probably be a big win. The downside is that not all backend implement
> draw_paths, but the Collection front-end could detect this and fall
> back on the old approach if draw_paths is not implemented.
> 
> JDH
From: John H. <jdh...@ac...> - 2006年06月19日 13:13:55
>>>>> "Martin" == Martin Spacek <sc...@ms...> writes:
 Martin> Don't know if this is the best way, but here's a solution:
 Martin> def bar(self, left, height, width=0.8, bottom=0,
 Martin> color=matplotlib.rcParams['patch.facecolor'], yerr=None,
 Martin> xerr=None, ecolor=matplotlib.rcParams['patch.edgecolor'],
 Martin> capsize=3 ):
Hey Martin, 
We don't put the rc defaults in the function declaration because these
are evaluated only once, at module load time, which prevents users
from being able to change the defaults after the module is loaded. So
we use this idiom
def somefunc(edgecolor=None):
    if edgecolor is None: edgecolor = rcParams['patch.edgecolor']
If you'd like to submit a patch for bar and barh, that'd be great.
Thanks,
JDH
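A small sketch of why the two idioms behave differently; it assumes matplotlib is importable and uses 'lines.linewidth' as an arbitrary rc key:

from matplotlib import rcParams

def bad_default(lw=rcParams['lines.linewidth']):
    # the default was evaluated once, when this def statement ran
    return lw

def good_default(lw=None):
    if lw is None:
        lw = rcParams['lines.linewidth']   # looked up at call time
    return lw

rcParams['lines.linewidth'] = 5.0
print(bad_default())    # still whatever value was captured at definition time
print(good_default())   # 5.0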
From: Martin S. <sc...@ms...> - 2006年06月19日 10:30:29
I've noticed that the rcParams settings for patch.facecolor and
patch.edgecolor are ignored by bar() and barh() (and therefore hist()),
always displaying as blue and black, respectively. Is this intentional?
I'm running matplotlib 0.87.3
The culprit:
def bar(self, left, height, width=0.8, bottom=0,
        color='b', yerr=None, xerr=None, ecolor='k', capsize=3
        ):
Don't know if this is the best way, but here's a solution:
def bar(self, left, height, width=0.8, bottom=0,
        color=matplotlib.rcParams['patch.facecolor'],
        yerr=None, xerr=None,
        ecolor=matplotlib.rcParams['patch.edgecolor'], capsize=3
        ):
Similar situation for barh()
Cheers,
Martin
From: <Jan...@ga...> - 2006年06月19日 06:37:55
Hi - this is my first post to such a list, so bear with me.

I've just installed mpl3d and have had success with the examples shown at
http://www.scipy.org/Cookbook/Matplotlib/mplot3D

We currently don't have numpy installed and are using the older Numeric, so I
used the following instead:

N = 100
x = zeros((N,N),Float)
y = zeros((N,N),Float)
z = zeros((N,N),Float)
u = arange(0,2*pi,2.*pi/N)
v = arange(0,2*pi,2.*pi/N)

for i in range(N):
    for j in range(N):
        x[i,j] = cos(u[i])*sin(v[j])
        y[i,j] = sin(u[i])*sin(v[j])
        z[i,j] = cos(v[j])

fig = p.figure()
ax = p3.Axes3D(fig)
ax.plot_surface(x,y,z)
ax.set_xlabel('X')
ax.set_ylabel('Y')
ax.set_zlabel('Z')
fig.add_axes(ax)
p.show()
p.savefig('surfacetest')
p.close()

which worked a treat (apart from the figure not closing on the first instance
...).

However, if I change N to 10, I get the following error message:

Traceback (most recent call last):
  File "test.py", line 47, in ?
    ax.plot_surface(x,y,z)
  File "c:\Python24\lib\site-packages\mpl3d\mplot3d.py", line 921, in plot_surface
    norm = normalize(min(shade),max(shade))
ValueError: min() arg is an empty sequence

It seems that if the number of columns or rows is less than 20, then rstride
and cstride = 0. This means that the boxes required to make the polygons in
the surface plot won't be constructed. However, you can get a 3D plot if you
use plot_wireframe or plot3D instead with N = 10 (but these plots aren't
quite as nice as the surface plot would be).

Is there a minimum size of the arrays which plot_surface will work on? Is
there a workaround for smaller examples? I'm looking at plotting a (smallish)
number of time series solutions as a surface.

Cheers, Jane.

Dr Jane Sexton
Risk Research Group
Geospatial and Earth Monitoring Division
Geoscience Australia
From: Gary R. <gr...@bi...> - 2006年06月18日 00:41:48
Hi Edin,
Edin Salković wrote:
> Hi all,
> 
<snip>
> Also, if anyone has some good online sources about parsing etc. on the
> net, I vwould realy appreciate it.
Everything that David Mertz wrote about text processing in his excellent 
"Charming Python" articles:
<http://gnosis.cx/publish/tech_index_cp.html>
and "Building Recursive Descent Parsers with Python":
http://www.onlamp.com/pub/a/python/2006/01/26/pyparsing.html
If you were after a book on the subject, Mertz's book "Text Processing 
in Python" <http://gnosis.cx/TPiP/> would be an obvious choice or you 
could pick up any book about writing compilers.
Gary R.
From: Helge A. <av...@bc...> - 2006年06月16日 05:26:06
On 6/15/06, John Hunter <jdh...@ac...> wrote:
> How often does it come up that we want a homogeneous line collection,
> ie a bunch of lines segments with the same properties (color,
> linewidth...)?
Hi,
For b&w PS publication-quality plotting, this must be a common thing to draw:
contour lines, vectors, xy plots, the axes, tick marks, even fonts
can all be constructed from disjoint line segments, no?
If matplotlib could pass numerix arrays more or less directly to gtk, it could
perhaps also become the speed king of plotting packages :)
Helge
From: <edi...@gm...> - 2006年06月15日 22:28:39
Hi all,

Is it that the code in the mathtext module looks ugly or is it just me
not understanding it?
Also, if anyone has some good online sources about parsing etc. on the
net, I vwould realy appreciate it.

Considering the folowing code (picked on random, from mathtext.py)

===
def math_parse_s_ft2font(s, dpi, fontsize, angle=0):
    """
    Parse the math expression s, return the (bbox, fonts) tuple needed
    to render it.

    fontsize must be in points

    return is width, height, fonts
    """

    major, minor1, minor2, tmp, tmp = sys.version_info
    if major==2 and minor1==2:
        raise SystemExit('mathtext broken on python2.2.  We hope to
get this fixed soon')

    cacheKey = (s, dpi, fontsize, angle)
    s = s[1:-1]  # strip the $ from front and back
    if math_parse_s_ft2font.cache.has_key(cacheKey):
        w, h, bfonts = math_parse_s_ft2font.cache[cacheKey]
        return w, h, bfonts.fonts.values()

    bakomaFonts = BakomaTrueTypeFonts()
    Element.fonts = bakomaFonts
    handler.clear()
    expression.parseString( s )

    handler.expr.set_size_info(fontsize, dpi)

    # set the origin once to allow w, h compution
    handler.expr.set_origin(0, 0)
    xmin = min([e.xmin() for e in handler.symbols])
    xmax = max([e.xmax() for e in handler.symbols])
    ymin = min([e.ymin() for e in handler.symbols])
    ymax = max([e.ymax() for e in handler.symbols])

    # now set the true origin - doesn't affect with and height
    w, h =  xmax-xmin, ymax-ymin
    # a small pad for the canvas size
    w += 2
    h += 2

    handler.expr.set_origin(0, h-ymax)

    Element.fonts.set_canvas_size(w,h)
    handler.expr.render()
    handler.clear()

    math_parse_s_ft2font.cache[cacheKey] = w, h, bakomaFonts
    return w, h, bakomaFonts.fonts.values()

math_parse_s_ft2font.cache = {}
====

I don't understand, for example, what does the statement:

expression.parseString( s )

do?

"expression" is defined globaly, and is called (that is - its method)
only once in the above definition of the function, but I don't
understand - what does that particular line do?!?

------
Regarding the unicode support in mathtext, mathtext currently uses the
folowing dictionary for getting the glyph info out of the font files:

latex_to_bakoma = {

    r'\oint'                : ('cmex10',  45),
    r'\bigodot'             : ('cmex10',  50),
    r'\bigoplus'            : ('cmex10',  55),
    r'\bigotimes'           : ('cmex10',  59),
    r'\sum'                 : ('cmex10',  51),
    r'\prod'                : ('cmex10',  24),
...
}

I managed to build the following dictionary(little more left to be done):
tex_to_unicode = {
r'\S' : u'\u00a7',
r'\P' : u'\u00b6',
r'\Gamma' : u'\u0393',
r'\Delta' : u'\u0394',
r'\Theta' : u'\u0398',
r'\Lambda' : u'\u039b',
r'\Xi' : u'\u039e',
r'\Pi' : u'\u03a0',
r'\Sigma' : u'\u03a3',
r'\Upsilon' : u'\u03a5',
r'\Phi' : u'\u03a6',
r'\Psi' : u'\u03a8',
r'\Omega' : u'\u03a9',
r'\alpha' : u'\u03b1',
r'\beta' : u'\u03b2',
r'\gamma' : u'\u03b3',
r'\delta' : u'\u03b4',
r'\varepsilon' : u'\u03b5',
r'\zeta' : u'\u03b6',
r'\eta' : u'\u03b7',
r'\vartheta' : u'\u03b8',
r'\iota' : u'\u03b9',
r'\kappa' : u'\u03ba',
r'\lambda' : u'\u03bb',
r'\mu' : u'\u03bc',
r'\nu' : u'\u03bd',
r'\xi' : u'\u03be',
r'\pi' : u'\u03c0',
r'\varrho' : u'\u03c1',
r'\varsigma' : u'\u03c2',
r'\sigma' : u'\u03c3',
r'\tau' : u'\u03c4',
r'\upsilon' : u'\u03c5',
r'\varphi' : u'\u03c6',
r'\chi' : u'\u03c7',
r'\psi' : u'\u03c8',
r'\omega' : u'\u03c9',
r'\ell' : u'\u2113',
r'\wp' : u'\u2118',
r'\Omega' : u'\u2126',
r'\Re' : u'\u211c',
r'\Im' : u'\u2111',
r'\aleph' : u'\u05d0',
r'\aleph' : u'\u2135',
r'\spadesuit' : u'\u2660',
r'\heartsuit' : u'\u2661',
r'\diamondsuit' : u'\u2662',
r'\clubsuit' : u'\u2663',
r'\flat' : u'\u266d',
r'\natural' : u'\u266e',
r'\sharp' : u'\u266f',
r'\leftarrow' : u'\u2190',
r'\uparrow' : u'\u2191',
r'\rightarrow' : u'\u2192',
r'\downarrow' : u'\u2193',
r'\Rightarrow' : u'\u21d2',
r'\Leftrightarrow' : u'\u21d4',
r'\leftrightarrow' : u'\u2194',
r'\updownarrow' : u'\u2195',
r'\forall' : u'\u2200',
r'\exists' : u'\u2203',
r'\emptyset' : u'\u2205',
r'\Delta' : u'\u2206',
r'\nabla' : u'\u2207',
r'\in' : u'\u2208',
r'\ni' : u'\u220b',
r'\prod' : u'\u220f',
r'\coprod' : u'\u2210',
r'\sum' : u'\u2211',
r'-' : u'\u2212',
r'\mp' : u'\u2213',
r'/' : u'\u2215',
r'\ast' : u'\u2217',
r'\circ' : u'\u2218',
r'\bullet' : u'\u2219',
r'\propto' : u'\u221d',
r'\infty' : u'\u221e',
r'\mid' : u'\u2223',
r'\wedge' : u'\u2227',
r'\vee' : u'\u2228',
r'\cap' : u'\u2229',
r'\cup' : u'\u222a',
r'\int' : u'\u222b',
r'\oint' : u'\u222e',
r':' : u'\u2236',
r'\sim' : u'\u223c',
r'\wr' : u'\u2240',
r'\simeq' : u'\u2243',
r'\approx' : u'\u2248',
r'\asymp' : u'\u224d',
r'\equiv' : u'\u2261',
r'\leq' : u'\u2264',
r'\geq' : u'\u2265',
r'\ll' : u'\u226a',
r'\gg' : u'\u226b',
r'\prec' : u'\u227a',
r'\succ' : u'\u227b',
r'\subset' : u'\u2282',
r'\supset' : u'\u2283',
r'\subseteq' : u'\u2286',
r'\supseteq' : u'\u2287',
r'\uplus' : u'\u228e',
r'\sqsubseteq' : u'\u2291',
r'\sqsupseteq' : u'\u2292',
r'\sqcap' : u'\u2293',
r'\sqcup' : u'\u2294',
r'\oplus' : u'\u2295',
r'\ominus' : u'\u2296',
r'\otimes' : u'\u2297',
r'\oslash' : u'\u2298',
r'\odot' : u'\u2299',
r'\vdash' : u'\u22a2',
r'\dashv' : u'\u22a3',
r'\top' : u'\u22a4',
r'\bot' : u'\u22a5',
r'\bigwedge' : u'\u22c0',
r'\bigvee' : u'\u22c1',
r'\bigcap' : u'\u22c2',
r'\bigcup' : u'\u22c3',
r'\diamond' : u'\u22c4',
r'\cdot' : u'\u22c5',
r'\lceil' : u'\u2308',
r'\rceil' : u'\u2309',
r'\lfloor' : u'\u230a',
r'\rfloor' : u'\u230b',
r'\langle' : u'\u27e8',
r'\rangle' : u'\u27e9',
r'\dag' : u'\u2020',
r'\ddag' : u'\u2021',
}

unicode_to_tex is straight forward.
Am I on the right track? What should I do next?

I also noticed that some TeX commands (commands in the sense that they
can have arguments enclosed in brackets {}) are defined as only
symbols: \sqrt alone, for example, displays just the begining of the
square root:√, and \sqrt{123} triggers an error.

That's it for now
Thanks in advance,
Edin
From: John H. <jdh...@ac...> - 2006年06月15日 21:14:03
>>>>> "Edin" == Edin Salković <edi...@gm...> writes:
 Edin> Hi all, Is it that the code in the mathtext module looks
 Edin> ugly or is it just me not understanding it? Also, if anyone
 Edin> has some good online sources about parsing etc. on the net,
 Edin> I vwould realy appreciate it.
It's probably you not understanding it :-) In my opinion, the code is
pretty nice and modular, with a few exceptions, but I'm biased.
Parsers can be a little hard to understand at first. You might start
by trying to understand pyparsing
 http://pyparsing.wikispaces.com
and work through some of the basic examples there. Once you have your
head wrapped around that, it will get easier.
 Edin> Considering the foowing code (picked on random, from
 Edin> mathtext.py)
 Edin> I don't understand, for example, what does the statement:
 Edin> expression.parseString( s )
 Edin> do?
 Edin> "expression" is defined globaly, and is called (that is -
 Edin> its method) only once in the above definition of the
 Edin> function, but I don't understand - what does that particular
 Edin> line do?!?
It's not defined globally, but at module level. There is only one
expression that represents a TeX math expression (at least as far as
mathtext is concerned) so it is right that there is only one of them
at module level. It's like saying "a name is a first name followed by
an optional middle initial followed by a last name". You only need to
define this one, and then you set handlers to handle the different
components.
The expression assigns subexpressions to handlers. The statement
below says that an expression is one or more of a space, font element,
an accent, a symbol, a subscript, etc...
expression = OneOrMore(
    space ^ font ^ accent ^ symbol ^ subscript ^ superscript ^ subsuperscript ^ group ^ composite ).setParseAction(handler.expression).setName("expression")
A subscript, for example, is a symbol group followed by an underscore
followed by a symbol group
subscript << Group( Optional(symgroup) + Literal('_') + symgroup )
and the handler is defined as
subscript = Forward().setParseAction(handler.subscript).setName("subscript")
which means that the function handler.subscript will be called every
time the pattern is matched. The tokens will be the first symbol
group, the underscore, and the second symbol group. Here is the
implementation of that function
 def subscript(self, s, loc, toks):
     assert(len(toks)==1)
     #print 'subsup', toks
     if len(toks[0])==2:
         under, next = toks[0]
         prev = SpaceElement(0)
     else:
         prev, under, next = toks[0]
     if self.is_overunder(prev):
         prev.neighbors['below'] = next
     else:
         prev.neighbors['subscript'] = next
     return loc, [prev]
This grabs the tokens and assigns them to the names "prev" and "next".
Every element in the TeX expression is a special case of an Element,
and every Element has a dictionary mapping surrounding elements to
relative locations, either above or below or right or superscript or
subscript. The rest of this function takes the "next" element, and
assigns it either below (eg for \Sum_0) or subscript (eg for x_0) and
the layout engine will then take this big tree and lay it out. See
for example the "set_origin" function?
 Edin> ------ Regarding the unicode support in mathtext, mathtext
 Edin> currently uses the folowing dictionary for getting the glyph
 Edin> info out of the font files:
 Edin> latex_to_bakoma = {
 Edin> r'\oint' : ('cmex10', 45), r'\bigodot' : ('cmex10', 50),
 Edin> r'\bigoplus' : ('cmex10', 55), r'\bigotimes' : ('cmex10',
 Edin> 59), r'\sum' : ('cmex10', 51), r'\prod' : ('cmex10', 24),
 Edin> ...
 Edin> }
 Edin> I managed to build the following dictionary(little more left
 Edin> to be done): tex_to_unicode = { r'\S' : u'\u00a7', r'\P' :
 Edin> u'\u00b6', r'\Gamma' : u'\u0393', r'\Delta' : u'\u0394',
 Edin> r'\Theta' : u'\u0398', r'\Lambda' : u'\u039b', r'\Xi' :
 Edin> u'\u039e', r'\Pi' : u'\u03a0', r'\Sigma' : u'\u03a3',
 Edin> unicode_to_tex is straight forward. Am I on the right
 Edin> track? What should I do next?
Yes, this looks like the right approach. Once you have this
dictionary mostly working, you will need to try and make it work with
a set of unicode fonts. So instead of having the tex symbol point to
a file name and glyph index, you will need to parse a set of unicode
fonts to see which unicode symbols they provide and build a mapping
from unicode name -> file, glyph index. Then when you encounter a tex
symbol, you can use your tex_to_unicode dict combined with your
unicode -> filename, glyphindex dict to get the desired glyph.
 Edin> I also noticed that some TeX commands (commands in the sense
 Edin> that they can have arguments enclosed in brackets {}) are
 Edin> defined as only symbols: \sqrt alone, for example, displays
 Edin> just the begining of the square root:√, and \sqrt{123}
 Edin> triggers an error.
We don't have support for \sqrt{123} because we would need to do
something a little fancier (draw the horizontal line over 123). This
is doable and would be nice. To implement it, one approach would be to
add some basic drawing functionality to the freetype module, eg to
tell freetype to draw a line on its bitmap. Another approach would
simply be to grab the bitmap from freetype and pass it off to agg and
use the agg renderer to decorate it. This is probably preferable.
But I think this is a lower priority right now.
JDH
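To make the pyparsing mechanics described above concrete, here is a tiny standalone sketch with a toy grammar (not matplotlib's actual mathtext grammar); it only assumes the pyparsing package is installed:

from pyparsing import Group, Literal, Word, alphas, nums

def on_subscript(s, loc, toks):
    # called every time the 'subscript' pattern matches;
    # toks[0] is the Group: [base, '_', sub]
    base, underscore, sub = toks[0]
    print('matched subscript: base=%r sub=%r' % (base, sub))
    return toks

symbol = Word(alphas) | Word(nums)
subscript = Group(symbol + Literal('_') + symbol).setParseAction(on_subscript)

subscript.parseString('x_0')    # triggers on_subscript with ['x', '_', '0']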
From: John H. <jdh...@ac...> - 2006年06月15日 14:03:18
>>>>> "Eric" == Eric Firing <ef...@ha...> writes:
 Eric> Based on a quick look, I think it would be easy to make
 Eric> LineCollection and PolyCollection accept a numerix array in
 Eric> place of [(x,y), (x,y), ...] for each line segment or
 Eric> polygon; specifically, this could replaced by an N x 2
 Eric> array, where the first column would be x and the second
 Eric> would be y. Backwards compatibility could be maintained
 Eric> easily. This would eliminate quite a bit of useless
 Eric> conversion back and forth among lists, tuples, and arrays.
 Eric> As it is, each sequence of sequences is converted to a pair
 Eric> of arrays in backend_bases, and typically it started out as
 Eric> either a 2-D numerix array or a pair of 1-D arrays in the
 Eric> code that is calling the collection constructor.
I think this is a useful enhancement. I would think that representing
each segment as (x,y) where x and y are 1D arrays, might be slightly
more natural than using an Nx2 but others may disagree.
How often does it come up that we want a homogeneous line collection,
ie a bunch of lines segments with the same properties (color,
linewidth...)? The most expensive part of the agg line collection
renderer is probably the multiple calls to render_scanlines, which is
necessary every time we change the linewidth or color. 
If all of the lines in a collection shared the same properties, we
could draw the entire path with a combination of lineto/moveto, and
just stroke and render it once (agg has an upper limit on path length
though, since at some point I added the following to draw_lines
 if ((i%10000)==0) {
     //draw the path in chunks
     _render_lines_path(path, gc);
     path.remove_all();
     path.move_to(thisx, thisy);
 }
Ie I render it every 10000 points. 
Actually, as I type this I realize the case of homogeneous lines (and
polys) can be handled by the backend method "draw_path". One
possibility is for the LineCollection to detect the homogeneous case
len(linewidths)==1 and len(colors)==1 and call out to draw_path
instead of draw_line_collection (the same could be done for a regular
poly collection). Some extra extension code would probably be
necessary to build the path efficiently from numerix arrays, and to
handle the "chunking" problem to avoid extra long paths, but for
certain special cases (scatters and quiver w/o color mapping) it would
probably be a big win. The downside is that not all backend implement
draw_paths, but the Collection front-end could detect this and fall
back on the old approach if draw_paths is not implemented.
JDH
From: Cyril G. <cyr...@fr...> - 2006年06月15日 11:37:10
Hello,
I use matplotlib 0.87.3 on win32 with wxpython.
 From the command line, the numeric package is set correctly: numpy
(read from matplotlibrc) and all works fine.
I try to use matplotlib from pyxpcom (the connector between xpcom and
python in the mozilla world), but it seems matplotlib doesn't read
matplotlibrc, since it looks for the Numeric package, which is not installed.
I don't understand - is there a reason the matplotlibrc file is not read
and interpreted?
thanks a lot,
Cyril.
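One way to narrow this down is to ask matplotlib which rc file it actually read and what numerix it resolved to; a short diagnostic sketch, assuming the matplotlib_fname helper and the 'numerix' rc key of that era:

import matplotlib

print(matplotlib.matplotlib_fname())     # path of the matplotlibrc that was read
print(matplotlib.rcParams['numerix'])    # which array package was configured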
From: Jordan D. <jdawe@u.washington.edu> - 2006年06月15日 00:22:04
Eric Firing wrote:
> Jordan,
>
> I understand what you wrote. I am a bit worried about the amount of 
> complexity it would add to the collection code, however, and it seems 
> like it would be useful only in quite special situations--and in those 
> situations, there may be reasonable alternatives. For example, the ps 
> backend uses nan as a flag or separator to skip drawing a line 
> segment; if all backends did this, then it would provide a more 
> general way to accomplish what you want to do.
>
> I will keep your idea in mind, but I want to start off with a much 
> simpler change.
I would tend to agree that nan entries would be a better idea than what 
I was talking about. I'll think about trying to modify the backend 
codes to support this behaviour, if they don't already.
Jordan
From: Jordan D. <jdawe@u.washington.edu> - 2006年06月15日 00:03:16
I have one suggestion, slightly off-topic, and I'm not sure how useful 
it would be: you might think about making LineCollection accept a 3-D 
numerix array. This came up for me while I was looking at turning the 
quiver arrows into line segments. As I understand it (and as the 
documentation says) LineCollection takes a set of a lines which are 
composed of continuous line segments, like:
segments = ( line0, line1, line2 )
linen = ( (x0,y0), (x1,y1), (x2, y2) )
I'd like an extra level of organization, like so:
linegroups = ( group0, group1, group2)
groupi = ( line0, line1, line2 )
linen = ( (x0,y0), (x1,y1), (x2, y2) )
Where "linegroups" would be the input to LineCollection. I assume it's 
fairly obvious how this turns into a rank 3 array.
The reason for this is that it allows for the drawing of non-continuous 
lines. This came up with the quiver arrow stuff because, as it stands, 
drawing a line-based arrow requires you to back-track over a previous 
line at least once. This created some rendering problems, where the 
back-tracked line was darker than the others, at least on the agg 
backend. This can be fixed by backtracking along every line in the 
arrow, so you are essentially drawing the arrow twice, but that seems 
inefficient. It is possible to draw 3 separate line segments for each 
arrow, but then the colormapping no longer works; each line segment gets 
a different color, and the arrows look like a mess.
As I said, I don't know how useful this would be; it only comes up when 
drawing non-closed line segments that need to be addressed as a single 
object. Does what I wrote make sense?
Jordan
Eric Firing wrote:
> Based on a quick look, I think it would be easy to make LineCollection 
> and PolyCollection accept a numerix array in place of [(x,y), (x,y), 
> ...] for each line segment or polygon; specifically, this could replaced 
> by an N x 2 array, where the first column would be x and the second 
> would be y. Backwards compatibility could be maintained easily. This 
> would eliminate quite a bit of useless conversion back and forth among 
> lists, tuples, and arrays. As it is, each sequence of sequences is 
> converted to a pair of arrays in backend_bases, and typically it started 
> out as either a 2-D numerix array or a pair of 1-D arrays in the code 
> that is calling the collection constructor.
>
> Using a single 2-D array makes it easier to determine whether one is 
> dealing with 'old-style' inputs or 'new-style' inputs, but it might 
> still be reasonable to allow [X, Y] instead or in addition, where X and 
> Y are 1-D numerix arrays.
>
> Any objections or alternative suggestions?
>
> Eric
>
>
> _______________________________________________
> Matplotlib-devel mailing list
> Mat...@li...
> https://lists.sourceforge.net/lists/listinfo/matplotlib-devel
> 
From: Eric F. <ef...@ha...> - 2006年06月14日 23:34:08
Based on a quick look, I think it would be easy to make LineCollection 
and PolyCollection accept a numerix array in place of [(x,y), (x,y), 
...] for each line segment or polygon; specifically, this could be replaced 
by an N x 2 array, where the first column would be x and the second 
would be y. Backwards compatibility could be maintained easily. This 
would eliminate quite a bit of useless conversion back and forth among 
lists, tuples, and arrays. As it is, each sequence of sequences is 
converted to a pair of arrays in backend_bases, and typically it started 
out as either a 2-D numerix array or a pair of 1-D arrays in the code 
that is calling the collection constructor.
Using a single 2-D array makes it easier to determine whether one is 
dealing with 'old-style' inputs or 'new-style' inputs, but it might 
still be reasonable to allow [X, Y] instead or in addition, where X and 
Y are 1-D numerix arrays.
Any objections or alternative suggestions?
Eric
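A quick sketch of the compatibility being proposed, using numpy here purely for illustration (the message itself is about numerix): indexing and iteration look the same for both forms, and only in-place list operations differ:

import numpy

a = [(1, 2), (3, 4), (5, 6)]      # present style: sequence of (x, y) tuples
aa = numpy.array(a)               # proposed style: an N x 2 array

print(a[1][0], aa[1][0])          # element access works the same
for (x, y) in aa:                 # iteration also yields (x, y) pairs
    pass

a.append((7, 8))                  # fine for a list...
# aa.append((7, 8))               # ...but an ndarray has no append method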
From: Darren D. <dd...@co...> - 2006年06月14日 23:04:46
I'm making an errorbar plot with asymmetric errorbars. The docstring says:
xerr and yerr may be any of:
 a rank-0, Nx1 Numpy array - symmetric errorbars +/- value
 an N-element list or tuple - symmetric errorbars +/- value
 a rank-1, Nx2 Numpy array - asymmetric errorbars -column1/+column2
I think that last line should read:
	a 2xN Numpy array - asymmetric errorbars -row1/+row2
Darren
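For concreteness, the 2xN form described above might be used like this sketch (illustrative numbers, pylab-style call):

import numpy
from pylab import errorbar, show

x = numpy.arange(5)
y = x ** 2
yerr = [[0.5, 0.5, 0.5, 0.5, 0.5],    # row 1: lower (-) errors
        [1.0, 2.0, 1.0, 2.0, 1.0]]    # row 2: upper (+) errors
errorbar(x, y, yerr=yerr, fmt='o')
show()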
From: Robert H. <he...@ta...> - 2006年06月14日 14:08:33
How about 'plots'? (i.e., a bunch of plot commands)
Perhaps this could also be the function that loops through multiple
lines. For example in matlab you can do
plot (x, y)
plot (x', y')
to plot out a grid -- the columns (or is it the rows..) are looped
through inside the plot command. Is this the sort of thing you have
in mind for the new 'parplot' command?
-Rob
On Jun 14, 2006, at 1:06 AM, Gaël Varoquaux wrote:
> On Tue, Jun 13, 2006 at 05:33:57PM +0200, Gaël Varoquaux wrote:
>> I find that "parplot" is not a great name for such a function, but I
>> cannot think of a better one. Maybe the list will have better ideas.
>
> Talking to a friend we came up with a name like "pathcolor", or
> "pathplot".
>
> 	Gaël
>
>
> _______________________________________________
> Matplotlib-devel mailing list
> Mat...@li...
> https://lists.sourceforge.net/lists/listinfo/matplotlib-devel
-----
Rob Hetland, Assistant Professor
Dept of Oceanography, Texas A&M University
p: 979-458-0096, f: 979-845-6331
e: he...@ta..., w: http://pong.tamu.edu
From: V. <gae...@no...> - 2006年06月14日 06:06:53
On Tue, Jun 13, 2006 at 05:33:57PM +0200, Gaël Varoquaux wrote:
> I find that "parplot" is not a great name for such a function, but I
> cannot think of a better one. Maybe the list will have better ideas.
 Talking to a friend we came up with a name like "pathcolor", or
"pathplot".
	Gaël
From: Eric F. <ef...@ha...> - 2006年06月13日 20:08:18
Robert,
Thanks for the feedback. Comments are below. I think I addressed some 
items over the weekend, so the version in svn should work better than 
the one you tested, assuming you tested the original one I sent out as a 
diff.
Robert Hetland wrote:
> 
> Eric-
> 
> I had a chance to play around with the new quiver this morning. Here 
> are some thoughts:
> 
> 1. I read that somebody thought it was a bit slow, but this is not my 
> experience. I tried to quiver my model data (128x128 with masking), 
> and it rendered in a few seconds. I tried to look at images with more 
> arrows, but I could not actually see the arrows with so many points. 
> In my experience you can't put more than about O(100000) (i.e., about 
> 100x100) arrows on a figure and have it make any sense anyways.
> 
> 2. It seems to handle masking fine. I gave it masked arrays and 
> arrays with nans (it complained a bit about the masked array, but 
> rendered anyways). This is key for me.
> 
Interesting. I did nothing to explicitly deal with masked arrays or 
nans, so how well this works may be backend-dependent and/or platform 
dependent. If this turns out to be a problem, I can put in more 
explicit handling of masks and nans, but it is likely to add overhead.
Question: should handling of masks apply to all input arrays, or is it 
enough to have it apply to U, V, and C? It doesn't matter so much for 
Quiver, but it is one of the differences between pcolor and 
pcolormesh--support of masked X, Y complicates the former and might be 
hard to add to the latter. I hope it is unnecessary.
> 3. It handles colors nicely, but there is no apparent way to set the 
> color limits. Perhaps adding vmin and vmax as kwargs (matching pcolor) 
> would make some sense.
This brings up a more general design question: when should functionality 
like this be added via kwargs versus other mechanisms? It would be nice 
to have more uniformity among classes and functions, so that the general 
strategy could be documented once for a whole bunch of functions, 
instead of being repeated, with variations, for each. This would go 
along with better factoring out of chunks of functionality. In the 
absence of a big push, all this is going to have to happen piecemeal if 
at all, and complicated by the demands of backwards compatibility.
Returning to the specific question, however, I left vmin and vmax out to 
keep things simple, with the idea that there are other standard ways of 
setting them: via an explicit norm kwarg, via the clim method of any 
scalar mappable (which the Quiver is, since it inherits from 
PolyCollection), and via the pylab clim command. If there is a 
consensus that scalar mappables should uniformly support the vmin and 
vmax kwargs directly, then I don't mind adding that. I can see the 
argument for uniformity, given that this is common among scalar 
mappables. Probably it could be factored out by inclusion in 
ScalarMappable.__init__.
> 
> 4. I *really* like how the arrows get gray (and don't try to render at 
> a whole pixel) when they get small. I agree with the other person that 
> it might be nice to have a small dot to indicate zero velocity. No 
> dot should be rendered *ever* for values masked or NaNed, however.
> 
The graying is a function of the backend--specifically, whether 
rendering is antialiased.
The optional dot (actually a hexagon) for arrows below a threshold size 
is in svn, one of the changes I made over the weekend.
> 5. It's a bit of a pain to find values of scale and width that work 
> well, and quiver doesn't seem to make very good choices by default. I 
> don't think this is a big deal, but rather simply the price of the 
> added functionality. Making some more intelligent default choices 
> might not be a bad idea, though. In particular, smaller arrows when 
> there are more arrows to be rendered would be a good place to start.
> 
I changed the autoscaling so that both the length scale and the shaft 
width (and therefore the head size, which scales with shaft width) vary 
with the square root of the number of arrows, but with limits to prevent 
arrows from getting too large and fat or too small and thin. So I think 
you will find the svn version much improved in this regard.
> Some ideas for future work:
> 
> 1. I'm pretty happy with the polygons, but it would be nice to have 
> line collections instead. This would also facilitate my next idea:
> 
Not "line collections instead", but "in addition": the functionality and 
appearance are very different with polygons than with lines. I think I 
can add a line alternative with only a little extra code, but I won't 
even look at it again until this weekend, at the very earliest.
> 2. I would love to have a 'curly vector' tool. However, I'm not sure 
> how much to put into the curly_quiver package, and how much work must 
> be done by the user. I think that at a minimum, it would be nice to 
> give curly lines (i.e., an additional dimension to u and v)that have 
> arrows on the end of them, and leave it to the user to define what 
> those lines are somehow. If these lines could change color along their 
> track, then it would be the best thing ever made!
You can make curly lines with mapped colors using a LineCollection 
(recent change: it now inherits from ScalarMappable). Curly_quiver 
sounds like quite a different animal, though, and potentially much more 
difficult to do well; I don't think there would be much overlap with the 
present code in quiver, or with the code even after I add the line version.
Eric
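For reference, the existing routes Eric mentions for setting color limits look roughly like this sketch (random illustrative data):

import numpy
from pylab import quiver

u = numpy.random.randn(10, 10)
v = numpy.random.randn(10, 10)
c = numpy.hypot(u, v)

q = quiver(u, v, c)     # the Quiver object is a ScalarMappable (via PolyCollection)
q.set_clim(0, 2)        # explicit color limits on the returned object
# or use an explicit norm kwarg, or the pylab-level clim() command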
From: Jeff W. <js...@fa...> - 2006年06月13日 17:14:00
With the latest svn matplotlib and numpy 0.9.8 I'm now getting:
Python 2.4.3 (#1, Mar 30 2006, 13:31:07)
[GCC 4.0.1 (Apple Computer, Inc. build 5247)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
 >>> import pylab
Traceback (most recent call last):
 File "<stdin>", line 1, in ?
 File "/Users/jsw/lib/python/pylab.py", line 1, in ?
 from matplotlib.pylab import *
 File "/Users/jsw/lib/python/matplotlib/pylab.py", line 196, in ?
 import cm
 File "/Users/jsw/lib/python/matplotlib/cm.py", line 5, in ?
 import colors
 File "/Users/jsw/lib/python/matplotlib/colors.py", line 33, in ?
 from numerix import array, arange, take, put, Float, Int, where, \
 File "/Users/jsw/lib/python/matplotlib/numerix/__init__.py", line 66, in ?
 import numpy.oldnumeric as numpy
ImportError: No module named oldnumeric
With numpy 0.9.8 I can do
from numpy.core import oldnumeric
but not
from numpy import oldnumeric.
I suppose this is a consequence of Travis's numerix commits yesterday - 
is the latest numpy svn now required?
-Jeff
-- 
Jeffrey S. Whitaker Phone : (303)497-6313
Meteorologist FAX : (303)497-6449
NOAA/OAR/PSD R/PSD1 Email : Jef...@no...
325 Broadway Office : Skaggs Research Cntr 1D-124
Boulder, CO, USA 80303-3328 Web : http://tinyurl.com/5telg
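Based only on the observation above (numpy 0.9.8 exposes oldnumeric under numpy.core), a version-tolerant import for numerix might look like the sketch below; whether requiring a newer numpy instead is the better fix is exactly the open question. Note that this reflects the numpy layouts of that era; oldnumeric has long since been removed from numpy.

try:
    import numpy.oldnumeric as oldnumeric      # newer numpy layout
except ImportError:
    from numpy.core import oldnumeric          # numpy 0.9.8 layout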
From: V. <gae...@no...> - 2006年06月13日 15:34:08
On Tue, Jun 13, 2006 at 09:21:44AM -0500, John Hunter wrote:
> I'm not at all opposed to making a helper function like scatter to plot
> parametric lines with colormaps, but I don't think "plot" is the right
> vehicle, since it returns a Line2d, not a LineCollection, and since it
> is already heavily overloaded. parplot?
 I think having a helper function is a good idea. Not only does it
help the newcomer, but it also allows the oldtimer to write shorter and
more readable code.
 I find that "parplot" is not a great name for such a function, but I
cannot think of a better one. Maybe the list will have better ideas.
	Gaël
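The kind of thing such a helper would wrap, done by hand with a LineCollection (which, per Eric's note elsewhere in this thread, now inherits from ScalarMappable); a sketch with illustrative data:

import numpy
from pylab import gca, show
from matplotlib.collections import LineCollection

t = numpy.linspace(0, 2 * numpy.pi, 200)
x, y = numpy.cos(t), numpy.sin(3 * t)

points = numpy.column_stack([x, y])
segments = [points[i:i + 2] for i in range(len(points) - 1)]

lc = LineCollection(segments)
lc.set_array(t[:-1])                 # colormap the parameter along the path
ax = gca()
ax.add_collection(lc)
ax.set_xlim(-1.1, 1.1)
ax.set_ylim(-1.1, 1.1)
show()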
From: Robert H. <he...@ta...> - 2006年06月13日 15:14:00
Eric-
I had a chance to play around with the new quiver this morning. Here 
are some thoughts:
1. I read that somebody thought it was a bit slow, but this is not my 
experience. I tried to quiver my model data (128x128 with masking), 
and it rendered in a few seconds. I tried to look at images with 
more arrows, but I could not actually see the arrows with so many 
points. In my experience you can't put more than about O(100000) 
(i.e., about 100x100) arrows on a figure and have it make any sense 
anyways.
2. It seems to handle masking fine. I gave it masked arrays and 
arrays with nans (it complained a bit about the masked array, but 
rendered anyways). This is key for me.
3. It handles colors nicely, but there is no apparent way to set the 
color limits. Perhaps adding vmin and vmax as kwargs (matching 
pcolor) would make some sense.
4. I *really* like how the arrows get gray (and don't try to render 
at a whole pixel) when they get small. I agree with the other person 
that it might be nice to have a small dot to indicate zero 
velocity. No dot should be rendered *ever* for values masked or 
NaNed, however.
5. It's a bit of a pain to find values of scale and width that work 
well, and quiver doesn't seem to make very good choices by default. 
I don't think this is a big deal, but rather simply the price of the 
added functionality. Making some more intelligent default choices 
might not be a bad idea, though. In particular, smaller arrows when 
there are more arrows to be rendered would be a good place to start.
Some ideas for future work:
1. I'm pretty happy with the polygons, but it would be nice to have 
line collections instead. This would also facilitate my next idea:
2. I would love to have a 'curly vector' tool. However, I'm not sure 
how much to put into the curly_quiver package, and how much work must 
be done by the user. I think that at a minimum, it would be nice to 
give curly lines (i.e., an additional dimension to u and v) that have 
arrows on the end of them, and leave it to the user to define what 
those lines are somehow. If these lines could change color along 
their track, then it would be the best thing ever made!
-Rob.
-----
Rob Hetland, Assistant Professor
Dept of Oceanography, Texas A&M University
p: 979-458-0096, f: 979-845-6331
e: he...@ta..., w: http://pong.tamu.edu
