matplotlib-devel

From: Ryan M. <rm...@gm...> - 2010-03-26 18:13:19
Eric,
I just hit a problem with using quiver with Basemap when
angles='xy'. Because Basemap's x,y units are in meters, you end up
with angles that are quantized due to floating point truncation
(30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
problem, but it probably should be automatically scaled, as noted in
the comments:
    elif self.angles == 'xy' or self.scale_units == 'xy':
        # We could refine this by calculating eps based on
        # the magnitude of U, V relative to that of X, Y,
        # to ensure we are always making small shifts in X, Y.

I managed to fix the problem locally by setting:

    angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
but I'm not sure if you would want a different fix. If you're happy
with this fix, I'll go ahead and check it in.
Ryan
-- 
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
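
The quantization Ryan describes is easy to reproduce. A minimal sketch,
using single-precision (float32) values purely for illustration; in double
precision the same collapse happens at proportionally larger coordinates:

    import numpy as np

    x = np.float32(30000.0)   # Basemap-style coordinate in meters
    u = np.float32(0.5)       # a modest vector component
    eps = np.float32(0.001)   # fixed finite-difference step

    # The shift eps*u is below the float resolution at this magnitude,
    # so it is rounded away and the finite-difference angle collapses.
    print(x + eps * u == x)   # True

Since x + eps*u and x compare equal, the dx used to compute the arrow
angle is exactly zero.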
From: Ryan M. <rm...@gm...> - 2010-04-02 02:15:11
Ping. Not sure if you missed it the first time around or are just that busy.
Ryan
On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
> Eric,
>
> I just hit a problem with using quiver with Basemap when
> angles='xy'. Because Basemap's x,y units are in meters, you end up
> with angles that are quantized due to floating point truncation
> (30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
> problem, but it probably should be automatically scaled, as noted in
> the comments:
>
>     elif self.angles == 'xy' or self.scale_units == 'xy':
>         # We could refine this by calculating eps based on
>         # the magnitude of U, V relative to that of X, Y,
>         # to ensure we are always making small shifts in X, Y.
>
> I managed to fix the problem locally by setting:
>
>     angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
>
> but I'm not sure if you would want a different fix. If you're happy
> with this fix, I'll go ahead and check it in.
Ryan
-- 
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: Eric F. <ef...@ha...> - 2010-04-02 07:24:10
Ryan May wrote:
> Ping. Not sure if you missed it the first time around or are just that busy.
> 
I looked, but decided I needed to look again, and then lost it in the 
stack. See below.
> Ryan
> 
> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>> Eric,
>>
>> I just hit a problem with using quiver with Basemap when
>> angles='xy'. Because Basemap's x,y units are in meters, you end up
>> with angles that are quantized due to floating point truncation
>> (30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
>> problem, but it probably should be automatically scaled, as noted in
>> the comments:
>>
>>     elif self.angles == 'xy' or self.scale_units == 'xy':
>>         # We could refine this by calculating eps based on
>>         # the magnitude of U, V relative to that of X, Y,
>>         # to ensure we are always making small shifts in X, Y.
>>
>> I managed to fix the problem locally by setting:
>>
>>     angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
>>
I don't think this will work in all cases. For example, there could be 
a single arrow at (0,0).
Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
Eric
>> but I'm not sure if you would want a different fix. If you're happy
>> with this fix, I'll go ahead and check it in.
> 
> Ryan
> 
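A quick sketch of the failure mode Eric points out: with a single arrow at
the origin, self.XY.max() is zero, so the derived eps is zero and the
finite-difference shift vanishes entirely (XY below is a stand-in for the
quiver's internal coordinate array):

    import numpy as np

    XY = np.array([[0.0, 0.0]])     # one arrow at the origin
    eps = 0.0001 * XY.max()
    print(eps)                      # 0.0

    # With eps == 0, the shifted and unshifted positions coincide, so
    # dx = dy = 0 and every angle degenerates to arctan2(0, 0) == 0.

A max over all-negative coordinates would similarly give a useless
(negative) eps.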
From: Ryan M. <rm...@gm...> - 2010-04-02 14:02:57
On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>> I just hit a problem with using quiver with Basemap when
>>> angles='xy'. Because Basemap's x,y units are in meters, you end up
>>> with angles that are quantized due to floating point truncation
>>> (30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
>>> problem, but it probably should be automatically scaled, as noted in
>>> the comments:
>>>
>>>     elif self.angles == 'xy' or self.scale_units == 'xy':
>>>         # We could refine this by calculating eps based on
>>>         # the magnitude of U, V relative to that of X, Y,
>>>         # to ensure we are always making small shifts in X, Y.
>>>
>>> I managed to fix the problem locally by setting:
>>>
>>>     angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
>>>
>
> I don't think this will work in all cases. For example, there could be a
> single arrow at (0,0).
Good point.
> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
Wouldn't this have problems if we zoom in sufficiently that the width
is much less than the magnitude of the values? Not exactly sure what data
set would sensibly yield this, so I'm not sure if we should worry
about it.
If we do care, we could just put a minimum bound on eps:
eps=max(1e-8, 0.0001 * self.XY.max())
Ryan
-- 
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: Eric F. <ef...@ha...> - 2010-04-02 17:42:27
Ryan May wrote:
> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>> I just hit a problem with using quiver with Basemap when
>>>> angles='xy'. Because Basemap's x,y units are in meters, you end up
>>>> with angles that are quantized due to floating point truncation
>>>> (30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
>>>> problem, but it probably should be automatically scaled, as noted in
>>>> the comments:
>>>>
>>>>     elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>         # We could refine this by calculating eps based on
>>>>         # the magnitude of U, V relative to that of X, Y,
>>>>         # to ensure we are always making small shifts in X, Y.
>>>>
>>>> I managed to fix the problem locally by setting:
>>>>
>>>>     angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
>>>>
>> I don't think this will work in all cases. For example, there could be a
>> single arrow at (0,0).
> 
> Good point.
> 
>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
> 
> Wouldn't this have problems if we zoom in sufficiently that the width
> is much less than the magnitude of the values? Not exactly sure what data
> set would sensibly yield this, so I'm not sure if we should worry
> about it.
> 
> If we do care, we could just put a minimum bound on eps:
> 
> eps=max(1e-8, 0.0001 * self.XY.max())
I don't like taking the max of a potentially large array every time; and
one needs the maximum absolute value in any case. I think the following is better:
eps = np.abs(self.ax.dataLim.extents).max() * 0.001
Eric
> 
> Ryan
> 
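For reference, dataLim.extents is the four-element array [x0, y0, x1, y1]
of the axes' data bounds, so Eric's expression is cheap and independent of
the number of arrows. A small sketch of what it evaluates to:

    import numpy as np
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    ax.plot([30000.0, 32000.0], [10000.0, 11000.0])   # populate ax.dataLim

    # The max absolute extent tracks the coordinate magnitude without
    # scanning the potentially large X, Y arrays.
    eps = np.abs(ax.dataLim.extents).max() * 0.001
    print(eps)                                        # 32.0 for these bounds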
From: Ryan M. <rm...@gm...> - 2010-04-02 18:05:38
On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <ef...@ha...> wrote:
> Ryan May wrote:
>>
>> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>>>
>>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>>>
>>>>> I just hit a problem with using quiver with Basemap when
>>>>> angles='xy'. Because Basemap's x,y units are in meters, you end up
>>>>> with angles that are quantized due to floating point truncation
>>>>> (30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
>>>>> problem, but it probably should be automatically scaled, as noted in
>>>>> the comments:
>>>>>
>>>>>     elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>>         # We could refine this by calculating eps based on
>>>>>         # the magnitude of U, V relative to that of X, Y,
>>>>>         # to ensure we are always making small shifts in X, Y.
>>>>>
>>>>> I managed to fix the problem locally by setting:
>>>>>
>>>>>     angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
>>>>>
>>> I don't think this will work in all cases. For example, there could be a
>>> single arrow at (0,0).
>>
>> Good point.
>>
>>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>>
>> Wouldn't this have problems if we zoom in sufficiently that the width
>> is much less than the magnitude of the values? Not exactly sure what data
>> set would sensibly yield this, so I'm not sure if we should worry
>> about it.
>>
>> If we do care, we could just put a minimum bound on eps:
>>
>> eps=max(1e-8, 0.0001 * self.XY.max())
>
> I don't like taking the max of a potentially large array every time; and one
> needs the maximum absolute value in any case. I think the following is better:
>
> eps = np.abs(self.ax.dataLim.extents).max() * 0.001
I hadn't thought about performance. I think that's more important
than any worries about bounds being disproportionately smaller. I'll
check this in.
Ryan
-- 
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: Eric F. <ef...@ha...> - 2010-04-02 18:46:06
Ryan May wrote:
> On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <ef...@ha...> wrote:
>> Ryan May wrote:
>>> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>>>> I just hit a problem with using quiver with Basemap when
>>>>>> angles='xy'. Because Basemap's x,y units are in meters, you end up
>>>>>> with angles that are quantized due to floating point truncation
>>>>>> (30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
>>>>>> problem, but it probably should be automatically scaled, as noted in
>>>>>> the comments:
>>>>>>
>>>>>>     elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>>>         # We could refine this by calculating eps based on
>>>>>>         # the magnitude of U, V relative to that of X, Y,
>>>>>>         # to ensure we are always making small shifts in X, Y.
>>>>>>
>>>>>> I managed to fix the problem locally by setting:
>>>>>>
>>>>>>     angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
>>>>>>
>>>> I don't think this will work in all cases. For example, there could be a
>>>> single arrow at (0,0).
>>> Good point.
>>>
>>>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>>> Wouldn't this have problems if we zoom in sufficiently that the width
>>> is much less than the magnitude of the values? Not exactly sure what data
>>> set would sensibly yield this, so I'm not sure if we should worry
>>> about it.
>>>
>>> If we do care, we could just put a minimum bound on eps:
>>>
>>> eps=max(1e-8, 0.0001 * self.XY.max())
>> I don't like taking the max of a potentially large array every time; and one
>> needs the maximum absolute value in any case. I think the following is better:
>>
>> eps = np.abs(self.ax.dataLim.extents).max() * 0.001
> 
> I hadn't thought about performance. I think that's more important
> than any worries about bounds being disproportionately smaller. I'll
> check this in.
Sorry for the piecemeal approach in thinking about this, but now I
realize that to do this right, as indicated by the comment in the
original code, we need to take the magnitude of U and V into account.
The maximum magnitude could be calculated once in set_UVC and then saved
so that it does not have to be recalculated every time it is used in
make_verts.
Maybe I am still missing some simpler way to handle this well.
Eric
> 
> Ryan
> 
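One way to read Eric's suggestion, as a hypothetical sketch rather than
the actual Quiver code (QuiverSketch and its methods are illustrative
stand-ins): compute the maximum vector magnitude once per data update in
set_UVC and reuse it in make_verts, so that eps*U and eps*V stay small
relative to the X, Y span:

    import numpy as np

    class QuiverSketch:
        def set_UVC(self, U, V):
            self.U = np.asarray(U)
            self.V = np.asarray(V)
            # computed once per data update, not on every draw
            self.Umax = np.hypot(self.U, self.V).max()

        def make_verts(self, datalim_extents):
            # Scale eps by the X, Y span relative to the U, V magnitude,
            # per the comment in the original code; guard against Umax == 0.
            span = np.abs(datalim_extents).max()
            eps = 0.001 * span / max(self.Umax, 1e-12)
            # ... the real code would go on to compute angles and lengths
            return eps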