Eric,

I just hit a problem with using quiver with Basemap when angles='xy'.
Because Basemap's x,y units are in meters, you end up with angles that
are quantized due to floating point truncation (30000. + 0.001*u =
30000.). Changing to angles='uv' fixes the problem, but it probably
should be automatically scaled, as noted in the comments:

        elif self.angles == 'xy' or self.scale_units == 'xy':
            # We could refine this by calculating eps based on
            # the magnitude of U, V relative to that of X, Y,
            # to ensure we are always making small shifts in X, Y.

I managed to fix the problem locally by setting:

    angles, lengths = self._angles_lengths(U, V,
                                           eps=0.0001 * self.XY.max())

but I'm not sure if you would want a different fix. If you're happy
with this fix, I'll go ahead and check it in.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
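A minimal sketch of the absorption Ryan describes, with made-up numbers
(single precision is used so the effect shows up at the 30000-metre scale
quoted above; the same thing happens in double precision at
correspondingly larger coordinates):

    import numpy as np

    # Illustrative only, not the actual quiver internals: a fixed shift
    # of eps * u is completely absorbed by a much larger coordinate.
    x = np.float32(30000.0)    # a Basemap-style coordinate in metres
    u = np.float32(0.5)        # a modest wind component in m/s
    eps = np.float32(0.001)    # the small fixed step in Ryan's example
    print((x + eps * u) == x)  # True: the shift is lost to rounding

    # Scaling eps to the coordinate magnitude keeps the shift representable.
    eps_scaled = np.float32(0.0001) * np.abs(x)
    print((x + eps_scaled * u) == x)   # False: the shift survives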
Ping. Not sure if you missed it the first time around or are just that busy.

Ryan

On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
> Eric,
>
> I just hit a problem with using quiver with Basemap when angles='xy'.
> Because Basemap's x,y units are in meters, you end up with angles that
> are quantized due to floating point truncation (30000. + 0.001*u =
> 30000.). Changing to angles='uv' fixes the problem, but it probably
> should be automatically scaled, as noted in the comments:
>
>         elif self.angles == 'xy' or self.scale_units == 'xy':
>             # We could refine this by calculating eps based on
>             # the magnitude of U, V relative to that of X, Y,
>             # to ensure we are always making small shifts in X, Y.
>
> I managed to fix the problem locally by setting:
>
>     angles, lengths = self._angles_lengths(U, V,
>                                            eps=0.0001 * self.XY.max())
>
> but I'm not sure if you would want a different fix. If you're happy
> with this fix, I'll go ahead and check it in.
>
> Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
Ryan May wrote:
> Ping. Not sure if you missed it the first time around or are just that busy.

I looked, but decided I needed to look again, and then lost it in the
stack. See below.

> Ryan
>
> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>> Eric,
>>
>> I just hit a problem with using quiver with Basemap when angles='xy'.
>> Because Basemap's x,y units are in meters, you end up with angles that
>> are quantized due to floating point truncation (30000. + 0.001*u =
>> 30000.). Changing to angles='uv' fixes the problem, but it probably
>> should be automatically scaled, as noted in the comments:
>>
>>         elif self.angles == 'xy' or self.scale_units == 'xy':
>>             # We could refine this by calculating eps based on
>>             # the magnitude of U, V relative to that of X, Y,
>>             # to ensure we are always making small shifts in X, Y.
>>
>> I managed to fix the problem locally by setting:
>>
>>     angles, lengths = self._angles_lengths(U, V,
>>                                            eps=0.0001 * self.XY.max())

I don't think this will work in all cases. For example, there could be
a single arrow at (0,0). Instead of self.XY.max(), how about
abs(self.ax.dataLim.width)?

Eric

>> but I'm not sure if you would want a different fix. If you're happy
>> with this fix, I'll go ahead and check it in.
>
> Ryan
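To make Eric's (0,0) counterexample concrete, plus one more case of the
same kind (illustrative arrays only, not code from quiver):

    import numpy as np

    # Single arrow at the origin: XY.max() is 0, so eps = 0 and the
    # finite-difference angle calculation gets no shift at all.
    XY = np.array([[0.0, 0.0]])
    print(0.0001 * XY.max())      # 0.0

    # A grid entirely at negative coordinates: XY.max() is negative,
    # so eps is negative and the shift goes the wrong way.
    XY = np.array([[-30000.0, -20000.0],
                   [-29000.0, -21000.0]])
    print(0.0001 * XY.max())      # -2.0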
On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>> I just hit a problem with using quiver with Basemap when angles='xy'.
>>> Because Basemap's x,y units are in meters, you end up with angles that
>>> are quantized due to floating point truncation (30000. + 0.001*u =
>>> 30000.). Changing to angles='uv' fixes the problem, but it probably
>>> should be automatically scaled, as noted in the comments:
>>>
>>>         elif self.angles == 'xy' or self.scale_units == 'xy':
>>>             # We could refine this by calculating eps based on
>>>             # the magnitude of U, V relative to that of X, Y,
>>>             # to ensure we are always making small shifts in X, Y.
>>>
>>> I managed to fix the problem locally by setting:
>>>
>>>     angles, lengths = self._angles_lengths(U, V,
>>>                                            eps=0.0001 * self.XY.max())
>
> I don't think this will work in all cases. For example, there could be
> a single arrow at (0,0).

Good point.

> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?

Wouldn't this have problems if we zoom in sufficiently that the width is
much less than the magnitude of the values? I'm not exactly sure what
data set would sensibly yield this, so I'm not sure if we should worry
about it.

If we do care, we could just put a minimum bound on eps:

    eps = max(1e-8, 0.0001 * self.XY.max())

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
Ryan May wrote:
> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>> I just hit a problem with using quiver with Basemap when angles='xy'.
>>>> Because Basemap's x,y units are in meters, you end up with angles that
>>>> are quantized due to floating point truncation (30000. + 0.001*u =
>>>> 30000.). Changing to angles='uv' fixes the problem, but it probably
>>>> should be automatically scaled, as noted in the comments:
>>>>
>>>>         elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>             # We could refine this by calculating eps based on
>>>>             # the magnitude of U, V relative to that of X, Y,
>>>>             # to ensure we are always making small shifts in X, Y.
>>>>
>>>> I managed to fix the problem locally by setting:
>>>>
>>>>     angles, lengths = self._angles_lengths(U, V,
>>>>                                            eps=0.0001 * self.XY.max())
>>>>
>> I don't think this will work in all cases. For example, there could be
>> a single arrow at (0,0).
>
> Good point.
>
>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>
> Wouldn't this have problems if we zoom in sufficiently that the width is
> much less than the magnitude of the values? I'm not exactly sure what
> data set would sensibly yield this, so I'm not sure if we should worry
> about it.
>
> If we do care, we could just put a minimum bound on eps:
>
>     eps = max(1e-8, 0.0001 * self.XY.max())

I don't like taking the max of a potentially large array every time,
and one needs the maximum absolute value in any case. I think the
following is better:

    eps = np.abs(self.ax.dataLim.extents).max() * 0.001

Eric

> Ryan
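A rough illustration of the scaling Eric proposes (a sketch, not quiver
itself; the plotted numbers are made up). dataLim.extents is the
(x0, y0, x1, y1) array of the Axes' data bounding box, so its largest
absolute value tracks the coordinate magnitude without scanning the full
X, Y arrays:

    import numpy as np
    import matplotlib.pyplot as plt

    fig, ax = plt.subplots()
    # Stand-in for Basemap-scale data: coordinates of order 1e6 metres.
    ax.plot([1.0e6, 2.0e6], [3.0e6, 4.0e6])

    # dataLim.extents == array([x0, y0, x1, y1]) of the data bounding box.
    eps = np.abs(ax.dataLim.extents).max() * 0.001
    print(eps)   # 4000.0 here: a shift well above rounding noise at 1e6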
On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <ef...@ha...> wrote:
> Ryan May wrote:
>> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>>> I just hit a problem with using quiver with Basemap when angles='xy'.
>>>>> Because Basemap's x,y units are in meters, you end up with angles that
>>>>> are quantized due to floating point truncation (30000. + 0.001*u =
>>>>> 30000.). Changing to angles='uv' fixes the problem, but it probably
>>>>> should be automatically scaled, as noted in the comments:
>>>>>
>>>>>         elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>>             # We could refine this by calculating eps based on
>>>>>             # the magnitude of U, V relative to that of X, Y,
>>>>>             # to ensure we are always making small shifts in X, Y.
>>>>>
>>>>> I managed to fix the problem locally by setting:
>>>>>
>>>>>     angles, lengths = self._angles_lengths(U, V,
>>>>>                                            eps=0.0001 * self.XY.max())
>>>>>
>>> I don't think this will work in all cases. For example, there could be
>>> a single arrow at (0,0).
>>
>> Good point.
>>
>>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>>
>> Wouldn't this have problems if we zoom in sufficiently that the width is
>> much less than the magnitude of the values? I'm not exactly sure what
>> data set would sensibly yield this, so I'm not sure if we should worry
>> about it.
>>
>> If we do care, we could just put a minimum bound on eps:
>>
>>     eps = max(1e-8, 0.0001 * self.XY.max())
>
> I don't like taking the max of a potentially large array every time,
> and one needs the maximum absolute value in any case. I think the
> following is better:
>
>     eps = np.abs(self.ax.dataLim.extents).max() * 0.001

I hadn't thought about performance. I think that's more important than
any worry about the bounds being disproportionately small. I'll check
this in.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
Ryan May wrote:
> On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <ef...@ha...> wrote:
>> Ryan May wrote:
>>> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>>>> I just hit a problem with using quiver with Basemap when angles='xy'.
>>>>>> Because Basemap's x,y units are in meters, you end up with angles that
>>>>>> are quantized due to floating point truncation (30000. + 0.001*u =
>>>>>> 30000.). Changing to angles='uv' fixes the problem, but it probably
>>>>>> should be automatically scaled, as noted in the comments:
>>>>>>
>>>>>>         elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>>>             # We could refine this by calculating eps based on
>>>>>>             # the magnitude of U, V relative to that of X, Y,
>>>>>>             # to ensure we are always making small shifts in X, Y.
>>>>>>
>>>>>> I managed to fix the problem locally by setting:
>>>>>>
>>>>>>     angles, lengths = self._angles_lengths(U, V,
>>>>>>                                            eps=0.0001 * self.XY.max())
>>>>>>
>>>> I don't think this will work in all cases. For example, there could be
>>>> a single arrow at (0,0).
>>>
>>> Good point.
>>>
>>>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>>>
>>> Wouldn't this have problems if we zoom in sufficiently that the width is
>>> much less than the magnitude of the values? I'm not exactly sure what
>>> data set would sensibly yield this, so I'm not sure if we should worry
>>> about it.
>>>
>>> If we do care, we could just put a minimum bound on eps:
>>>
>>>     eps = max(1e-8, 0.0001 * self.XY.max())
>>
>> I don't like taking the max of a potentially large array every time,
>> and one needs the maximum absolute value in any case. I think the
>> following is better:
>>
>>     eps = np.abs(self.ax.dataLim.extents).max() * 0.001
>
> I hadn't thought about performance. I think that's more important than
> any worry about the bounds being disproportionately small. I'll check
> this in.

Sorry for the piecemeal approach in thinking about this, but now I
realize that to do this right, as indicated by the comment in the
original code, we need to take the magnitude of U and V into account.
The maximum magnitude could be calculated once in set_UVC and then
saved, so that it does not have to be recalculated every time it is
used in make_verts. Maybe I am still missing some simpler way to handle
this well.

Eric

> Ryan
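One possible shape for the caching Eric describes, as a rough sketch
rather than the change that was eventually committed (the names
_max_magnitude and _data_eps are made up for illustration):

    import numpy as np

    # Sketch only. In set_UVC, after U and V are stored, the largest
    # vector magnitude could be cached once:
    #     self._max_magnitude = float(np.ma.sqrt(U**2 + V**2).max())
    # make_verts could then pick a step that shifts X, Y by a small,
    # fixed fraction of the data extent, whatever the size of U, V:

    def _data_eps(datalim_extents, max_magnitude, frac=0.001):
        """Step size such that eps * |U| is about frac of the data
        extent for the largest arrow."""
        span = np.abs(np.asarray(datalim_extents)).max()
        if max_magnitude == 0 or span == 0:
            return 1e-8    # degenerate data; fall back to a tiny step
        return frac * span / max_magnitude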