 # Problem with Quiver+Basemap

Eric,

I just hit a problem with using quiver with Basemap when
angles='xy'. Because Basemap's x,y units are in meters, you end up
with angles that are quantized due to floating point truncation
(30000. + 0.001*u = 30000.). Changing to angles='uv' fixes the
problem, but it probably should be automatically scaled, as noted in:

```python
elif self.angles == 'xy' or self.scale_units == 'xy':
    # We could refine this by calculating eps based on
    # the magnitude of U, V relative to that of X, Y,
    # to ensure we are always making small shifts in X, Y.
```
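For concreteness, the quantization can be reproduced with a small float32 sketch (the values are illustrative; this is not matplotlib's code):

```python
import numpy as np

# Around x = 30000 the spacing between adjacent float32 values is about
# 0.00195, so a probe shift of eps * u = 0.001 * u either vanishes or
# snaps to a whole ULP, which is what quantizes the angles.
x = np.float32(30000.0)
print(np.spacing(x))                  # 0.001953125
for u in (0.2, 0.4, 0.8, 1.0):
    shift = np.float32(0.001) * np.float32(u)
    dx = (x + shift) - x              # what survives of the shift
    print(u, dx)                      # 0.0 for all but u = 1.0
```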

I managed to fix the problem locally by setting:

```python
angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())
```

but I'm not sure if you would want a different fix. If you're happy
with this fix, I'll go ahead and check it in.

Ryan

···

--
Ryan May
School of Meteorology
University of Oklahoma

Ping. Not sure if you missed it first time around or are just that busy.

Ryan

···

On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rmay31@...149...> wrote:


Ryan May wrote:

Ping. Not sure if you missed it first time around or are just that busy.

I looked, but decided I needed to look again, and then lost it in the stack. See below.


I managed to fix the problem locally by setting:

angles, lengths = self._angles_lengths(U, V, eps=0.0001 * self.XY.max())

I don't think this will work in all cases. For example, there could be a single arrow at (0,0).
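That failure case is easy to see with a short sketch (illustrative values, not the quiver code itself):

```python
import numpy as np

# A lone arrow at the origin makes XY.max() zero, so an eps proportional
# to it collapses; the probe point coincides with the base point and the
# direction information is lost (arctan2(0, 0) is defined as 0).
XY = np.array([[0.0, 0.0]])           # single arrow at (0, 0)
eps = 0.0001 * XY.max()               # 0.0: no shift at all
u, v = 1.0, 1.0
angle = np.arctan2(eps * v, eps * u)  # 0.0 instead of 45 degrees
print(eps, angle)
```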

Eric

···



I don't think this will work in all cases. For example, there could be a
single arrow at (0,0).

Good point.

Wouldn't this have problems if we zoom in sufficiently that the width
is much less than the magnitude of the values? I'm not exactly sure
what data set would sensibly yield this, so I'm not sure we should
worry about it.

If we do care, we could just put a minimum bound on eps:

```python
eps = max(1e-8, 0.0001 * self.XY.max())
```

Ryan

···

On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <efiring@...229...> wrote:


Ryan May wrote:


If we do care, we could just put a minimum bound on eps:

eps=max(1e-8, 0.0001 * self.XY.max())

I don't like taking the max of a potentially large array every time; and one needs the max absolute value in any case. I think the following is better:

```python
eps = np.abs(self.ax.dataLim.extents).max() * 0.001
```
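A sketch of what that scaling buys; here `extents` stands in for `ax.dataLim.extents`, i.e. (x0, y0, x1, y1), and the numbers are made up:

```python
import numpy as np

# eps derived from the data limits is large enough to survive rounding
# even for Basemap-scale coordinates, and avoids scanning the XY array.
extents = np.array([-2.0e6, 0.0, 3.0e6, 2.0e6])
eps = np.abs(extents).max() * 0.001   # 3000.0 here
x = 3.0e6
print((x + eps * 0.5) - x)            # 1500.0: the shift survives
```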

Eric

···


Your version seems better than any worries about bounds being
disproportionately smaller. I'll check this in.

Ryan

···

On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <efiring@...229...> wrote:

Ryan May wrote:


than any worries about bounds being disproportionately smaller. I'll
check this in.

Sorry for the piecemeal approach in thinking about this, but now I realize that to do this right, as indicated by the comment in the original code, we need to take the magnitude of U and V into account. The maximum magnitude could be calculated once in set_UVC and then saved, so that it does not have to be recalculated every time it is used in make_verts.
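A rough sketch of that idea; the names `set_UVC` and `make_verts` follow the quiver code under discussion, but the bodies here only illustrate the caching, not matplotlib's implementation:

```python
import numpy as np

class QuiverSketch:
    def set_UVC(self, U, V):
        self.U = np.asarray(U, dtype=float)
        self.V = np.asarray(V, dtype=float)
        # maximum vector magnitude, computed once per set_UVC call
        self.Umax = np.hypot(self.U, self.V).max()

    def make_verts(self, extents):
        # eps scaled by the data extent and normalized by max |(U, V)|,
        # so the probe shifts in X, Y stay small but nonzero
        span = np.abs(np.asarray(extents, dtype=float)).max()
        return 0.001 * span / max(self.Umax, 1e-30)

q = QuiverSketch()
q.set_UVC([3.0], [4.0])
print(q.Umax)                                   # 5.0
print(q.make_verts([0.0, 0.0, 1000.0, 500.0]))
```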

Maybe I am still missing some simpler way to handle this well.

Eric
