[OpenLayers-Dev] Cluster strategy getResolution bug ?

Eric Lemoine eric.c2c at gmail.com
Sun Nov 9 02:52:45 EST 2008

Hi Alexandre

If you want your vector layer to be visible only for a subset of the
base layer's resolutions you should set maxResolution and/or
minResolution in your vector layer options.
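For example, a minimal sketch of that option (the layer name and resolution values here are hypothetical, assuming the OpenLayers 2 API):

```javascript
// Hypothetical example: restrict a vector layer to a subset of the
// base layer's resolutions with maxResolution/minResolution options,
// instead of passing a separate resolutions array.
var vectorLayer = new OpenLayers.Layer.Vector("clustered", {
    strategies: [new OpenLayers.Strategy.Cluster()],
    maxResolution: 19567.88,  // e.g. the base layer's resolution at zoom 3
    minResolution: 305.75     // e.g. the base layer's resolution at zoom 9
});
```

This way the vector layer shares the base layer's resolutions array and is simply hidden outside the given range.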

If you pass an array of resolutions to the vector layer, you can end
up with vectorlayer.resolutions[i]  != baselayer.resolutions[i], where
i is any index of the vector layer's resolutions array.
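That mismatch can be shown with plain arrays (the resolution values below are made up for illustration):

```javascript
// Base layer: 10 resolutions, indexed 0..9 (hypothetical values).
var baseResolutions = [];
for (var i = 0; i < 10; i++) {
    baseResolutions.push(156543.0339 / Math.pow(2, i));
}

// Vector layer built from base resolutions 3..9 only;
// its own array is re-indexed 0..6.
var vectorResolutions = baseResolutions.slice(3);

// The same index now points at different resolutions:
console.log(baseResolutions[3] === vectorResolutions[3]); // false
// vector index 0 actually corresponds to base index 3:
console.log(vectorResolutions[0] === baseResolutions[3]); // true
```

So an index that is valid on the map (base layer) selects a different resolution on the vector layer, which is what bites the cluster strategy here.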

And I think you are hitting the above case, which is why the cluster
strategy doesn't behave as you expect it to.

So I don't see anything wrong with the cluster strategy relying on
the layer's own resolution (this.layer.getResolution()).

I'd really like to hear what others think about that.



2008/11/7, Alexandre Dube <adube at mapgears.com>:
> Hi devs,
>   I've got a problem using the cluster strategy.  At first, it didn't
> seem to work at all.  I always got the same number of features with or
> without the cluster strategy applied.  After some searching, I found out
> the problem was at line 143: var resolution =
> this.layer.getResolution();
>   I just replaced that line with: var resolution =
> this.layer.map.getResolution(); and it worked.
>   The reason is :
>   My base layer has 10 fixed scales (which gives 10 resolutions),
> indexed 0 to 9.  My vector layer has only 7, corresponding to
> resolutions 3 to 9 of the base layer, but the vector layer has its own
> resolutions array indexed 0 to 6.  So line 143, which should fetch
> resolution 3 of the base layer, fetches resolution 3 of the vector
> layer instead, which is wrong!
>   Anyone else noticed this ?
> --
> Alexandre Dubé
> Mapgears
> www.mapgears.com
> _______________________________________________
> Dev mailing list
> Dev at openlayers.org
> http://openlayers.org/mailman/listinfo/dev
