feature request: split out distance_to_dimension, distance_to_node, and compare

Sylvain Bougerel sylvain.bougerel.devel at gmail.com
Sat Aug 8 13:42:12 UTC 2009


Sorry, there was an error in find_nearest. It should be written like this:

template <typename DistType, typename DistFunc,
          typename DistCmp = std::less<DistType> >
std::pair<const_iterator, DistType>
find_nearest (const value_type& __val,
              DistType max,
              const DistFunc& df = DistFunc(),
              const DistCmp& dc = DistCmp()) const
{
 // code
}

Cheers,

On Sat, Aug 8, 2009 at 9:38 PM, Sylvain
Bougerel<sylvain.bougerel.devel at gmail.com> wrote:
> Hi,
>
> Shane, I had reached the same design as you. At the time, my code for
> this functor looked like this:
>
> template <typename _Tp, typename _Acc, typename _Dist>
> struct metric_euclidean_
> {
>    typedef _Dist distance_type;
>
>    distance_type
>    operator() (const _Tp& __a, const _Tp& __b) const
>    {
>      // code here
>    }
>
>    distance_type
>    operator() (const size_t dim, const _Tp& __a, const _Tp& __b) const
>    {
>       // code here
>    }
> };
>
> As you can see, 'operator()' is overloaded to accommodate what you've
> named 'node distance' and 'distance to dimension'.
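>
> To make it concrete, the bodies could be filled in along these lines --
> just a sketch, assuming the accessor is callable as __acc(value, dim)
> and the functor is told the number of dimensions at construction (the
> constructor and the _M_dim/_M_acc members are additions for the sketch):
>
> #include <cstddef>
>
> // Sketch only: _Acc is assumed callable as __acc(value, dim), and the
> // dimensionality is passed in at construction.
> template <typename _Tp, typename _Acc, typename _Dist>
> struct metric_euclidean_sketch_
> {
>    typedef _Dist distance_type;
>
>    explicit metric_euclidean_sketch_ (size_t __dim, const _Acc& __acc = _Acc())
>      : _M_dim(__dim), _M_acc(__acc) { }
>
>    // 'node distance': sum of squared differences over all dimensions
>    distance_type
>    operator() (const _Tp& __a, const _Tp& __b) const
>    {
>      distance_type __d = distance_type();
>      for (size_t __i = 0; __i < _M_dim; ++__i)
>        {
>          distance_type __diff = _M_acc(__a, __i) - _M_acc(__b, __i);
>          __d += __diff * __diff;
>        }
>      return __d;
>    }
>
>    // 'distance to dimension': squared difference along a single axis
>    distance_type
>    operator() (const size_t __dim, const _Tp& __a, const _Tp& __b) const
>    {
>      distance_type __diff = _M_acc(__a, __dim) - _M_acc(__b, __dim);
>      return __diff * __diff;
>    }
>
>    size_t _M_dim;
>    _Acc _M_acc;
> };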
>
> Another redesign I had in mind was simply to remove the "Dist"
> template parameter from the Kdtree. Indeed, Kdtree operates on vector
> spaces only and does not need a metric. The Dist type is therefore
> extra baggage until somebody actually uses a nearest neighbor search
> -- and in fact some people only care about orthogonal range search.
> 'find_nearest' and 'find_nearest_if' are the only functions operating
> on a metric space.
>
> So, in this framework find_nearest would look like this:
>
> template <typename DistType, typename DistFunc,
>           typename DistCmp = std::less<DistType> >
> std::pair<const_iterator, DistType>
> find_nearest (const value_type& __val,
>               const DistFunc& df = DistFunc(),
>               const DistCmp& dc = DistCmp()) const
> {
>  // code
> }
>
> With the above, you get the equivalent of 'compare in dimension'.
> There is no particular reason to choose this design over yours. I was
> merely mentioning that I find your reasoning plainly right, and
> proposing this as additional code to consider. As I said, I have never
> implemented the rest, and alas, all functions involved with
> find_nearest would have to be reworked to accommodate this new, more
> versatile functor.
>
> Cheers,
>
> Sylvain
>
> On Wed, Jul 8, 2009 at 11:03 AM, Paul Harris<paulharris at computer.org> wrote:
>> I agree.  Both Sylvain and I had wanted to implement something like
>> this, but I don't have the time, and I haven't heard from Sylvain in a while.
>>
>> If you like, please go ahead and hack something up - make sure you use the
>> GIT code, not the tarball, as that is what we'll be committing back to.
>>
>> I have some time to share opinions on the way you implement the changes, but
>> that's about it at the moment...  I still have Willi's dynamic dimensions
>> patch to deal with (sorry Willi, really).
>>
>> cheers
>> Paul
>>
>> 2009/7/7 Shane Stafford <shane at stellarscience.com>
>>>
>>> In some cases where a kd-tree is used to partition space between
>>> non-orthogonal dimensions (for example, on the surface of a sphere) it would
>>> be helpful to have two separate distance functors. I thought about
>>> implementing this myself, but then I also thought that to generalize it even
>>> more, perhaps the Accessor could also be dispensed with, allowing both
>>> the compare and distance functors to take _Val arguments (instead of
>>> _Acc::result_type arguments).
>>>
>>> A tentative proposal:
>>>
>>> squared_distance_type distanceToDimension( _Val value1, _Val value2,
>>>                                            size_t dimension );
>>>
>>> squared_distance_type distanceBetweenNodes( _Val value1, _Val value2 );
>>>
>>> bool compareInDimension( _Val value1, _Val value2, size_t dimension );
>>>
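>>> To make the shape of that interface concrete, a plain Cartesian sketch
>>> could look like the following (std::vector<double> stands in for _Val
>>> here, and the struct name is made up for the example):
>>>
>>> #include <cstddef>
>>> #include <vector>
>>>
>>> // Sketch only: a Cartesian implementation of the three proposed
>>> // functions, with std::vector<double> standing in for _Val.
>>> struct cartesian_metric_sketch
>>> {
>>>     typedef double squared_distance_type;
>>>     typedef std::vector<double> value_type;
>>>
>>>     // squared distance along a single dimension
>>>     squared_distance_type
>>>     distanceToDimension( value_type const& value1, value_type const& value2,
>>>                          size_t dimension ) const
>>>     {
>>>         double d = value1[dimension] - value2[dimension];
>>>         return d * d;
>>>     }
>>>
>>>     // full squared distance between two nodes
>>>     squared_distance_type
>>>     distanceBetweenNodes( value_type const& value1, value_type const& value2 ) const
>>>     {
>>>         double sum = 0.0;
>>>         for (size_t i = 0; i < value1.size(); ++i)
>>>         {
>>>             double d = value1[i] - value2[i];
>>>             sum += d * d;
>>>         }
>>>         return sum;
>>>     }
>>>
>>>     // ordering of two nodes along a given dimension
>>>     bool
>>>     compareInDimension( value_type const& value1, value_type const& value2,
>>>                         size_t dimension ) const
>>>     {
>>>         return value1[dimension] < value2[dimension];
>>>     }
>>> };
>>>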
>>> I think the crux of the issue is that functions like
>>> _S_accumulate_node_distance make the assumption that the kd-tree is formed
>>> on a cartesian grid, and thus that the one-dimensional distances sum in a
>>> simple way to give the n-dimensional squared difference.
>>>
>>> To give a concrete example, consider a 2D mapping onto half of a unit
>>> sphere consisting of the zenith angle (-pi<phi<pi) and azimuthal angle
>>> (-pi/2<theta<pi/2). A kd-tree could be used to partition the space along phi
>>> and theta. The (great-circle) squared distance from any node to a given
>>> zenith phi is fairly easy to compute. The squared distance to a given
>>> azimuth is also straightforward to compute. However, these two quantities
>>> cannot be summed to give the distance along both dimensions (because the
>>> distance to an azimuth depends on the zenith angle). Also, the trig would be
>>> easier to optimize by storing quantities like cos(phi) instead of phi
>>> itself, and a compare functor that takes two node values and a dimension
>>> would allow the comparisons to be optimized as well, since they use
>>> different arithmetic in different dimensions.
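>>>
>>> To make the coupling between the two angles explicit, here is a rough
>>> sketch of the node-to-node distance on the unit sphere (the type and
>>> function names are invented for the example, with the trig of the
>>> zenith angle cached per node as suggested above):
>>>
>>> #include <cmath>
>>>
>>> // Sketch only: phi is taken as the zenith (polar) angle and theta as
>>> // the azimuth; sin/cos of phi are computed once and stored.
>>> struct sphere_point
>>> {
>>>     double phi, theta;
>>>     double sin_phi, cos_phi;
>>>
>>>     explicit sphere_point( double p = 0.0, double t = 0.0 )
>>>       : phi(p), theta(t), sin_phi(std::sin(p)), cos_phi(std::cos(p)) { }
>>> };
>>>
>>> // Great-circle squared distance (spherical law of cosines). The
>>> // azimuthal term is weighted by the zenith trig of *both* points,
>>> // which is exactly why per-dimension distances cannot simply be summed.
>>> inline double
>>> distanceBetweenNodes( sphere_point const& a, sphere_point const& b )
>>> {
>>>     double c = a.cos_phi * b.cos_phi
>>>              + a.sin_phi * b.sin_phi * std::cos(a.theta - b.theta);
>>>     if (c > 1.0) c = 1.0;     // guard against rounding outside [-1, 1]
>>>     if (c < -1.0) c = -1.0;
>>>     double arc = std::acos(c);
>>>     return arc * arc;         // squared arc length on the unit sphere
>>> }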
>>>
>>> Thoughts?
>>>
>>> Shane
>>>
>>>
>>
>


