Seeking clarification of HNSWLib knn_query from the docs

This is about the GitHub doc examples.

I wanted to clarify and/or confirm what the output means. Take the result of the example given with k=3:

{:ok,
 #Nx.Tensor<
   u64[1][3]
   [
     [2, 0, 1]
   ]
 >,
 #Nx.Tensor<
   f32[1][3]
   [
     [5.0, 3281.0, 3445.0]
   ]
 >}

I understand the response is a tuple of the form {:ok, labels, distances}.
Is the labels tensor the list of k matching indexes of the HNSWLib.Index (as found by knn_query), and is the distances tensor the corresponding distances between the input and the vectors at those indexes?

In this example, the data consists of five 2-d vectors:

Nx.tensor(
    [
      [42, 42],
      [43, 43],
      [0, 0],
      [200, 200],
      [200, 220]
    ],
    type: :f32
  )
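
For context, this is roughly how I understand that output is produced, following the README example (the :l2 space and the max_elements value of 200 are taken from the docs, so please correct me if I misread the API):

{:ok, index} = HNSWLib.Index.new(_space = :l2, _dim = 2, _max_elements = 200)

data =
  Nx.tensor(
    [
      [42, 42],
      [43, 43],
      [0, 0],
      [200, 200],
      [200, 220]
    ],
    type: :f32
  )

# add the five vectors; as far as I can tell, each row gets an
# auto-assigned id equal to its position (0..4)
:ok = HNSWLib.Index.add_items(index, data)

# query with the input vector and ask for the 3 nearest neighbours
query = Nx.tensor([1.0, 2.0], type: :f32)
{:ok, labels, dists} = HNSWLib.Index.knn_query(index, query, k: 3)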

With an input vector of [1.0, 2.0], the vector [0.0, 0.0] is intuitively the closest (with L2 in mind), and its index is 2. With k=3, the indexes 0 and 1, with values [42.0, 42.0] and [43.0, 43.0], are the next closest. So are the labels in the response the indexes of the items in the Index? (It seems so in this example, but the doc could probably develop this a bit.)
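
As a sanity check, the returned distances look like squared L2 distances (no square root taken), which a quick Nx computation would confirm (sketch, reusing `data` and `query` from above):

# squared L2 distance between `query` and every row of `data`
diff = Nx.subtract(data, query)

diff
|> Nx.multiply(diff)
|> Nx.sum(axes: [1])
#=> f32[5] [3281.0, 3445.0, 5.0, 78805.0, 87125.0]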

You then need to look up those indexes in your Index (HNSWLib.Index.get_items) to find the matching data.
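
Something like this is what I had in mind for the lookup (I'm assuming get_items accepts a plain list of ids; I'm not sure of the exact return format):

ids = Nx.to_flat_list(labels)   # [2, 0, 1]
{:ok, items} = HNSWLib.Index.get_items(index, ids)
# expecting the stored vectors for ids 2, 0 and 1,
# i.e. [0.0, 0.0], [42.0, 42.0] and [43.0, 43.0]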

Then, more generally, isn't there a way to normalise the distances so that you could retain the matching indexes whose distance is below a percentage of the maximum distance around an input, rather than a fixed number of results?
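
The workaround I can think of is to over-fetch with a larger k and filter on the returned distances afterwards, e.g. keep only the hits below some fraction of the largest returned distance (just a sketch, the 1% threshold is arbitrary):

{:ok, labels, dists} = HNSWLib.Index.knn_query(index, query, k: 5)

# keep only neighbours whose distance is below 1% of the largest returned distance
threshold = 0.01 * Nx.to_number(Nx.reduce_max(dists))

kept =
  labels
  |> Nx.to_flat_list()
  |> Enum.zip(Nx.to_flat_list(dists))
  |> Enum.filter(fn {_label, dist} -> dist <= threshold end)
  |> Enum.map(fn {label, _dist} -> label end)
# with the data above and query [1.0, 2.0], only label 2 (distance 5.0) survives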

And finally, once you have found, say, the first nearest-neighbour embedding, would you save all the embeddings into a database as vectors and query them there, or use a vector extension for this (pgvector or sqlite_vss)?
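
In case it helps frame the question, with pgvector I imagine it would end up looking roughly like this, via the pgvector hex package and Ecto (the schema, Repo and helper names below are just my assumptions from its README, not something I have running):

defmodule MyApp.Item do
  use Ecto.Schema

  schema "items" do
    # vector(2) column provided by the pgvector extension
    field :embedding, Pgvector.Ecto.Vector
  end
end

import Ecto.Query
import Pgvector.Ecto.Query

# 3 nearest neighbours of [1.0, 2.0] by L2 distance, computed in Postgres
from(i in MyApp.Item,
  order_by: l2_distance(i.embedding, ^Pgvector.new([1.0, 2.0])),
  limit: 3
)
|> MyApp.Repo.all()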
