Make nn algorithm configurable #281

Merged
5 commits merged on Jun 14, 2024

Changes from 4 commits
54 changes: 44 additions & 10 deletions lib/scholar/manifold/trimap.ex
@@ -113,6 +113,16 @@ defmodule Scholar.Manifold.Trimap do
doc: ~S"""
Metric used to compute the distances.
"""
],
algorithm: [
type: {:in, [:nndescent, :large_vis]},
Member:
I would add at least :brute. Maybe a custom k-NN graph construction algorithm could be passed in as a module as well.

Contributor:
Is there any benefit to using brute rather than nndescent or large_vis?

Member:
I'd say it's the best one to use for smaller datasets.

Member:
Perhaps making the selection automatic depending on dataset size (sample size and number of features) would be ideal.

Contributor Author:
I will force brute for n < 100 and otherwise use one of these two approximate algorithms. Is that OK, or do we want to add :brute anyway?

Member (@krstopro, Jun 11, 2024):
Maybe use brute-force for $N \times D^2 \leq T$ for some constant $T$ (e.g. $10^5$ or $10^6$ or so). I would add :brute anyway; it might be useful to see how much the quality of the embeddings differs when approximate k-NN search algorithms are used.

Looking at it now, we might want to change predict to transform in these algorithms as well.

Contributor Author (@msluszniak, Jun 12, 2024):
For $N = 20000$ and $D = 2$ we'd rather not use brute, so it's generally more effective to put this condition on $N$ alone, I guess. I can increase the threshold to 500-1000.

Member:
Sounds good to me.
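
To make the heuristic discussed above concrete, here is a minimal sketch comparing the two dispatch rules. The module name and threshold constants are illustrative assumptions, not part of this PR:

```elixir
# Illustrative only: compares the two dispatch rules discussed in the review thread.
defmodule KNNHeuristic do
  @t 100_000
  @n_threshold 500

  # Rule suggested in review: use brute-force while N * D^2 stays at or below T.
  def brute?(:n_times_d_squared, n, d), do: n * d * d <= @t

  # Rule adopted in the diff below: threshold on the sample size N alone.
  def brute?(:n_only, n, _d), do: n <= @n_threshold
end

# With N = 20_000 and D = 2 the first rule still picks brute-force (80_000 <= 100_000),
# while the second does not, which motivated thresholding on N alone in this PR.
KNNHeuristic.brute?(:n_times_d_squared, 20_000, 2)
#=> true
KNNHeuristic.brute?(:n_only, 20_000, 2)
#=> false
```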

default: :large_vis,
doc: ~S"""
Algorithm used to compute the nearest neighbors. Possible values:
* `:nndescent` - Nearest Neighbors Descent. See `Scholar.Neighbors.NNDescent` for more details.

* `:large_vis` - LargeVis algorithm. See `Scholar.Neighbors.LargeVis` for more details.
"""
]
]
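
For illustration, the new option could be used as follows. This is a sketch only: the input shape and option values are arbitrary, and the call uses the transform/2 entry point introduced later in this diff:

```elixir
# Sketch of calling Trimap with the new :algorithm option (arbitrary data and values).
{x, key} = Nx.Random.uniform(Nx.Random.key(42), shape: {200, 8})

Scholar.Manifold.Trimap.transform(x,
  num_components: 2,
  num_inliers: 5,
  num_outliers: 2,
  key: key,
  # :large_vis is the default; :nndescent selects NNDescent instead.
  algorithm: :large_vis
)
```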

@@ -290,15 +300,39 @@ defmodule Scholar.Manifold.Trimap do
num_points = Nx.axis_size(inputs, 0)
num_extra = min(num_inliners + 50, num_points)

nndescent =
Scholar.Neighbors.NNDescent.fit(inputs,
num_neighbors: num_extra,
tree_init?: false,
metric: opts[:metric],
tol: 1.0e-5
)
neighbors =
if Nx.axis_size(inputs, 0) <= 500 do
model =
Scholar.Neighbors.BruteKNN.fit(inputs,
num_neighbors: num_extra,
metric: opts[:metric]
)

neighbors = nndescent.nearest_neighbors
{neighbors, _distances} = Scholar.Neighbors.BruteKNN.predict(model, inputs)
neighbors
else
case opts[:algorithm] do
:nndescent ->
nndescent =
Scholar.Neighbors.NNDescent.fit(inputs,
num_neighbors: num_extra,
tree_init?: false,
metric: opts[:metric],
tol: 1.0e-5
)

nndescent.nearest_neighbors

:large_vis ->
{neighbors, _distances} =
Scholar.Neighbors.LargeVis.fit(inputs,
num_neighbors: num_extra,
metric: opts[:metric]
)

neighbors
end
end

neighbors = Nx.concatenate([Nx.iota({num_points, 1}), neighbors], axis: 1)

@@ -402,9 +436,9 @@
## Examples

iex> {inputs, key} = Nx.Random.uniform(Nx.Random.key(42), shape: {30, 5})
iex> Scholar.Manifold.Trimap.embed(inputs, num_components: 2, num_inliers: 3, num_outliers: 1, key: key)
iex> Scholar.Manifold.Trimap.transform(inputs, num_components: 2, num_inliers: 3, num_outliers: 1, key: key, algorithm: :nndescent)
"""
deftransform embed(inputs, opts \\ []) do
deftransform transform(inputs, opts \\ []) do
opts = NimbleOptions.validate!(opts, @opts_schema)
key = Keyword.get_lazy(opts, :key, fn -> Nx.Random.key(System.system_time()) end)
{triplets, opts} = Keyword.pop(opts, :triplets, {})
6 changes: 6 additions & 0 deletions lib/scholar/neighbors/brute_knn.ex
@@ -32,6 +32,12 @@ defmodule Scholar.Neighbors.BruteKNN do

* `:cosine` - Cosine metric.

* `:euclidean` - Euclidean metric.

* `:squared_euclidean` - Squared Euclidean metric.

* `:manhattan` - Manhattan metric.

* Anonymous function of arity 2 that takes two rank-2 tensors.
"""
],
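
As a quick illustration of the newly documented metric atoms, the following sketch (with arbitrary data) mirrors the fit/predict calls used in the Trimap diff above:

```elixir
# Sketch: BruteKNN with one of the newly documented metrics.
x = Nx.iota({6, 3}, type: :f32)

model = Scholar.Neighbors.BruteKNN.fit(x, num_neighbors: 3, metric: :manhattan)
{neighbors, distances} = Scholar.Neighbors.BruteKNN.predict(model, x)
# For each row of x, neighbors holds the indices of its 3 nearest neighbors
# and distances the corresponding Manhattan distances.
```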
8 changes: 8 additions & 0 deletions lib/scholar/neighbors/utils.ex
@@ -22,6 +22,14 @@ defmodule Scholar.Neighbors.Utils do
{:ok, &Scholar.Metrics.Distance.pairwise_minkowski(&1, &2, p: p)}
end

def pairwise_metric(:euclidean), do: {:ok, &Scholar.Metrics.Distance.pairwise_euclidean/2}

def pairwise_metric(:squared_euclidean),
do: {:ok, &Scholar.Metrics.Distance.pairwise_squared_euclidean/2}

def pairwise_metric(:manhattan),
do: {:ok, &Scholar.Metrics.Distance.pairwise_minkowski(&1, &2, p: 1)}

def pairwise_metric(metric) when is_function(metric, 2), do: {:ok, metric}

def pairwise_metric(metric) do
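
A small usage sketch of the resolver above, assuming pairwise_metric/1 stays a public helper as in this diff (data is arbitrary):

```elixir
# Sketch: resolving a metric atom into a pairwise distance function.
{:ok, dist_fun} = Scholar.Neighbors.Utils.pairwise_metric(:manhattan)

x = Nx.tensor([[0.0, 0.0], [1.0, 2.0]])
y = Nx.tensor([[1.0, 1.0]])

# Expected to return the pairwise Manhattan distances between rows of x and rows of y.
dist_fun.(x, y)
```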
42 changes: 28 additions & 14 deletions test/scholar/manifold/trimap_test.exs
@@ -7,7 +7,15 @@ defmodule Scholar.Manifold.TrimapTest do
test "non default num_inliers and num_outliers" do
x = Nx.iota({5, 6})
key = Nx.Random.key(42)
res = Trimap.embed(x, num_components: 2, key: key, num_inliers: 3, num_outliers: 1)

res =
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
algorithm: :nndescent
)

expected =
Nx.tensor([
@@ -26,14 +34,15 @@ defmodule Scholar.Manifold.TrimapTest do
key = Nx.Random.key(42)

res =
Trimap.embed(x,
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
num_random: 5,
weight_temp: 0.1,
learning_rate: 0.3
learning_rate: 0.3,
algorithm: :nndescent
)

expected =
@@ -53,13 +62,14 @@ defmodule Scholar.Manifold.TrimapTest do
key = Nx.Random.key(42)

res =
Trimap.embed(x,
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
num_iters: 100,
init_embedding_type: 1
init_embedding_type: 1,
algorithm: :nndescent
)

expected =
@@ -81,13 +91,14 @@ defmodule Scholar.Manifold.TrimapTest do
weights = Nx.tensor([1.0, 1.0, 1.0, 1.0, 1.0])

res =
Trimap.embed(x,
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
triplets: triplets,
weights: weights
weights: weights,
algorithm: :nndescent
)

expected =
@@ -116,12 +127,13 @@
])

res =
Trimap.embed(x,
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
init_embeddings: init_embeddings
init_embeddings: init_embeddings,
algorithm: :nndescent
)

expected =
@@ -141,12 +153,13 @@ defmodule Scholar.Manifold.TrimapTest do
key = Nx.Random.key(42)

res =
Trimap.embed(x,
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
metric: :manhattan
metric: :manhattan,
algorithm: :nndescent
)

expected =
@@ -170,7 +183,7 @@ defmodule Scholar.Manifold.TrimapTest do
assert_raise ArgumentError,
"Number of points must be greater than 2",
fn ->
Scholar.Manifold.Trimap.embed(x,
Scholar.Manifold.Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 10,
@@ -189,13 +202,14 @@ defmodule Scholar.Manifold.TrimapTest do
"Triplets and weights must be either not initialized or have the same
size of axis zero and rank of triplets must be 2 and rank of weights must be 1",
fn ->
Trimap.embed(x,
Trimap.transform(x,
num_components: 2,
key: key,
num_inliers: 3,
num_outliers: 1,
triplets: triplets,
weights: weights
weights: weights,
algorithm: :nndescent
)
end
end