Commit dd1a29c

Merge branch 'main' of github.com:msluszniak/scholar into main
msluszniak committed Sep 11, 2023
2 parents: 65f7376 + 12258cb
Showing 14 changed files with 1,297 additions and 228 deletions.
51 changes: 29 additions & 22 deletions CHANGELOG.md
@@ -1,37 +1,44 @@
 # Changelog

-## v0.2.1-dev
+## v0.2.2-dev

-## v0.2.0 (2023-08-16)
+## v0.2.1 (2023-08-30)
+
+### Enhancements
+
+* Remove `VegaLite.Data` in favour of future use of `Tucan`
+* Do not use EXLA at compile time in `Metrics`
+
+## v0.2.0 (2023-08-29)

 This version requires Elixir v1.14+.

 ### Enhancements

 * Update notebooks
-* Add support for :f16 and :bf16 types in SVD
-* Add Affinity Propagation
-* Add t-SNE
-* Add Polynomial Regression
-* Replace seeds with Random.key
+* Add support for `:f16` and `:bf16` types in `SVD`
+* Add `Affinity Propagation`
+* Add `t-SNE`
+* Add `Polynomial Regression`
+* Replace seeds with `Random.key`
 * Add 'unrolling loops' option
-* Add support for custom optimizers in Logistic Regression
-* Add Trapezoidal Integration
-* Add AUC ROC, AUC, and ROC Curve
-* Add Simpson rule Integration
+* Add support for custom optimizers in `Logistic Regression`
+* Add `Trapezoidal Integration`
+* Add `AUC-ROC`, `AUC`, and `ROC Curve`
+* Add `Simpson rule integration`
 * Unify tests
-* Add Radius Nearest Neighbors
-* Add DBSCAN
-* Add classification metrics: Average Precision Score, Balanced Accuracy Score,
-  Cohen Kappa Score, Brier Score Loss, Zero-One Loss, Top-k Accuracy Score
-* Add regression metrics: R2 Score, MSLE, MAPE, Maximum Residual Error
-* Add support for axes in Confusion Matrix
-* Add support for broadcasting in Metrics.Distances
+* Add `Radius Nearest Neighbors`
+* Add `DBSCAN`
+* Add classification metrics: `Average Precision Score`, `Balanced Accuracy Score`,
+  `Cohen Kappa Score`, `Brier Score Loss`, `Zero-One Loss`, `Top-k Accuracy Score`
+* Add regression metrics: `R2 Score`, `MSLE`, `MAPE`, `Maximum Residual Error`
+* Add support for axes in `Confusion Matrix`
+* Add support for broadcasting in `Metrics.Distances`
 * Update CI
-* Add Gaussian Mixtures
-* Add Model selection functionalities: K-fold, K-fold Cross Validation, Grid Search
-* Change structure of metrics in Scholar
-* Add a guide with Cross-Validation and Grid Search
+* Add `Gaussian Mixtures`
+* Add Model selection functionalities: `K-fold`, `K-fold Cross Validation`, `Grid Search`
+* Change structure of metrics in `Scholar`
+* Add a guide with `Cross-Validation` and `Grid Search`

 ## v0.1.0 (2023-03-29)

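An aside on the list above (editorial, not part of the commit): the entry `Replace seeds with Random.key` indicates that estimators which previously accepted an integer seed now take a PRNG key built with `Nx.Random.key/1`. The sketch below is hypothetical; the module and option names (`Scholar.Cluster.KMeans`, `:num_clusters`, `:key`) are assumptions based on the changelog, not lines from this diff.

```elixir
# Hypothetical sketch: pass an explicit Nx.Random key instead of an
# integer seed. Option names are assumed; consult the Scholar docs.
x = Nx.tensor([[1.0, 2.0], [1.1, 2.1], [9.0, 9.0], [9.1, 8.9]])
key = Nx.Random.key(42)

model = Scholar.Cluster.KMeans.fit(x, num_clusters: 2, key: key)
Scholar.Cluster.KMeans.predict(model, Nx.tensor([[1.05, 2.0]]))
```

Passing a key explicitly keeps results reproducible across runs and backends.
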
6 changes: 3 additions & 3 deletions README.md
@@ -19,7 +19,7 @@ Add to your `mix.exs`:
 ```elixir
 def deps do
   [
-    {:scholar, "~> 0.1"}
+    {:scholar, "~> 0.2.1"}
   ]
 end
 ```
@@ -30,7 +30,7 @@ such as EXLA:
 ```elixir
 def deps do
   [
-    {:scholar, "~> 0.1"},
+    {:scholar, "~> 0.2.1"},
     {:exla, ">= 0.0.0"}
   ]
 end
@@ -51,7 +51,7 @@ To use Scholar inside code notebooks, run:

 ```elixir
 Mix.install([
-  {:scholar, "~> 0.1"},
+  {:scholar, "~> 0.2.1"},
   {:exla, ">= 0.0.0"}
 ])

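Not part of the diff: once `:exla` is listed next to `:scholar`, a common follow-up in a notebook is to make EXLA the default Nx backend so Scholar's numerical routines run on EXLA. A minimal sketch of that generic pattern (the README's exact wording in the collapsed lines may differ):

```elixir
# Generic pattern, not copied from this commit: install Scholar with
# EXLA and route Nx computations through the EXLA backend.
Mix.install([
  {:scholar, "~> 0.2.1"},
  {:exla, ">= 0.0.0"}
])

Nx.global_default_backend(EXLA.Backend)
```
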
11 changes: 5 additions & 6 deletions lib/scholar/cluster/affinity_propagation.ex
@@ -116,6 +116,7 @@ defmodule Scholar.Cluster.AffinityPropagation do
     iterations = opts[:iterations]
     damping_factor = opts[:damping_factor]
     self_preference = opts[:self_preference]
+    data = to_float(data)

     {initial_a, initial_r, s, affinity_matrix} =
       initialize_matrices(data, self_preference: self_preference)
@@ -307,14 +308,12 @@ defmodule Scholar.Cluster.AffinityPropagation do
   end

   defnp initialize_similarities(data, opts \\ []) do
-    {n, dims} = Nx.shape(data)
+    n = Nx.axis_size(data, 0)
     self_preference = opts[:self_preference]
-    t1 = Nx.reshape(data, {1, n, dims}) |> Nx.broadcast({n, n, dims})
-    t2 = Nx.reshape(data, {n, 1, dims}) |> Nx.broadcast({n, n, dims})

-    dist =
-      (-1 * Scholar.Metrics.Distance.squared_euclidean(t1, t2, axes: [-1]))
-      |> Nx.as_type(to_float_type(data))
+    norm1 = Nx.sum(data ** 2, axes: [1], keep_axes: true)
+    norm2 = Nx.transpose(norm1)
+    dist = -1 * (norm1 + norm2 - 2 * Nx.dot(data, [1], data, [1]))

     fill_in =
       cond do
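
Context for the hunk above (not part of the commit): the rewritten `initialize_similarities` no longer broadcasts the data into two `{n, n, dims}` tensors. It relies on the identity ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 * <x_i, x_j>, so only the row norms and an `{n, n}` Gram matrix are needed. A standalone Nx sketch of the same trick, with illustrative variable names:

```elixir
# Standalone sketch (not from the commit) of the pairwise squared
# Euclidean distance trick used in initialize_similarities/2.
data = Nx.tensor([[0.0, 1.0], [2.0, 3.0], [4.0, 5.0]])

# Row-wise squared norms, shape {n, 1}.
norm = Nx.sum(Nx.multiply(data, data), axes: [1], keep_axes: true)

# Gram matrix of pairwise dot products, shape {n, n}.
gram = Nx.dot(data, [1], data, [1])

# dist[i][j] is the squared Euclidean distance between rows i and j.
dist = norm |> Nx.add(Nx.transpose(norm)) |> Nx.subtract(Nx.multiply(2, gram))
```

Memory use stays proportional to `n * n` regardless of the number of features, which the removed broadcast-based version could not guarantee.
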
2 changes: 1 addition & 1 deletion lib/scholar/integrate/integrate.ex
@@ -17,7 +17,7 @@ defmodule Scholar.Integrate do
     keep_axis: [
       type: :boolean,
       default: false,
-      doc: "If set to true, the axis which is reduced are kept."
+      doc: "If set to true, the axis which is reduced is kept."
     ]
   ]

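For illustration only: the `keep_axis` option documented above keeps the reduced axis (with size 1) in the result instead of dropping it. The call below is a hypothetical sketch; the function name `Scholar.Integrate.trapezoidal/3` and its argument order are assumptions drawn from the changelog's `Trapezoidal Integration` entry, not something shown in this diff.

```elixir
# Hypothetical usage sketch; check the Scholar docs for the exact API.
x = Nx.linspace(0.0, 2.0, n: 100)
y = Nx.multiply(x, x)

# Approximate the integral of x^2 over [0, 2] (exact value: 8/3).
Scholar.Integrate.trapezoidal(y, x)

# With keep_axis: true, the reduced axis is kept with size 1.
Scholar.Integrate.trapezoidal(y, x, keep_axis: true)
```
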
(The diffs for the remaining 10 of the 14 changed files are not shown in this view.)
