fix: broadcast vectors for grad calculation #1535
Conversation
nx/lib/nx/defn/grad.ex
Outdated
Expr.constant(%T{shape: shape, type: {:f, 32}, names: names}, float, [])
case shape do
  %T{vectorized_axes: [_ | _]} = t ->
    Expr.tensor(Nx.fill(t, float, type: :f32))
We should probably get rid of the names here too.
I also wonder if we should move the check for vectorized_axes to constant. Today, if someone passes vectorized_axes, Expr.constant is broken. So maybe we should create a tensor when vectorized axes are given?
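A minimal sketch of that idea, assuming the private constant/2 helper in Nx.Defn.Expr and reusing the Nx.fill/3 call already present in this diff (illustrative, not the final change):

```elixir
# If the template carries vectorized axes, a scalar constant node cannot
# represent it, so fall back to building a full tensor expression instead.
defp constant(%{vectorized_axes: [_ | _]} = out, number) do
  tensor(Nx.fill(out, number, type: out.type))
end
```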
Done!
nx/lib/nx/defn/grad.ex
Outdated
@@ -338,6 +333,8 @@ defmodule Nx.Defn.Grad do
  @verify_grad Application.compile_env(:nx, :verify_grad, false)

  defp update_grads(op, args, ans, g, _to_grad_ids, grads) do
    args = revectorize_args(args, ans)
I would prefer not to revectorize everything on every operation. Is there any chance we could do it in broadcast only?
[unbroadcast(x, Nx.multiply(g, y), ans), unbroadcast(y, Nx.multiply(g, x), ans)]
Lines like this one make it so that g is vectorized and y is unvectorized but has axes with the same name, so things break there.
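A rough illustration of that clash, with made-up shapes and axis names rather than the actual failing case:

```elixir
# g keeps the vectorized axis :x from the answer.
g = Nx.iota({2, 3}, type: :f32) |> Nx.vectorize(:x)

# y was stored devectorized with keep_names: true, so its leading inner
# (non-vectorized) axis is also called :x.
y = Nx.iota({2, 3}, type: :f32, names: [:x, nil])

# Nx.multiply(g, y) then has to reconcile a vectorized :x with a plain :x
# axis, which is the mismatch that breaks the product-rule line above.
```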
nx/lib/nx/defn/expr.ex
Outdated
@@ -1394,6 +1394,11 @@ defmodule Nx.Defn.Expr do

  ## Constant helpers and related optimizations

  defp constant(%{vectorized_axes: [_ | _]} = out, number) do
    out = %{out | names: Enum.map(out.names, fn _ -> nil end)}
I don't think this part should be done here; we should preserve the names. Sorry for the confusion.
nx/lib/nx/defn/grad.ex
Outdated
@@ -1343,9 +1334,77 @@ defmodule Nx.Defn.Grad do

  ## General helpers

  defp unbroadcast(%{shape: shape} = x, res, %{shape: shape}), do: {x, res}
  defp revectorize_args(args, ans) do
Let's only apply this if args has more than one element and there are vectorized axes.
Also please test x * sin(y) where y is vectorized.
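A hedged sketch of such a test (module name and helper are illustrative, not taken from the suite); it asserts the analytic gradient d/dy[x * sin(y)] = x * cos(y):

```elixir
defmodule VectorizedGradSketchTest do
  use ExUnit.Case, async: true
  import Nx.Defn

  # Gradient of x * sin(y) with respect to y; analytically x * cos(y).
  defn grad_wrt_y(x, y) do
    grad(y, fn y -> x * Nx.sin(y) end)
  end

  test "x * sin(y) where y is vectorized" do
    x = Nx.tensor(2.0)
    y = Nx.vectorize(Nx.tensor([0.0, 1.0, 2.0]), :rows)

    result = Nx.devectorize(grad_wrt_y(x, y))
    expected = Nx.devectorize(Nx.multiply(x, Nx.cos(y)))

    assert Nx.to_number(Nx.all_close(result, expected)) == 1
  end
end
```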
nx/lib/nx.ex
Outdated
@@ -4906,12 +4906,19 @@ defmodule Nx do

  def devectorize(%T{shape: shape, names: names, vectorized_axes: vectorized_axes} = tensor, opts)
      when vectorized_axes != [] do
    opts = keyword!(opts, keep_names: true)
    opts = keyword!(opts, keep_names: true, drop_inner_names: false)
Revert.
nx/lib/nx/defn/expr.ex
Outdated
      t when is_tuple(t) ->
        context = elem(t, 0).data.context

        tuple(
          expr(tuple_out(tuple_size(t)), context, :metadata, [Nx.devectorize(expr), metadata]),
          expr(tuple_out(tuple_size(t)), context, :metadata, [expr, metadata]),
Revert; use devectorize with keep_names.
nx/lib/nx/defn/expr.ex
Outdated
    axes =
      Keyword.values(vectorized_axes) ++ Tuple.to_list(shape)

    brackets = Enum.map(axes, &[?[, Integer.to_string(&1), ?]])
Should we revert? 🤔
nx/lib/nx/defn/grad.ex
Outdated
      %{} ->
        parent_vectorized_axes = compute_arg_vectorized_axes(t, vectorized_axes)

        nodes = Map.put(nodes, id, {Nx.devectorize(t, keep_names: true), parent_vectorized_axes})
We should not have anything vectorized here.
nx/lib/nx/defn/grad.ex
Outdated
      recur_parents_tree(arg, {parents, nodes})

      recur_parents_tree(
        Nx.devectorize(arg, keep_names: true),
We should not need this either.
nx/lib/nx/defn/grad.ex
Outdated
        {parents, nodes}

      %{} ->
        parent_vectorized_axes = compute_arg_vectorized_axes(t, vectorized_axes)
This may not be necessary.
I've changed the compute_arg_vectorized_axes function into one that returns only the names (it's less assertive, but a tad cheaper), and that lets us drop this remapping here.
It was needed for cases where t has [x: 1] and vectorized_axes has [x: 2], for instance -- implicit broadcast situations.
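For context, a toy example of that implicit-broadcast situation (values made up): a vectorized :x axis of size 1 meets a vectorized :x axis of size 2 and broadcasts, which is why tracking only the axis names is enough.

```elixir
a = Nx.tensor([[10]]) |> Nx.vectorize(:x)       # vectorized_axes: [x: 1], inner shape {1}
b = Nx.tensor([[1], [2]]) |> Nx.vectorize(:x)   # vectorized_axes: [x: 2], inner shape {1}

# The size-1 vectorized :x axis broadcasts against the size-2 one, so the
# result is vectorized on [x: 2] even though `a` only tracked [x: 1].
Nx.add(a, b)
```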
nx/lib/nx/defn/grad.ex
Outdated
  end

  defp revectorize_node(node, vectorized_axes) do
    vectorized_axes = compute_arg_vectorized_axes(node, vectorized_axes)
Maybe we could already read the computed values from nodes. Maybe.
nx/lib/nx/defn/grad.ex
Outdated
    vectorized_axes = compute_arg_vectorized_axes(node, vectorized_axes)

    node
    |> Nx.devectorize(keep_names: false)
They should all be devectorized.
nx/lib/nx/defn/grad.ex
Outdated
@@ -1343,9 +1424,34 @@ defmodule Nx.Defn.Grad do

  ## General helpers

  defp unbroadcast(%{shape: shape} = x, res, %{shape: shape}), do: {x, res}
  defp unbroadcast(x, res, ans) do
Revert these changes.
Beautiful!!!!! We can merge this and add the new ops later!
Co-authored-by: José Valim <[email protected]>
closes #1533