
Initializing multiple optimizers #1567

Answered by awni
stockeh asked this question in Q&A

One way to do it is with multiple optimizers. Using your code as an example:

import mlx.nn as nn
import mlx.core as mx
import mlx.optimizers as optim
from mlx.utils import tree_flatten, tree_unflatten

from tqdm import tqdm
from functools import partial
from typing import List


class MLP(nn.Module):
    def __init__(self, n_inputs: int, n_hiddens: List[int], n_outputs: int):
        super().__init__()

        self.layers = []
        ni = n_inputs
        for _, n_units in enumerate(n_hiddens):
            self.layers.append(nn.Linear(ni, n_units))
            self.layers.append(nn.Tanh())
            ni = n_units
        self.layers.append(nn.Linear(ni, n_outputs))

    def __call__(self, x):
        # Standard sequential forward pass through the layer list.
        for layer in self.layers:
            x = layer(x)
        return x
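The answer's code is cut off here, but the core idea — separate optimizers owning disjoint parameter groups, each with its own hyperparameters — can be sketched framework-agnostically. The toy `SGD` class and the `"layers.N.weight"` key naming below are illustrative assumptions, not MLX's actual API; in MLX you would filter the flattened gradient tree (e.g. via `tree_flatten`) the same way.

```python
# Toy sketch (plain Python, NOT MLX's API): two SGD optimizers, each
# updating a disjoint subset of the model's parameters.

class SGD:
    def __init__(self, lr: float):
        self.lr = lr

    def update(self, params: dict, grads: dict) -> None:
        # Plain SGD step on only the keys present in `grads`: p <- p - lr * g
        for k, g in grads.items():
            params[k] = params[k] - self.lr * g

# Flattened parameter/gradient trees, keyed like tree_flatten would produce.
params = {"layers.0.weight": 1.0, "layers.2.weight": 2.0}
grads = {"layers.0.weight": 0.5, "layers.2.weight": 0.5}

opt_a = SGD(lr=0.1)   # owns layer 0
opt_b = SGD(lr=0.01)  # owns layer 2

# Split the gradient tree by key prefix and let each optimizer
# update only its own group.
opt_a.update(params, {k: v for k, v in grads.items() if k.startswith("layers.0")})
opt_b.update(params, {k: v for k, v in grads.items() if k.startswith("layers.2")})
```

After the two updates, layer 0's weight has taken a step with `lr=0.1` and layer 2's with `lr=0.01`, which is the whole point of keeping the optimizers separate.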

Answer selected by stockeh