In addition to the built-in classifiers and regressors, is it possible to specify a neural network with a custom structure (by specifying the layers) and use that for training and inference?

You can use a FullyConnectedNetworkClassifier or FullyConnectedNetworkRegressor, which let you specify the number of hidden layers and the number of hidden units in each layer.

There is a ReLU after every hidden layer. Other activation functions are not available.
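As a rough sketch of what that looks like with CreateMLComponents (the exact initializer parameters, such as hiddenUnitCounts, and the training data here are assumptions; check the framework documentation for the current API):

```swift
import CreateMLComponents

// Sketch: a classifier with two hidden layers of 64 and 32 units,
// each followed by a ReLU (the only available activation).
// The label set and generic parameters here are hypothetical examples.
let classifier = FullyConnectedNetworkClassifier<Float, String>(
    labels: ["cat", "dog"],
    hiddenUnitCounts: [64, 32]
)

// Training and inference would then follow the usual estimator pattern,
// fitting to (feature vector, label) pairs, e.g.:
// let model = try await classifier.fitted(to: trainingFeatures)
```

The hiddenUnitCounts array is what defines the custom structure: one entry per hidden layer, giving that layer's width.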

Other network architectures would require using Metal or Accelerate.

If you have a use case for other architectures, please file a ticket in Feedback Assistant.
