Wolfram Language & System Documentation Center

Neural Network Layers

Neural networks offer a flexible and modular way of representing operations on arrays, from basic ones like arithmetic, normalization and linear operations to more advanced ones like convolutional filtering, attention and recurrence. The Wolfram Language offers a powerful symbolic representation for neural network operations. Layers can be defined, initialized and used like any other language function, making the testing of new architectures incredibly easy. Combined in richer structures like NetChain or NetGraph, they can be trained in a single step using the NetTrain function.
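As a sketch of this workflow (the layer sizes and training data below are purely illustrative):

```wl
(* An initialized layer behaves like an ordinary function *)
layer = NetInitialize[LinearLayer[2, "Input" -> 3]];
layer[{1., 2., 3.}]   (* a length-2 output vector *)

(* Layers compose into a NetChain, trained in one step with NetTrain *)
net = NetChain[{LinearLayer[16], Ramp, LinearLayer[1]}, "Input" -> 2];
trained = NetTrain[net, {{1., 2.} -> {3.}, {3., 4.} -> {7.}, {5., 1.} -> {6.}}];
```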

Basic Layers

LinearLayer trainable layer with dense connections computing w.x+b

SoftmaxLayer layer globally normalizing elements to the unit interval
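A minimal sketch of both basic layers (the sizes are illustrative):

```wl
(* LinearLayer computes w.x + b with learnable w and b *)
linear = NetInitialize[LinearLayer[3, "Input" -> 4]];
linear[{1., 0., -1., 2.}]

(* SoftmaxLayer maps a vector to positive values summing to 1 *)
SoftmaxLayer[][{1., 2., 3.}]
```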

Custom Layers

FunctionLayer net layer from a Wolfram Language function

CompiledLayer net layer from arbitrary compilable code

PlaceholderLayer net layer for undefined operation
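For example, a plain Wolfram Language function can be lifted into a layer with FunctionLayer (the function here is illustrative):

```wl
(* FunctionLayer turns a Wolfram Language function into a net layer *)
f = FunctionLayer[2 # + 1 &];
f[{1., 2., 3.}]   (* applied elementwise: {3., 5., 7.} *)
```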

Elementwise Layers

ElementwiseLayer apply a specified function to each element in a tensor

ThreadingLayer apply a function to corresponding elements in a tensor sequence

ParametricRampLayer leaky activation with a slope that can be learned

RandomArrayLayer sample from a univariate distribution at each element of a tensor
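A short sketch of the two most common elementwise layers (inputs are illustrative):

```wl
(* Apply a fixed scalar function to every element of an array *)
ElementwiseLayer[Tanh][{-1., 0., 1.}]

(* Combine corresponding elements of several input arrays *)
ThreadingLayer[Plus][{{1., 2.}, {3., 4.}}]   (* {4., 6.} *)
```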

Structure Manipulation Layers

CatenateLayer   PrependLayer   AppendLayer   FlattenLayer   ReshapeLayer   ReplicateLayer   PaddingLayer   PartLayer   TransposeLayer   ExtractLayer
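Two representative structure-manipulation layers in use (inputs are illustrative):

```wl
(* Reshape a length-6 vector into a 2x3 array *)
ReshapeLayer[{2, 3}][{1., 2., 3., 4., 5., 6.}]

(* Catenate two vectors along their first dimension *)
CatenateLayer[][{{1., 2.}, {3., 4.}}]   (* {1., 2., 3., 4.} *)
```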

Array Operation Layers

NetArrayLayer embed a learned constant array into a NetGraph

SummationLayer   TotalLayer   AggregationLayer   DotLayer   OrderingLayer
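For instance (the arrays here are illustrative):

```wl
(* Aggregate over the first dimension with a pooling function *)
AggregationLayer[Max, 1][{{1., 5.}, {3., 2.}}]   (* {3., 5.} *)

(* Elementwise total of several input arrays *)
TotalLayer[][{{1., 2.}, {3., 4.}}]   (* {4., 6.} *)
```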

Convolutional and Filtering Layers

ConvolutionLayer   DeconvolutionLayer   PoolingLayer   ResizeLayer   SpatialTransformationLayer
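A sketch of a typical convolution and pooling setup (channel counts and sizes are illustrative):

```wl
(* 16 filters of size 3x3 over a 3-channel 32x32 input *)
conv = NetInitialize[ConvolutionLayer[16, {3, 3}, "Input" -> {3, 32, 32}]];
Dimensions[conv[RandomReal[1, {3, 32, 32}]]]   (* {16, 30, 30} *)

(* Max pooling with a 2x2 window and stride 2 halves each spatial dimension *)
PoolingLayer[{2, 2}, "Stride" -> 2]
```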

Recurrent Layers

BasicRecurrentLayer   GatedRecurrentLayer   LongShortTermMemoryLayer
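For example, an LSTM applied to a variable-length sequence of vectors (sizes are illustrative):

```wl
(* An LSTM producing an 8-dimensional state for each element of a sequence *)
lstm = NetInitialize[LongShortTermMemoryLayer[8, "Input" -> {"Varying", 4}]];
Dimensions[lstm[RandomReal[1, {10, 4}]]]   (* {10, 8} *)
```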

Sequence-Handling Layers

UnitVectorLayer embed integers into one-hot vectors

EmbeddingLayer embed integers into trainable vector spaces

AttentionLayer trainable layer for finding parts of a sequence to attend to

SequenceLastLayer   SequenceMostLayer   SequenceRestLayer   SequenceReverseLayer   SequenceIndicesLayer   AppendLayer   PrependLayer
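The two embedding layers in use (vocabulary size and dimensions are illustrative):

```wl
(* One-hot encode an integer into a length-4 unit vector *)
UnitVectorLayer[4][2]

(* Map a vocabulary of 10 integer codes into trainable 5-dimensional vectors *)
emb = NetInitialize[EmbeddingLayer[5, 10]];
Dimensions[emb[{1, 3, 7}]]   (* {3, 5} *)
```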

Training Optimization Layers

DropoutLayer   ImageAugmentationLayer

BatchNormalizationLayer   NormalizationLayer   LocalResponseNormalizationLayer
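Layers like DropoutLayer behave differently during training and evaluation, which can be seen directly (the input is illustrative):

```wl
drop = DropoutLayer[0.5];
drop[{1., 2., 3., 4.}]                                (* identity when evaluating *)
drop[{1., 2., 3., 4.}, NetEvaluationMode -> "Train"]  (* randomly zeroes and rescales *)
```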

Loss Layers

CrossEntropyLossLayer   ContrastiveLossLayer   CTCLossLayer

MeanSquaredLossLayer   MeanAbsoluteLossLayer
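Loss layers take an input and a target and return a scalar loss (values here are illustrative):

```wl
(* Mean of squared differences between input and target *)
MeanSquaredLossLayer[][<|"Input" -> {1., 2.}, "Target" -> {1., 3.}|>]   (* 0.5 *)

(* Negative log-likelihood of a class index under given probabilities *)
CrossEntropyLossLayer["Index"][<|"Input" -> {0.1, 0.7, 0.2}, "Target" -> 2|>]
```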

Higher-Order Network Construction

NetMapOperator map over a sequence

NetMapThreadOperator map over multiple sequences

NetFoldOperator recurrent network that folds in elements of a sequence

NetBidirectionalOperator bidirectional recurrent network

NetNestOperator apply the same operation multiple times
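As a sketch of the mapping operator (sizes are illustrative):

```wl
(* Map a linear layer independently over each element of a sequence *)
map = NetMapOperator[NetInitialize[LinearLayer[2, "Input" -> 3]]];
Dimensions[map[RandomReal[1, {5, 3}]]]   (* {5, 2} *)
```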

Network Composition

NetChain chain composition of net layers

NetGraph graph of net layers

NetPairEmbeddingOperator train a Siamese neural network

NetGANOperator train generative adversarial networks (GAN)
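A minimal NetGraph sketch, wiring four layers into a small classifier (the architecture is illustrative):

```wl
(* A small classifier as an explicit graph of layers *)
net = NetInitialize[NetGraph[
    {LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]},
    {1 -> 2 -> 3 -> 4}, "Input" -> 4]];
net[{0.1, 0.2, 0.3, 0.4}]   (* a length-2 probability vector *)
```

NetChain suffices for purely sequential compositions like this one; NetGraph becomes necessary when layers have multiple inputs or outputs, such as attention or loss-bearing graphs.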
