public class BatchNormalization extends BaseLayer<BatchNormalization>

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Field Summary

| Modifier and Type | Field and Description |
|---|---|
| protected int | index |
| protected List<IterationListener> | listeners |
| protected org.nd4j.linalg.api.ndarray.INDArray | std |
| protected org.nd4j.linalg.api.ndarray.INDArray | xHat |
| protected org.nd4j.linalg.api.ndarray.INDArray | xMu |
Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, input, iterationCount, iterationListeners, maskArray, maskState, preOutput

Constructor Summary

| Constructor and Description |
|---|
| BatchNormalization(NeuralNetConfiguration conf) |
Method Summary

| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) Trigger an activation with the last specified input |
| org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training) Initialize the layer with the given input and return the activation for this layer given this input |
| org.nd4j.linalg.api.ndarray.INDArray | activate(Layer.TrainingMode training) Trigger an activation with the last specified input |
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer |
| double | calcL1(boolean backpropParamsOnly) Calculate the l1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly) Calculate the l2 regularization term; 0.0 if regularization is not used. |
| Layer | clone() Clone the layer |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray data) Fit the model to the given data |
| int | getIndex() Get the layer index. |
| Collection<IterationListener> | getListeners() Get the iteration listeners for this layer. |
| int[] | getShape(org.nd4j.linalg.api.ndarray.INDArray x) |
| Gradient | gradient() Get the gradient. |
| boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc) |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x) Classify input |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training) Raw activations |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training) Raw activations |
| void | setIndex(int index) Set the layer index. |
| void | setListeners(IterationListener... listeners) Set the iteration listeners for this layer. |
| Layer | transpose() Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights) |
| Layer.Type | type() Returns the layer type |
Methods inherited from class BaseLayer: accumulateScore, clear, clearNoiseWeightParams, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, hasBias, initParams, iterate, layerConf, numParams, params, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, activate, activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, batchSize, conf, feedForwardMaskArray, getInput, getInputMiniBatchSize, getMaskArray, gradientAndScore, init, input, layerId, migrateInput, numParams, setCacheMode, setConf, setInput, setInputMiniBatchSize, setListeners, setMaskArray, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount

Field Detail

protected int index
protected List<IterationListener> listeners
protected org.nd4j.linalg.api.ndarray.INDArray std
protected org.nd4j.linalg.api.ndarray.INDArray xMu
protected org.nd4j.linalg.api.ndarray.INDArray xHat
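The three INDArray fields cache intermediate terms of the standard batch-normalization forward pass (Ioffe & Szegedy, 2015): xMu holds the mean-centered input, std the per-feature standard deviation, and xHat the normalized input. The following is a minimal ND4J sketch of how these quantities relate; it is illustrative only, and the actual implementation differs (per-channel statistics for convolutions, moving averages for inference, and a configured epsilon):

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.ops.transforms.Transforms;

public class BatchNormForwardSketch {
    public static void main(String[] args) {
        // Hypothetical mini-batch: 4 examples x 3 features
        INDArray x = Nd4j.rand(4, 3);
        double eps = 1e-5; // numerical-stability constant (assumed value)

        INDArray mean = x.mean(0);                    // per-feature batch mean
        INDArray var = x.var(false, 0);               // per-feature batch variance
        INDArray std = Transforms.sqrt(var.add(eps)); // corresponds to the 'std' field

        INDArray xMu = x.subRowVector(mean);          // 'xMu': x - mean
        INDArray xHat = xMu.divRowVector(std);        // 'xHat': normalized input

        // The layer output is then gamma * xHat + beta (learned scale and shift).
        System.out.println(xHat);
    }
}
```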
Constructor Detail

public BatchNormalization(NeuralNetConfiguration conf)
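This constructor is normally invoked by the framework when a network is initialized; in user code the layer is typically declared through the configuration API instead. A hedged sketch of such a configuration, assuming the conf-side builder class org.deeplearning4j.nn.conf.layers.BatchNormalization, with illustrative layer sizes:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BatchNormConfigSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100).build())
                // Normalize the dense layer's activations; sizes set explicitly for clarity
                .layer(1, new org.deeplearning4j.nn.conf.layers.BatchNormalization.Builder()
                        .nIn(100).nOut(100).build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(100).nOut(10).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // the framework constructs this layer implementation internally
    }
}
```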
Method Detail

public double calcL2(boolean backpropParamsOnly)
Calculate the l2 regularization term; 0.0 if regularization is not used.
Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<BatchNormalization>
Parameters: backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)
Calculate the l1 regularization term; 0.0 if regularization is not used.
Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<BatchNormalization>
Parameters: backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public Layer.Type type()
Returns the layer type
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<BatchNormalization>
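For instance, the two terms can be summed into a single penalty when reporting a regularized score. A minimal sketch; the helper name is hypothetical:

```java
import org.deeplearning4j.nn.api.Layer;

class RegPenaltySketch {
    // Hypothetical helper: total regularization penalty for one layer,
    // computed over backprop params only (pass false to include pretrain params).
    static double regPenalty(Layer layer) {
        return layer.calcL1(true) + layer.calcL2(true); // each is 0.0 if unused
    }
}
```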
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Calculate the gradient relative to the error in the next layer
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<BatchNormalization>
Parameters: epsilon - w^(L+1) * delta^(L+1); equivalently dC/da, i.e. (dC/dz)*(dz/da), where C is the cost function and a = sigma(z) is the activation.
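In a manual backward pass, the returned pair carries both the parameter gradients and the epsilon to propagate to the layer below. A minimal sketch, assuming the layer holds an input from a prior forward pass and that epsilon has a matching shape:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.primitives.Pair;

class BackpropSketch {
    // 'layer' is assumed to hold an input from a prior forward pass;
    // 'epsilon' (dC/da) comes from the layer above.
    static INDArray backwardStep(Layer layer, INDArray epsilon) {
        Pair<Gradient, INDArray> result = layer.backpropGradient(epsilon);
        Gradient paramGradients = result.getFirst(); // gradients for this layer's parameters
        return result.getSecond();                   // dC/da for the layer below
    }
}
```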
public void fit(org.nd4j.linalg.api.ndarray.INDArray data)
Fit the model to the given data
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<BatchNormalization>
Parameters: data - the data to fit the model to

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Trigger an activation with the last specified input
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters: training - training or test mode
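The training flag matters for batch normalization: in training mode activations are normalized with the current mini-batch statistics, while in test mode the estimates accumulated for inference are used. A minimal sketch, assuming an initialized layer and an illustrative input batch:

```java
import org.deeplearning4j.nn.api.Layer;
import org.nd4j.linalg.api.ndarray.INDArray;

class ActivateSketch {
    // 'features' is an illustrative mini-batch for an already-initialized layer
    static void demo(Layer layer, INDArray features) {
        layer.setInput(features);
        INDArray trainOut = layer.activate(true);  // training: mini-batch statistics
        INDArray testOut = layer.activate(false);  // test: accumulated estimates
    }
}
```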
public Gradient gradient()
Get the gradient. See also: Model.computeGradientAndScore().
Specified by: gradient in interface Model
Overrides: gradient in class BaseLayer<BatchNormalization>

public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x)
Classify input
Specified by: preOutput in interface Layer
Overrides: preOutput in class AbstractLayer<BatchNormalization>
Parameters: x - the input (can either be a matrix or vector). If it's a matrix, each row is considered an example and associated rows are classified accordingly. Each row will be the likelihood of a label given that example
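preOutput returns the raw (pre-activation-function) transform; for this layer that is the normalized, scaled, and shifted input. A minimal sketch with an illustrative input batch:

```java
import org.deeplearning4j.nn.api.Layer;
import org.nd4j.linalg.api.ndarray.INDArray;

class PreOutputSketch {
    // 'x' is an illustrative input batch; 'true' requests training-mode statistics.
    // Applying the layer's activation function to the result yields activate(...)'s output.
    static INDArray rawTransform(Layer layer, INDArray x) {
        return layer.preOutput(x, true);
    }
}
```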
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training)
Raw activations
Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<BatchNormalization>
Parameters: x - the input to transform

public org.nd4j.linalg.api.ndarray.INDArray activate(Layer.TrainingMode training)
Trigger an activation with the last specified input
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters: training - training or test mode

public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training)
Initialize the layer with the given input and return the activation for this layer given this input
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters: input - the input to use; training - train or test mode

public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)
Raw activations
Specified by: preOutput in interface Layer
Overrides: preOutput in class AbstractLayer<BatchNormalization>
Parameters: x - the input to transform

public Layer transpose()
Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights)
Specified by: transpose in interface Layer
Overrides: transpose in class BaseLayer<BatchNormalization>

public Layer clone()
Clone the layer
Specified by: clone in interface Layer
Overrides: clone in class BaseLayer<BatchNormalization>

public Collection<IterationListener> getListeners()
Get the iteration listeners for this layer.
Specified by: getListeners in interface Layer
Overrides: getListeners in class AbstractLayer<BatchNormalization>

public void setListeners(IterationListener... listeners)
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class AbstractLayer<BatchNormalization>

public void setIndex(int index)
Set the layer index.
Specified by: setIndex in interface Layer
Overrides: setIndex in class AbstractLayer<BatchNormalization>

public int getIndex()
Get the layer index.
Specified by: getIndex in interface Layer
Overrides: getIndex in class AbstractLayer<BatchNormalization>

public boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
Specified by: isPretrainLayer in interface Layer

public int[] getShape(org.nd4j.linalg.api.ndarray.INDArray x)