| Package | Description |
|---|---|
| greycat.ml.neuralnet.layer | |
| greycat.ml.neuralnet.loss | |
| greycat.ml.neuralnet.process | |
| Modifier and Type | Method and Description |
|---|---|
| ExMatrix | Layer.forward(ExMatrix input, ProcessGraph g) |
| ExMatrix[] | Layer.getLayerParameters() |
| Modifier and Type | Method and Description |
|---|---|
| ExMatrix | Layer.forward(ExMatrix input, ProcessGraph g) |
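Both tables above center on Layer.forward, which maps an input ExMatrix to an output ExMatrix through a ProcessGraph. The concrete Layer implementations are not shown on this page, so the following is only a plain-Java sketch of what a dense (fully connected) layer's forward pass typically computes, with double arrays standing in for ExMatrix and sigmoid standing in for the Activation (both stand-ins are assumptions, not the library's API):

```java
// Hypothetical sketch of a dense layer's forward pass: out = sigmoid(W * input + b).
// double[] / double[][] stand in for ExMatrix; gradient bookkeeping is omitted.
class DenseLayerSketch {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // weights: rows x cols, input: column vector of length cols, bias: length rows
    static double[] forward(double[][] weights, double[] bias, double[] input) {
        double[] out = new double[weights.length];
        for (int r = 0; r < weights.length; r++) {
            double sum = bias[r];
            for (int c = 0; c < input.length; c++) {
                sum += weights[r][c] * input[c];
            }
            out[r] = sigmoid(sum);
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] w = {{1.0, 0.0}, {0.0, 1.0}};
        double[] b = {0.0, 0.0};
        double[] y = forward(w, b, new double[]{0.0, 0.0});
        // sigmoid(0) = 0.5 for every output element
        assert Math.abs(y[0] - 0.5) < 1e-9 && Math.abs(y[1] - 0.5) < 1e-9;
    }
}
```

In the real API, getLayerParameters() would return the trainable matrices (here, the weights and bias) so an optimizer can update them.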
| Modifier and Type | Method and Description |
|---|---|
| void | Loss.backward(ExMatrix actualOutput, ExMatrix targetOutput) |
| void | AbstractValue.backward(ExMatrix actualOutput, ExMatrix targetOutput) |
| DMatrix | Loss.forward(ExMatrix actualOutput, ExMatrix targetOutput) |
| DMatrix | AbstractValue.forward(ExMatrix actualOutput, ExMatrix targetOutput) |
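The Loss contract above pairs a forward method (compute the loss values) with a backward method (seed the gradients). The concrete loss functions are not listed here, so this is a hedged plain-Java sketch of that contract using an assumed sum-of-squares loss, with double arrays in place of ExMatrix and DMatrix:

```java
// Hypothetical sketch of the Loss.forward / Loss.backward pair for a
// sum-of-squares loss; the 0.5*(a-t)^2 formula is an assumption.
class SumOfSquaresSketch {
    // forward: per-element loss 0.5 * (actual - target)^2
    static double[] forward(double[] actual, double[] target) {
        double[] loss = new double[actual.length];
        for (int i = 0; i < actual.length; i++) {
            double d = actual[i] - target[i];
            loss[i] = 0.5 * d * d;
        }
        return loss;
    }

    // backward: gradient w.r.t. the actual output, d/da = actual - target.
    // The real API returns void and writes into actualOutput's derivative buffer.
    static double[] backward(double[] actual, double[] target) {
        double[] grad = new double[actual.length];
        for (int i = 0; i < actual.length; i++) {
            grad[i] = actual[i] - target[i];
        }
        return grad;
    }

    public static void main(String[] args) {
        assert Math.abs(forward(new double[]{1.0}, new double[]{0.0})[0] - 0.5) < 1e-9;
        assert Math.abs(backward(new double[]{1.0}, new double[]{0.0})[0] - 1.0) < 1e-9;
    }
}
```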
| Modifier and Type | Method and Description |
|---|---|
| ExMatrix | ProcessGraph.activate(Activation activation, ExMatrix input) |
| ExMatrix | ProcessGraph.add(ExMatrix matA, ExMatrix matB) |
| ExMatrix | ProcessGraph.concatVectors(ExMatrix matA, ExMatrix matB) |
| static ExMatrix | ExMatrix.createFromW(DMatrix w) |
| ExMatrix | ProcessGraph.elmul(ExMatrix matA, ExMatrix matB) |
| static ExMatrix | ExMatrix.empty(int rows, int column) |
| ExMatrix | ProcessGraph.expand(ExMatrix matA, int numOfCol) |
| ExMatrix | ProcessGraph.mul(ExMatrix matA, ExMatrix matB) |
| ExMatrix | ProcessGraph.oneMinus(ExMatrix matA) |
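Several of the ProcessGraph operations above have conventional element-wise semantics suggested by their names. The following plain-Java sketch illustrates that assumed behaviour for add, elmul, and oneMinus on double arrays; the real methods additionally record each operation on the graph so gradients can flow back through it, and that bookkeeping is omitted here:

```java
// Hypothetical sketch of element-wise ProcessGraph operations on double[]
// stand-ins for ExMatrix (gradient recording omitted).
class GraphOpsSketch {
    // add: element-wise sum of two equally sized matrices
    static double[] add(double[] a, double[] b) {
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) out[i] = a[i] + b[i];
        return out;
    }

    // elmul: element-wise (Hadamard) product, as opposed to mul's matrix product
    static double[] elmul(double[] a, double[] b) {
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) out[i] = a[i] * b[i];
        return out;
    }

    // oneMinus: 1 - x for every element, common in gated recurrent cells
    static double[] oneMinus(double[] a) {
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) out[i] = 1.0 - a[i];
        return out;
    }

    public static void main(String[] args) {
        assert add(new double[]{1, 2}, new double[]{3, 4})[1] == 6.0;
        assert elmul(new double[]{2, 3}, new double[]{4, 5})[0] == 8.0;
        assert oneMinus(new double[]{0.25})[0] == 0.75;
    }
}
```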
| Modifier and Type | Method and Description |
|---|---|
| ExMatrix | ProcessGraph.activate(Activation activation, ExMatrix input) |
| ExMatrix | ProcessGraph.add(ExMatrix matA, ExMatrix matB) |
| DMatrix | ProcessGraph.applyLoss(Loss lossUnit, ExMatrix actualOutput, ExMatrix targetOutput, boolean calcForwardLoss) |
| ExMatrix | ProcessGraph.concatVectors(ExMatrix matA, ExMatrix matB) |
| ExMatrix | ProcessGraph.elmul(ExMatrix matA, ExMatrix matB) |
| ExMatrix | ProcessGraph.expand(ExMatrix matA, int numOfCol) |
| ExMatrix | ProcessGraph.mul(ExMatrix matA, ExMatrix matB) |
| ExMatrix | ProcessGraph.oneMinus(ExMatrix matA) |
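The one method unique to this table is applyLoss. Judging from its signature alone, it ties the Loss unit's forward and backward steps together, with calcForwardLoss controlling whether the forward loss value is actually computed. That reading is an assumption; the plain-Java sketch below illustrates it with an assumed 0.5*(a-t)^2 loss and double arrays in place of ExMatrix:

```java
// Hypothetical sketch of ProcessGraph.applyLoss semantics: always seed the
// backward gradient, optionally compute the forward loss. The loss formula,
// the stored-gradient field, and the null return are all assumptions.
class ApplyLossSketch {
    static double[] lastGradient; // stands in for the gradient the real graph records

    static double[] applyLoss(double[] actual, double[] target, boolean calcForwardLoss) {
        lastGradient = new double[actual.length];
        double[] loss = calcForwardLoss ? new double[actual.length] : null;
        for (int i = 0; i < actual.length; i++) {
            double d = actual[i] - target[i];
            lastGradient[i] = d;                       // backward: d/da of 0.5*d^2
            if (calcForwardLoss) loss[i] = 0.5 * d * d; // forward loss, only on request
        }
        return loss;
    }

    public static void main(String[] args) {
        double[] loss = applyLoss(new double[]{2.0}, new double[]{1.0}, true);
        assert Math.abs(loss[0] - 0.5) < 1e-9;
        assert Math.abs(lastGradient[0] - 1.0) < 1e-9;
        // with calcForwardLoss = false, only the gradient is produced
        assert applyLoss(new double[]{2.0}, new double[]{1.0}, false) == null;
    }
}
```

Skipping the forward loss is a common training-loop optimization: the gradient is all that backpropagation needs, and the scalar loss is only useful for reporting.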
Copyright © 2017. All rights reserved.