NN
model-list
model-list
| Option | Value |
| --- | --- |
| Constructor: | `(model-list model-args &aux (mlist model-args))` |
| Predicate: | `model-list-p` |
| Copier: | `copy-model-list` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
cl-waffe's Model: model-list
This structure is a cl-waffe object.
- Overview
Defines models sequentially, e.g. given x = (model-list (layer1) (layer2)), (call x 1 tensor) returns layer1's output.
- How to Initialize
(model-list model1 model2 ...) => [MODEL: model-list]
- Forward
index is the position of the model to call; args are the arguments passed to the index-th model.
- Call Forward
(call (model-list) index &rest args)
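A minimal usage sketch (not part of the original docstring): the layer sizes and batch size are arbitrary illustration values, and the package-qualified names (cl-waffe:model-list, cl-waffe.nn:linearlayer, cl-waffe:call, cl-waffe:!randn) are assumed to be exported as listed elsewhere in this reference.

```lisp
;; Hedged sketch: wrap two LinearLayers in a model-list, then forward a tensor
;; through one of them. Sizes (10, 32, 1) and the batch size (8) are arbitrary.
(defparameter *layers*
  (cl-waffe:model-list
   (cl-waffe.nn:linearlayer 10 32)
   (cl-waffe.nn:linearlayer 32 1)))

;; (call (model-list) index &rest args); per the docstring above, index 1
;; selects layer1, which here expects a (batch-size 10) input.
(cl-waffe:call *layers* 1 (cl-waffe:!randn '(8 10)))
```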
- Object's slots
| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe::model-list-hide-from-tree` | `nil` |
| forward | boolean | nil | `cl-waffe::model-list-forward` | `t` |
| backward | boolean | nil | `cl-waffe::model-list-backward` | `nil` |
| parameters | t | nil | `cl-waffe::model-list-parameters` | `'(cl-waffe::mlist)` |
| mlist | t | nil | `cl-waffe::model-list-mlist` | `cl-waffe::model-args` |
LinearLayer
linearlayer
| Option | Value |
| --- | --- |
| Constructor: | `` (linearlayer in-features out-features &optional (bias t) &aux (weight (parameter (!mul 0.01 (!randn `(,in-features ,out-features))))) (bias (if bias (parameter (!zeros `(,out-features 1))) nil))) `` |
| Predicate: | `linearlayer-p` |
| Copier: | `copy-linearlayer` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
Calling a LinearLayer applies a linear transformation to the incoming data: y = xA + b.
- Args: in-features (fixnum), out-features (fixnum), bias (boolean)
- Input: x (Tensor) of shape (batch-size in-features)
- Output: a tensor of shape (batch-size out-features)
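A hedged sketch of creating and calling a LinearLayer; the feature and batch sizes are arbitrary illustration values, and call/!randn are assumed to be exported from cl-waffe as shown elsewhere in this reference.

```lisp
;; Hedged sketch: a LinearLayer computing y = xA + b with 784 inputs and 10 outputs.
(defparameter *fc* (cl-waffe.nn:linearlayer 784 10 t))

;; Input of shape (batch-size in-features) -> output of shape (batch-size out-features).
(cl-waffe:call *fc* (cl-waffe:!randn '(32 784)))
```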
- Object's slots

| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe.nn::linearlayer-hide-from-tree` | `nil` |
| forward | boolean | nil | `cl-waffe.nn::linearlayer-forward` | `t` |
| backward | boolean | nil | `cl-waffe.nn::linearlayer-backward` | `nil` |
| parameters | t | nil | `cl-waffe.nn::linearlayer-parameters` | `'(cl-waffe.nn::weight cl-waffe.nn::bias)` |
| weight | cl-waffe:waffetensor | nil | `cl-waffe.nn::linearlayer-weight` | `` (cl-waffe:parameter (cl-waffe:!mul 0.01 (cl-waffe:!randn `(,cl-waffe.nn::in-features ,cl-waffe.nn::out-features)))) `` |
| bias | t | nil | `cl-waffe.nn::linearlayer-bias` | `` (if cl-waffe.nn::bias (cl-waffe:parameter (cl-waffe:!zeros `(,cl-waffe.nn::out-features 1))) nil) `` |
DenseLayer
denselayer
| Option | Value |
| --- | --- |
| Constructor: | `(denselayer in-features out-features &optional (bias t) (activation relu) &aux (layer (linearlayer in-features out-features bias)) (activation activation))` |
| Predicate: | `denselayer-p` |
| Copier: | `copy-denselayer` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
Calling a DenseLayer applies a LinearLayer followed by an activation.
- Args: in-features (fixnum), out-features (fixnum), bias (boolean) (see LinearLayer's documentation), activation (symbol or function). When activation is a symbol it must be one of :relu, :sigmoid, :tanh, or :softmax; when it is a function, that function is called as the activation.
- Input: x (Tensor) of shape (batch-size in-features)
- Output: a tensor of shape (batch-size out-features)
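A hedged sketch using the activation argument described above; the sizes are illustration values, and denselayer is assumed to be exported from cl-waffe.nn like linearlayer.

```lisp
;; Hedged sketch: LinearLayer (784 -> 256) followed by a :tanh activation.
(defparameter *dense* (cl-waffe.nn:denselayer 784 256 t :tanh))

;; Input (batch-size in-features) -> output (batch-size out-features).
(cl-waffe:call *dense* (cl-waffe:!randn '(32 784)))
```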
- Object's slots

| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe.nn::denselayer-hide-from-tree` | `nil` |
| forward | boolean | nil | `cl-waffe.nn::denselayer-forward` | `t` |
| backward | boolean | nil | `cl-waffe.nn::denselayer-backward` | `nil` |
| parameters | t | nil | `cl-waffe.nn::denselayer-parameters` | `'(cl-waffe.nn::layer cl-waffe.nn::activation)` |
| layer | t | nil | `cl-waffe.nn::denselayer-layer` | `(cl-waffe.nn:linearlayer cl-waffe.nn::in-features cl-waffe.nn::out-features cl-waffe.nn::bias)` |
| activation | t | nil | `cl-waffe.nn::denselayer-activation` | `cl-waffe.nn::activation` |
Dropout
dropout
| Option | Value |
| --- | --- |
| Constructor: | `(dropout &optional (dropout-rate 0.5) &aux (dropout-rate (if (and (> dropout-rate 0.0) (< dropout-rate 1.0)) dropout-rate (error "cl-waffe.nn: Dropout(x), x must be in the range of 0.0<x<1.0 where x is a single-float."))) (mask t))` |
| Predicate: | `dropout-p` |
| Copier: | `copy-dropout` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
cl-waffe's Node: Dropout
This structure is a cl-waffe object.
- Overview
Nothing
- Note
Todo: docstring
- How to Initialize
(dropout &optional (dropout-rate 0.5)) => [NODE: Dropout]
- Forward
Nothing
- Call Forward
(call (Dropout) ...)
- Backward description
Nothing
- Call Backward
(call-backward (Dropout) dy)
Note that parameters in a node won't be updated.
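Since the docstring above is still a TODO, the following is only a hedged sketch: it assumes the forward call takes a single input tensor, and the rate and shapes are illustration values.

```lisp
;; Hedged sketch: a Dropout node with rate 0.25, applied to a random tensor.
;; Assumes the forward call takes a single input tensor.
(defparameter *drop* (cl-waffe.nn:dropout 0.25))

(cl-waffe:call *drop* (cl-waffe:!randn '(32 256)))
```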
- Object's slots
| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe.nn::dropout-hide-from-tree` | `t` |
| forward | boolean | nil | `cl-waffe.nn::dropout-forward` | `t` |
| backward | boolean | nil | `cl-waffe.nn::dropout-backward` | `t` |
| parameters | t | nil | `cl-waffe.nn::dropout-parameters` | `'(cl-waffe.nn::dropout-rate cl-waffe.nn::mask)` |
| dropout-rate | single-float | nil | `cl-waffe.nn::dropout-dropout-rate` | `(if (and (> cl-waffe.nn::dropout-rate 0.0) (< cl-waffe.nn::dropout-rate 1.0)) cl-waffe.nn::dropout-rate (error "cl-waffe.nn: Dropout(x), x must be in the range of 0.0<x<1.0 where x is a single-float."))` |
| mask | t | nil | `cl-waffe.nn::dropout-mask` | `t` |
BatchNorm2d
batchnorm2d
| Option | Value |
| --- | --- |
| Constructor: | `(batchnorm2d in-features &key (affine t) (epsilon 1.0e-7) &aux (affine (if affine (linearlayer in-features in-features t) t)) (epsilon epsilon))` |
| Predicate: | `batchnorm2d-p` |
| Copier: | `copy-batchnorm2d` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
cl-waffe's Model: BatchNorm2d
This structure is a cl-waffe object.
- Overview
Nothing
- Note
Todo: docs
- How to Initialize
(BatchNorm2d in-features &key (affine t) (epsilon 1.0e-7)) => [MODEL: BatchNorm2d]
- Forward
Nothing
- Call Forward
(call (BatchNorm2d) ...)
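The documentation above is a TODO, so this is only a hedged sketch: it assumes the forward call takes a single (batch-size in-features) tensor, and the sizes are illustration values.

```lisp
;; Hedged sketch: BatchNorm2d over 64 features, with the default affine
;; (LinearLayer) transform. Assumes the forward call takes a single tensor.
(defparameter *bn* (cl-waffe.nn:batchnorm2d 64 :affine t :epsilon 1.0e-7))

(cl-waffe:call *bn* (cl-waffe:!randn '(32 64)))
```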
- Object's slots
| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe.nn::batchnorm2d-hide-from-tree` | `nil` |
| forward | boolean | nil | `cl-waffe.nn::batchnorm2d-forward` | `t` |
| backward | boolean | nil | `cl-waffe.nn::batchnorm2d-backward` | `nil` |
| parameters | t | nil | `cl-waffe.nn::batchnorm2d-parameters` | `'(cl-waffe.nn::affine cl-waffe.nn::epsilon)` |
| affine | t | nil | `cl-waffe.nn::batchnorm2d-affine` | `(if cl-waffe.nn::affine (cl-waffe.nn:linearlayer cl-waffe.nn::in-features cl-waffe.nn::in-features t) t)` |
| epsilon | float | nil | `cl-waffe.nn::batchnorm2d-epsilon` | `cl-waffe.nn::epsilon` |
LayerNorm
Embedding
embedding
| Option | Value |
| --- | --- |
| Constructor: | `` (embedding vocab-size embedding-dim &key (pad-idx nil) &aux (vocab-size vocab-size) (embedding-dim embedding-dim) (padding-idx (if pad-idx (const pad-idx) (const -1))) (weights (parameter (!mul 0.01 (!randn `(,vocab-size ,embedding-dim)))))) `` |
| Predicate: | `embedding-p` |
| Copier: | `copy-embedding` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
cl-waffe's Model: Embedding
This structure is a cl-waffe object.
- Overview
Embedding
- How to Initialize
(Embedding vocab-size embedding-dim &key (pad-idx nil)) => [MODEL: Embedding]
- Forward
Emm
- Call Forward
(call (Embedding) x)
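A hedged sketch of the forward call shown above: the vocabulary size, embedding dimension, and input shape are illustration values, and x is assumed to be a (batch-size sequence-length) tensor of token indices (an all-zero tensor stands in for real token ids here).

```lisp
;; Hedged sketch: an Embedding with vocab-size 100 and embedding-dim 16,
;; treating token id 0 as padding. x is assumed to hold token indices.
(defparameter *emb* (cl-waffe.nn:embedding 100 16 :pad-idx 0))

(cl-waffe:call *emb* (cl-waffe:!zeros '(8 12)))
```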
- Object's slots
| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe.nn::embedding-hide-from-tree` | `nil` |
| forward | boolean | nil | `cl-waffe.nn::embedding-forward` | `t` |
| backward | boolean | nil | `cl-waffe.nn::embedding-backward` | `nil` |
| parameters | t | nil | `cl-waffe.nn::embedding-parameters` | `'(cl-waffe.nn::vocab-size cl-waffe.nn::embedding-dim cl-waffe.nn::padding-idx cl-waffe.nn::weights)` |
| vocab-size | fixnum | nil | `cl-waffe.nn::embedding-vocab-size` | `cl-waffe.nn::vocab-size` |
| embedding-dim | fixnum | nil | `cl-waffe.nn::embedding-embedding-dim` | `cl-waffe.nn::embedding-dim` |
| padding-idx | cl-waffe:waffetensor | nil | `cl-waffe.nn::embedding-padding-idx` | `(if cl-waffe.nn::pad-idx (cl-waffe:const cl-waffe.nn::pad-idx) (cl-waffe:const -1))` |
| weights | t | nil | `cl-waffe.nn::embedding-weights` | `` (cl-waffe:parameter (cl-waffe:!mul 0.01 (cl-waffe:!randn `(,cl-waffe.nn::vocab-size ,cl-waffe.nn::embedding-dim)))) `` |
RNN
rnn
| Option | Value |
| --- | --- |
| Constructor: | `(rnn input-size hidden-size &key (num-layers 1) (activation tanh) (bias nil) (dropout nil) (biredical nil) &aux (rnn-layers (model-list (loop for i upfrom 0 below num-layers collect (rnnhiddenlayer input-size hidden-size nil :activation activation :bias bias :dropout dropout)))) (num-layers num-layers) (hidden-size hidden-size) (biredical biredical) (wo (linearlayer hidden-size hidden-size)))` |
| Predicate: | `rnn-p` |
| Copier: | `copy-rnn` |
| Print Function: | `(lambda (m stream k) (declare (ignore k)) (render-simple-model-structure stream m))` |
cl-waffe's Model: RNN
This structure is a cl-waffe object.
- Overview
Todo: docs
- How to Initialize
(RNN input-size hidden-size &key (num-layers 1) (activation tanh) (bias nil) (dropout nil) (biredical nil)) => [MODEL: RNN]
- Forward
Nothing
- Call Forward
(call (RNN) ...)
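The documentation above is a TODO, so the following is only a hedged sketch: it assumes the forward call takes a single input tensor, and the sizes and the (batch-size sequence-length input-size) shape are assumptions rather than documented behavior.

```lisp
;; Hedged sketch: a 2-layer RNN with input-size 64 and hidden-size 128.
;; The input shape below is an assumption, not taken from the docstring above.
(defparameter *rnn* (cl-waffe.nn:rnn 64 128 :num-layers 2))

(cl-waffe:call *rnn* (cl-waffe:!randn '(8 10 64)))
```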
- Object's slots
| Slot | Type | Read Only | Accessor | Initform |
| --- | --- | --- | --- | --- |
| hide-from-tree | boolean | nil | `cl-waffe.nn::rnn-hide-from-tree` | `nil` |
| forward | boolean | nil | `cl-waffe.nn::rnn-forward` | `t` |
| backward | boolean | nil | `cl-waffe.nn::rnn-backward` | `nil` |
| parameters | t | nil | `cl-waffe.nn::rnn-parameters` | `'(cl-waffe.nn::rnn-layers cl-waffe.nn::num-layers cl-waffe.nn::hidden-size cl-waffe.nn::biredical cl-waffe.nn::wo)` |
| rnn-layers | t | nil | `cl-waffe.nn::rnn-rnn-layers` | `(cl-waffe:model-list (loop for cl-waffe.nn::i upfrom 0 below cl-waffe.nn::num-layers collect (cl-waffe.nn::rnnhiddenlayer cl-waffe.nn::input-size cl-waffe.nn::hidden-size nil :activation cl-waffe.nn::activation :bias cl-waffe.nn::bias :dropout cl-waffe.nn:dropout)))` |
| num-layers | t | nil | `cl-waffe.nn::rnn-num-layers` | `cl-waffe.nn::num-layers` |
| hidden-size | t | nil | `cl-waffe.nn::rnn-hidden-size` | `cl-waffe.nn::hidden-size` |
| biredical | t | nil | `cl-waffe.nn::rnn-biredical` | `cl-waffe.nn::biredical` |
| wo | t | nil | `cl-waffe.nn::rnn-wo` | `(cl-waffe.nn:linearlayer cl-waffe.nn::hidden-size cl-waffe.nn::hidden-size)` |
LSTM
GRU
MaxPooling
AvgPooling
Conv1D
Conv2D
Transformer
TransformerEncoderLayer
TransformerDecoderLayer
CrossEntropy
cross-entropy
(x y &optional (delta 1.0e-7) (epsilon 0.0))
SoftMaxCrossEntropy
softmax-cross-entropy
(x y &key (avoid-overflow t) (delta 1.0e-7) (epsilon 0.0))
MSE
mse
(p y)
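A hedged sketch of using the loss functions listed above; the shapes are illustration values, the functions are assumed to be exported from cl-waffe.nn, and y is assumed to be given in the same shape as the predictions (e.g. one-hot targets for softmax-cross-entropy).

```lisp
;; Hedged sketch: compute MSE and softmax cross-entropy between a random
;; prediction of shape (32 10) and an all-zero target of the same shape.
(let ((p (cl-waffe:!randn '(32 10)))
      (y (cl-waffe:!zeros '(32 10))))
  (list (cl-waffe.nn:mse p y)
        (cl-waffe.nn:softmax-cross-entropy p y)))
```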