cl-waffe

Extend the library

All features of cl-waffe are exported for users, so they can extend the library as they wish.

The first section describes defmodel/deftrainer, which will be the most frequently used macros.

The remaining sections describe defnode/defoptimizer/defdataset, which may be a little more difficult for users to understand.

defmodel

To put it simply, defmodel is to cl-waffe what class Model(nn.Module): is to PyTorch.

Internally, defmodel is just a macro that expands into defstruct, but the resulting models can be used in a CLOS style.

(defmodel name args &key (parameters nil) forward (optimize nil) (document "An model, defined by cl-waffe"))

This macro defines a cl-waffe model named name.

At the same time, a constructor name is defined, so you can initialize your model like this:

(cl-waffe.nn:LinearLayer 100 20) ; => [Model: Linearlayer]

name
The name of your model, which is also the name of its constructor.
args
The arguments the constructor takes.
parameters

The parameters your model has.

Every time you initialize the model, the parameters are initialized.

Note that defmodel behaves like a class: every instance gets its own parameters.

The format of each parameter is the same as a defstruct slot definition.

Format Example: ((param-name param-initial-value &key (type your-type)))

optimize
When t, your forward slot is compiled with (declare (optimize (speed 3) (space 0) (debug 0))). This speeds up training once you have finished debugging. (A minimal sketch combining this with :parameters follows this list.)
forward

Define the forward propagation of your model here.

During backward, automatic differentiation is applied.
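
Putting the pieces together, a minimal sketch combining :parameters and :optimize might look like this. Scale, its factor slot, and passing a plain number to !mul are illustrative assumptions, not taken from the library's examples:

(defmodel Scale (initial-factor)
  :optimize t
  :parameters ((factor initial-factor :type single-float))
  :forward ((x)
            ;; scale the input by the factor slot; this assumes !mul
            ;; accepts a plain number as one of its operands
            (!mul (self factor) x)))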

As a fuller example, here is the encoder of a seq2seq model:

(defmodel Encoder (vocab-size embedding-dim hidden-size)
  :parameters ((embedding (Embedding vocab-size embedding-dim :pad-idx 0))
               (layer     (RNN embedding-dim hidden-size :num-layers 1)))
  :forward ((x)
	    (with-calling-layers x
	      (embedding x)
	      (layer x))))

(defmodel Encoder (vocab-size embedding-dim hidden-size) ~) says that the constructor is (Encoder vocab-size embedding-dim hidden-size), and that these arguments are available when initializing :parameters.

:parameters holds the parameters of each instance; that is, each time the model is initialized, (Embedding ~) and (RNN ~) are created and stored in the Encoder's embedding and layer slots.

:forward defines the forward step.

There is no need to define :backward, since automatic differentiation is enabled inside defmodel. This means that in defmodel's :forward, all calculations must be done with cl-waffe's APIs; otherwise the computation node will be broken.
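
For instance, a model whose forward step stays inside cl-waffe's operators keeps the node intact. AddBias is a hypothetical name, and parameter is assumed here to mark a tensor as trainable:

(defmodel AddBias (n)
  :parameters ((bias (parameter (!zeros `(1 ,n)))))
  :forward ((x)
            ;; !add is a cl-waffe operator, so this step is recorded
            ;; on the computation node and bias can receive gradients
            (!add x (self bias))))

If the addition were instead done by looping over raw arrays, the computation node would be broken and backward could not reach bias.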

Initialize and call the model

Let's create an encoder and call its forward.

(setq model (Encoder 10 16 10))
;[Model: ENCODER]

(call model (!ones `(10 10)))
;#Const((((-2.31... 3.048... ~ 2.551... -2.98...)         
;                   ...
;         (-2.31... 3.048... ~ 2.551... -2.98...))        
;                 ...
;        ((-2.31... 3.048... ~ 2.551... -2.98...)         
;                   ...
;         (-2.31... 3.048... ~ 2.551... -2.98...))) :mgl t :shape (10 10 10))

(backward (!sum *))
; NIL
; Backward process is done correctly!

CLOS Style

(This is available in the other macros as well.)

Naturally, you can also define methods on each cl-waffe object.

Each parameter can be accessed with slot-value or with the corresponding accessor generated by defstruct.

(defmethod print-object ((model Encoder) stream)
     (format stream "[Seq2Seq Encoder which contains: ~a and ~a]"
                    (slot-value model 'embedding)
		    (encoder-layer model)))
		    
(print model)
;[Seq2Seq Encoder which contains: [Model: EMBEDDING] and [Model: RNN]]

For more APIs, see: document

deftrainer

See: document.
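
Roughly speaking, a trainer bundles a model, an optimizer, and a single training step. The sketch below is adapted from the README's MLP example; the slot names (:model, :optimizer, :optimizer-args, :step-model, :predict) and the zero-grad/update helpers are taken from there and should be checked against the linked document:

(deftrainer MLPTrainer (activation lr)
  :model          (MLP activation)
  :optimizer      cl-waffe.optimizers:Adam
  :optimizer-args (:lr lr)
  :step-model ((model x y)
               (zero-grad) ; reset accumulated gradients
               (let ((out (cl-waffe.nn:softmax-cross-entropy (call (model) x) y)))
                 (backward out) ; backpropagate through the loss
                 (update)       ; let the optimizer update the parameters
                 out))
  :predict ((model x) (call (model) x)))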

defoptimizer

See: document.
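
Roughly speaking, defoptimizer receives the model's parameters as a hash table and defines an :update step over them. The sketch below is an SGD-like illustration; MySGD is a hypothetical name, and the :-= instruction to !modify (by analogy with the :*= used later on this page) as well as the fixnum-indexed params table are assumptions to be checked against the document:

(defoptimizer MySGD (params &key (lr 1e-3))
  :parameters ((params params) (lr lr))
  :update (()
           ;; in place, for every parameter: param -= lr * grad(param)
           (dotimes (i (hash-table-count (self params)))
             (!modify (gethash i (self params)) :-=
                      (!mul (self lr) (grad (gethash i (self params))))))))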

defnode

defnode is a macro for defining a computation node itself; in contrast, defmodel defines calculations using operators that were themselves defined by defnode.

defnode requires both :forward and :backward to be defined.

cl-waffe's APIs do not have to be used at each step of :forward/:backward, as long as a WaffeTensor is returned.

For example, here is (!transpose1 ...) defined without using cl-waffe's APIs:

(defnode Transpose1Tensor (shape)
  :optimize t
  :parameters ((prev-shape nil) (shape shape))
  :forward ((x)
            ;; remember the input's shape for backward
            (setf (self prev-shape) (!shape x))
            ;; access the underlying CL array directly and transpose it
            (with-facet (array ((value x) 'array :direction :input))
              (sysconst (array-to-mat (numcl:transpose array)))))
  :backward ((dy)
             ;; transposing back restores the original shape
             (list (!transpose1 dy (self prev-shape)))))

(defun !transpose1 (x &rest result)
  (call (Transpose1Tensor (assure-tensor result)) (assure-tensor x)))

(setq a (!randn `(10 10)))
(setq a (!transpose1 a))
(print (cl-waffe::waffetensor-state a))
; [Node : TRANSPOSE1TENSOR]
; Backward created correctly.

mgl-mat provides a macro, with-facet (see the original repository), which can be used to access a mat's contents directly as a Common Lisp array, among other facets.
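
Here is a minimal standalone sketch, using only mgl-mat (make-mat and the 'array facet are part of mgl-mat's own API):

(let ((m (mgl-mat:make-mat '(2 2) :initial-element 1.0)))
  ;; view the mat's contents as a Common Lisp array, read-only
  (mgl-mat:with-facet (a (m 'array :direction :input))
    (aref a 0 0)))
; => 1.0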

As another example, here is dropout:

; An implementation of Inverted Dropout.
(defnode Dropout (&optional (dropout-rate 0.5))
  :optimize t
  :parameters ((dropout-rate
		(if (and (> dropout-rate 0.0)
			 (< dropout-rate 1.0))
		    dropout-rate
		    (error "cl-waffe.nn: Dropout(x), x must be in the range of 0.0<x<1.0 where x is a single-float.")))
	       (mask T))
  :forward ((x)
            (when (eql (self mask) T) ; is this the first call?
              (setf (self mask) (!zeros (!shape x))))

            (if *no-grad* ; predict mode: dropout is a no-op
                x
                (progn
                  ;; resample the binary mask, then rescale (inverted dropout)
                  (!modify (self mask) :bernoulli (self dropout-rate))
                  (!modify (!mul (self mask) x) :*= (/ 1 (- 1 (self dropout-rate)))))))

  :backward ((dy)
	     (list (!mul (self mask) dy))))

Tips: using T as the default value of a parameter is convenient, since cl-waffe's optimizer can detect discontinuities in the computation nodes.

For more APIs, see: document

defdataset

See: document.
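
Roughly speaking, a dataset defines how to fetch the batch at a given index (:next) and how many batches exist (:length). The sketch below mirrors the README's MNIST dataset; MyData is a hypothetical name, and !set-batch together with the exact slot names are assumptions drawn from there:

(defdataset MyData (train valid batch-size)
  :parameters ((train train) (valid valid) (batch-size batch-size))
  :next ((index)
         ;; return (input target) for the batch starting at index
         (list (!set-batch (self train) index (self batch-size))
               (!set-batch (self valid) index (self batch-size))))
  :length (()
           ;; total number of batches in the dataset
           (/ (!shape (self train) 0) (self batch-size))))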

Todo: parallelize it and reduce its memory usage.