From a48e96564a7f92610ef9eed3f37b197db718983d Mon Sep 17 00:00:00 2001
From: Daniel Bevenius
Date: Tue, 21 May 2024 07:26:42 +0200
Subject: [PATCH] docs: update blas notes

Signed-off-by: Daniel Bevenius
---
 notes/blas.md | 9 +++++++--
 1 file changed, 7 insertions(+), 2 deletions(-)

diff --git a/notes/blas.md b/notes/blas.md
index 99a1f2c5..5496a662 100644
--- a/notes/blas.md
+++ b/notes/blas.md
@@ -29,12 +29,17 @@
 C = alpha * A * B + beta * C
 ```
 
 Where `A`, `B`, the input matrices that we want to multiply and and `C` is the
 resulting output matrix. `alpha` and `beta` are scalars and if we set them to
-1 and 0 respectively, we get the standard matrix multiplication.
+1 and 0 respectively, we get the standard matrix multiplication and initial
+values of C are ignored.
+
 But C might also be a non-zero matrix in which case beta will be applied before
 the addition of the result of the multiplication.
 
 Now, in the context of a neural network A might be the weights and B the
-incoming activations.
+incoming activations. Normally alpha and beta would be set to 1 and 0, but if
+there is a skip connection (residual connection) then beta would be set to 1
+allowing the new activations to be added to the previous activations.
+
 Example can be found in [gemm.c](../fundamentals/blas/openblas/src/gemm.c).