NeuralAttentionlib
Reusable functionality for defining custom attention/transformer layers. NeuralAttentionlib.jl aims to provide highly extensible and reusable functions for implementing attention variants, and will power Transformers.jl.
This document was generated with Documenter.jl version 0.27.19 on Tuesday 5 July 2022. Using Julia version 1.7.3.