Package: attention
Title: Self-Attention Algorithm
Version: 0.4.0
Authors@R: 
    person("Bastiaan", "Quast", , "bquast@gmail.com", role = c("aut", "cre"),
           comment = c(ORCID = "0000-0002-2951-3577"))
Description: Helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth on how to construct it. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
License: GPL (>= 3)
Encoding: UTF-8
RoxygenNote: 7.2.3
Suggests: covr, knitr, rmarkdown, testthat (>= 3.0.0)
VignetteBuilder: knitr
Config/testthat/edition: 3
NeedsCompilation: no
Packaged: 2023-11-10 00:29:14 UTC; bquast
Author: Bastiaan Quast [aut, cre] (<https://orcid.org/0000-0002-2951-3577>)
Maintainer: Bastiaan Quast <bquast@gmail.com>
Repository: CRAN
Date/Publication: 2023-11-10 03:10:02 UTC
Built: R 4.3.0; ; 2023-11-10 05:10:59 UTC; unix
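
The Self-Attention algorithm this package demonstrates can be sketched in base R as follows. This is a minimal illustration of scaled dot-product self-attention following Vaswani et al. (2017); the function and variable names (`softmax`, `self_attention`, `W_Q`, `W_K`, `W_V`) are assumptions for illustration, not the package's exported API.

```r
# Numerically stable softmax over a vector
softmax <- function(x) {
  e <- exp(x - max(x))  # subtract the max to avoid overflow
  e / sum(e)
}

# Scaled dot-product self-attention for a single head:
# X is an (n_tokens x d_model) input matrix; W_Q, W_K, W_V project
# the input into queries, keys, and values.
self_attention <- function(X, W_Q, W_K, W_V) {
  Q <- X %*% W_Q                           # queries
  K <- X %*% W_K                           # keys
  V <- X %*% W_V                           # values
  scores  <- Q %*% t(K) / sqrt(ncol(K))    # scaled dot products
  weights <- t(apply(scores, 1, softmax))  # row-wise attention weights
  weights %*% V                            # weighted sum of values
}

# Example: 3 tokens, model dimension 4, identity projections
set.seed(1)
X   <- matrix(rnorm(12), nrow = 3, ncol = 4)
W   <- diag(4)
out <- self_attention(X, W, W, W)
```

The output has the same shape as the input (one attended vector per token), and each row of the attention-weight matrix sums to 1.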
