Helper functions for the self-attention algorithm, along with demonstration vignettes of increasing depth that show how to construct self-attention step by step.
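As a taste of what the vignettes build up to, here is a minimal self-attention sketch in base R. Note this is an illustration only, not the package's exported API: the function and weight names (self_attention, W_q, W_k, W_v) are assumptions for the example.

```r
# Row-wise softmax helper (illustrative, not necessarily the package's own).
softmax <- function(x) exp(x) / sum(exp(x))

# Scaled dot-product self-attention: project the input X into queries,
# keys, and values, score queries against keys, and take a weighted
# sum of the values.
self_attention <- function(X, W_q, W_k, W_v) {
  Q <- X %*% W_q
  K <- X %*% W_k
  V <- X %*% W_v
  scores  <- Q %*% t(K) / sqrt(ncol(K))    # scale by sqrt of key dimension
  weights <- t(apply(scores, 1, softmax))  # each row sums to 1
  weights %*% V
}

set.seed(1)
X   <- matrix(rnorm(12), nrow = 3)   # 3 tokens, 4 features each
W_q <- matrix(rnorm(16), nrow = 4)
W_k <- matrix(rnorm(16), nrow = 4)
W_v <- matrix(rnorm(16), nrow = 4)
out <- self_attention(X, W_q, W_k, W_v)
dim(out)  # one output row per input token
```

The package's vignettes develop this same construction in more detail.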
The package can be installed from CRAN using:
install.packages('attention')

The development version, to be used at your peril, can be installed from GitHub using the remotes package.
if (!require('remotes')) install.packages('remotes')
remotes::install_github('bquast/attention')

Development takes place on the GitHub page.
https://github.com/bquast/attention
Bugs can be filed on the issues page on GitHub.
https://github.com/bquast/attention/issues