The EsnTorch Library: Efficient Implementation of Transformer-Based Echo State Networks.

ICONIP (7) (2022)

Abstract
Transformer-based models have revolutionized NLP, but they are generally highly resource-consuming. Motivated by this consideration, several reservoir computing approaches to NLP have shown promising results. In this context, we propose EsnTorch, a library that implements echo state networks (ESNs) with transformer-based embeddings for text classification. EsnTorch is developed in PyTorch, optimized to work on GPU, and compatible with the transformers and datasets libraries from Hugging Face, the major data science platform for NLP. Accordingly, our library can make use of all the models and datasets available on Hugging Face. A transformer-based ESN implemented in EsnTorch consists of four building blocks: (1) an embedding layer, which uses a transformer-based model to embed the input texts; (2) a reservoir layer, which implements three kinds of reservoirs: recurrent, linear, or null; (3) a pooling layer, which offers three pooling strategies: mean, last, or None; (4) a learning algorithm block, which provides six different supervised learning algorithms. Overall, this work falls within the context of sustainable models for NLP.
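To make the four-block architecture concrete, the following is a minimal conceptual sketch of a transformer-based ESN with a recurrent reservoir and mean pooling, built directly on PyTorch and Hugging Face transformers. It does not use EsnTorch's actual API: the class name TransformerESN and the parameters reservoir_size and spectral_radius are illustrative assumptions, as is the choice of bert-base-uncased as the encoder.

```python
# Conceptual sketch of a transformer-based ESN (NOT EsnTorch's API).
# Assumptions: class/parameter names are illustrative; encoder is
# bert-base-uncased; reservoir is the recurrent variant; pooling is mean.
import torch
from transformers import AutoTokenizer, AutoModel

class TransformerESN(torch.nn.Module):
    def __init__(self, model_name="bert-base-uncased",
                 reservoir_size=500, spectral_radius=0.9):
        super().__init__()
        # (1) Embedding layer: a frozen transformer encoder.
        self.tokenizer = AutoTokenizer.from_pretrained(model_name)
        self.encoder = AutoModel.from_pretrained(model_name)
        for p in self.encoder.parameters():
            p.requires_grad = False
        dim = self.encoder.config.hidden_size
        # (2) Reservoir layer: fixed random input and recurrent weights,
        # rescaled so the spectral radius is < 1 (echo state property).
        W_in = torch.empty(reservoir_size, dim).uniform_(-0.5, 0.5)
        W = torch.empty(reservoir_size, reservoir_size).uniform_(-0.5, 0.5)
        W *= spectral_radius / torch.linalg.eigvals(W).abs().max()
        self.register_buffer("W_in", W_in)
        self.register_buffer("W", W)

    def forward(self, texts):
        batch = self.tokenizer(texts, padding=True, truncation=True,
                               return_tensors="pt")
        # Token embeddings: (batch, seq_len, dim)
        emb = self.encoder(**batch).last_hidden_state
        mask = batch["attention_mask"].unsqueeze(-1).float()
        # Drive the recurrent reservoir with the token embeddings.
        x = emb.new_zeros(emb.size(0), self.W.size(0))
        states = []
        for t in range(emb.size(1)):
            x = torch.tanh(emb[:, t] @ self.W_in.T + x @ self.W.T)
            states.append(x)
        states = torch.stack(states, dim=1)  # (batch, seq_len, reservoir)
        # (3) Pooling layer: mean over non-padded positions.
        return (states * mask).sum(1) / mask.sum(1)

# (4) Learning algorithm block: the pooled reservoir states can be fed
# to any cheap supervised learner, e.g. a ridge-regression readout.
```

Since the transformer and reservoir weights are both fixed, only the final readout is trained, which is what makes this family of models inexpensive compared to fine-tuning the transformer end to end.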
Keywords
reservoir computing, echo state networks, natural language processing (NLP), text classification, transformers, BERT, python library, Hugging Face