UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory.

CVPR 2024 (2024)

Cited 17 | Viewed 47

Keywords
Transfer Learning, Memory Efficiency, Pre-trained Network, Memory Consumption, Natural Language Processing Tasks, Intermediate Activity, Aggregation Module, Parallel Modules, Broad Application, Feature Maps, Feature Representation, Attention Mechanism, Trainable Parameters, Reduction Factor, Question Answering, Word Embedding, Domain Adaptation, Powerful Capability, Textual Features, Intermediate Features, Attention Layer, Multi-head Self-attention, Discriminative Representations, Transformer Layers, CNN Layers