Basic Information
Bio
At MSRI, my research primarily focused on cross-lingual transfer in pretrained multilingual language models and on understanding the mechanisms behind in-context learning in transformers. In the past, I have also worked on analyzing the computational capabilities of Transformers and Recurrent Neural Networks by studying their behavior on several formal languages, with Dr. Navin Goyal, and on controlled text generation for syntactic paraphrasing, with Professor Partha Talukdar.
Research Interests
Papers (18)
CoRR (2024): 627-645
arXiv (2023)
CoRR (2023): 4232-4267
International Conference on Computational Linguistics (2022): 4320-4335
Proceedings of the First Workshop on Efficient Benchmarking in NLP (NLP Power 2022) (2022): 64-74
Data Disclaimer
The page data come from open Internet sources, cooperating publishers, and automatic analysis via AI technology. We make no commitments or guarantees regarding the validity, accuracy, correctness, reliability, completeness, or timeliness of the page data. If you have any questions, please contact us by email: report@aminer.cn