Basic Information
Bio
Humans perceive the world in rich visual detail. In just a fraction of a second, we not only detect the objects and people in our environment, but also quickly recognize people’s emotions, goals, actions, and social interactions. Detecting these higher level properties is extremely challenging even for state-of-the-art computer vision systems. How do humans extract all of this complex information with such speed and ease? My research aims to answer this question using a combination of human neuroimaging, intracranial recordings, machine learning, and behavioral techniques. Before joining Johns Hopkins, I was a postdoctoral researcher at MIT and Harvard in the Center for Brains, Minds, and Machines working with Nancy Kanwisher and Gabriel Kreiman. I completed my PhD at MIT where I was advised by Tomaso Poggio.
Research Interests
Papers (71 total)
- Gunnar Blohm, Benjamin Peters, Ralf Haefner, Leyla Isik, Nikolaus Kriegeskorte, Jennifer S. Lieberman, Carlos R. Ponce, Gemma Roig, Megan A. K. Peters. arXiv (2024).
- Current Biology, no. 4 (2024): 931-933.
- Trends in Cognitive Sciences, no. 3 (2024): 195-196.
- Benjamin Peters, James J. DiCarlo, Todd Gureckis, Ralf Haefner, Leyla Isik, Joshua Tenenbaum, Talia Konkle, Thomas Naselaris, Kimberly Stachenfeld, Zenna Tavares, Doris Tsao, Ilker Yildirim. arXiv (2024).
- Trends in Cognitive Sciences, no. 5 (2024): 392-393.
- Journal of Vision, no. 9 (2023): 4628-4628.
- arXiv (Cornell University), pp. 347-357 (2023).
- Journal of Neuroscience, no. 45 (2023): 7700-7711.
Data Disclaimer
The page data come from open Internet sources, cooperating publishers, and automated AI analysis. We make no commitments or guarantees regarding the validity, accuracy, correctness, reliability, completeness, or timeliness of the page data. If you have any questions, please contact us by email: report@aminer.cn