Basic Information
Bio
Souhail Bakkali (male) is a Ph.D. student at the L3i laboratory of La Rochelle University, where he started in December 2019. His thesis concerns the unified restructuring of heterogeneous, multi-modal content for interactive exploration. The approach consists in automatically extracting information from the content held in information systems (document scans, structured and unstructured information) and understanding the interactions between visual and textual data, in order to propose new methods for reorganizing the search space and to find a common representation space for the document image classification task. Because administrative documents are multimodal and intended for sequential reading, he is currently studying cross-modal learning for contextualized comprehension of document components. The aim is to leverage the multimodal information learned across language and vision in a cross-modal feature fusion methodology. During the first two years of his thesis at L3i, he published three papers on document image classification (one conference paper: ICIP 2020; one workshop paper: CVPRW 2020; and one journal paper: IJDAR 2021). He also published one workshop paper (CBDAR 2019) during his M2 internship at L3i.
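To illustrate the idea behind cross-modal feature fusion mentioned above, here is a minimal, hedged sketch: it assumes hypothetical pre-extracted visual and textual feature vectors (all names and dimensions are illustrative, not taken from the author's published methods) and combines them by simple L2-normalized concatenation, one common late-fusion baseline.

```python
import numpy as np

def fuse_features(visual: np.ndarray, textual: np.ndarray) -> np.ndarray:
    """Concatenate L2-normalized visual and textual feature vectors
    into a single cross-modal representation (simple late fusion)."""
    v = visual / (np.linalg.norm(visual) + 1e-8)
    t = textual / (np.linalg.norm(textual) + 1e-8)
    return np.concatenate([v, t])

# Hypothetical example: a 4-d visual embedding (e.g. from a CNN over the
# page image) and a 3-d textual embedding (e.g. from a language model
# over the OCR output of the same document).
visual_feat = np.array([3.0, 4.0, 0.0, 0.0])
text_feat = np.array([0.0, 1.0, 0.0])

fused = fuse_features(visual_feat, text_feat)
print(fused.shape)  # (7,)
```

In practice, published fusion methods are usually learned (e.g. attention over modality tokens) rather than plain concatenation; this sketch only shows the shared-representation idea.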
Papers (8)
- CoRR (2024). Cited: 0
- Mouhamed Amine Bouchiha, Quentin Telnoff, Souhail Bakkali, Ronan Champagnat, Mourad Rabah, Mickaël Coustaty, Yacine Ghamri-Doudane. CoRR (2024). Cited: 0
Data Disclaimer
The data on this page come from open Internet sources, cooperative publishers, and automatic analysis by AI technology. We make no commitments or guarantees regarding the validity, accuracy, correctness, reliability, completeness, or timeliness of the page data. If you have any questions, please contact us by email: report@aminer.cn