ICDAR 2023 (hybrid) – the 4th workshop on Intelligent Cross-Data Analytics and Retrieval


Call for Papers

ICMR 2023: 12-15 June 2023, Thessaloniki, Greece

ICDAR 2023 (hybrid) – the 4th workshop on Intelligent Cross-Data Analytics and Retrieval

https://www.xdata.nict.jp/icdar_icmr2023/

Data plays a critical role in human life. In the digital era, where data can be collected almost anywhere and at any time, people have access to a vast volume of real-time data that reflects their living environment from different perspectives. From these data, people can extract the information they need to build knowledge and make wiser decisions.

Nevertheless, data often comes from multiple sources and reflects only a small part of the big puzzle of life. Even if some pieces are missing, the goal is to capture the puzzle's image with the pieces that are available: the more pieces of data we can collect and assemble within a given frame, the faster we can solve the puzzle. The challenge becomes even greater when dealing with multimodal data and with cross-domain and cross-platform problems. A multimodal data puzzle is one whose pieces have different shapes and sizes; a cross-domain puzzle is one whose pieces come from distinct sub-pictures; and a cross-platform puzzle is one whose pieces come from different puzzles altogether. In all of these scenarios, the pieces still have to be put together to reveal the entire picture.

The proposed research topic of "Intelligent Cross-Data Analysis and Retrieval" aims to advance the field of cross-data analytics and retrieval and contribute to developing a more intelligent and sustainable society. This workshop welcomes researchers from diverse domains and disciplines, including well-being, disaster prevention and mitigation, mobility, climate change, tourism, healthcare, and more.

Example topics include, but are not limited to:

- Event-based cross-data retrieval

- Data mining and AI technology

- Multimodal complex event processing

- Transfer learning and transformers

- Multimodal self-supervised learning

- Heterogeneous data association discovery

- Cross-datasets for repeatable experimentation

- Federated analytics and federated learning for cross-data

- Privacy-public data collaboration

- Diverse multimodal data integration

- Realization of a prosperous and independent region in which people and nature coexist

- Intelligent cross-data analysis applications from different domains

Paper submission:

- All papers must be formatted according to the ACM proceedings style.

- All technical content, including the main text, figures, and tables (but excluding references), must fit within the page limit: eight pages for regular papers and four pages for short papers.

- We reserve the right to reject over-length submissions outright on an administrative basis.

Important Dates:

- Feb. 28, 2023: Deadline for Paper Submission

- Mar. 31, 2023: Notification of Acceptance

- Apr. 20, 2023: Camera-Ready Paper Due

- TBD: Workshop Day

Organizers:

-  Guillaume Habault, KDDI Research Inc., Japan

-  Michael Alexander Riegler, Simula Metropolitan Center for Digital Engineering, Norway

-  Minh-Son Dao, National Institute of Information and Communications Technology (NICT), Japan

-  Duc-Tien Dang-Nguyen, University of Bergen, Norway

-  Yuta Nakashima, Osaka University, Japan

-  Cathal Gurrin, Dublin City University, Ireland

Contact:

https://www.xdata.nict.jp/icdar_icmr2023/

https://icmr2023.org/
