
Dingkun Long

Alibaba Tongyi Lab Algorithm Expert

He holds a master's degree from Beihang University (Beijing University of Aeronautics and Astronautics) and has long worked on research and applied deployment in natural language processing and information retrieval. His research interests include fundamental lexical analysis, information retrieval, and pre-trained language models, and he has published several papers at international conferences in these fields, including ACL, EMNLP, and SIGIR. He currently focuses on algorithms for retrieval-augmented generation (RAG) with large language models, covering both the development of core retrieval-augmentation modules and the exploration of high-performance end-to-end solutions. He leads the open-source GTE series of embedding models.

Topic

Core Module Development and Practice for the Large Model RAG Pipeline

Recently, the rise of large language models has fueled the popularity of development tools such as LangChain and LlamaIndex, and with them a development paradigm and downstream applications built on RAG. Developing and optimizing the core algorithmic modules of the RAG pipeline, such as embedding, ranking, and document-understanding models, has in turn become a hot topic. This talk shares Alibaba Tongyi Lab's explorations, ideas, and experience with these core technical modules of the RAG pipeline, including the development and practice of embedding and ranking models.
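As context for the modules the talk covers, here is a minimal, self-contained sketch of the two-stage retrieve-then-rerank pattern that embedding and ranking models implement. The vectors and scoring functions below are toy stand-ins, not the speaker's actual models: a real system would produce the vectors with an embedding model (such as the GTE series mentioned above) and rescore candidates with a cross-encoder reranker.

```python
import math

# Toy document "embeddings" plus their texts. In a real pipeline an
# embedding model would map each document to a dense vector.
DOCS = {
    "doc1": ([0.9, 0.1, 0.0], "RAG combines retrieval with generation"),
    "doc2": ([0.1, 0.8, 0.1], "Embedding models map text to vectors"),
    "doc3": ([0.0, 0.2, 0.9], "Rerankers score query document pairs"),
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    # Stage 1: dense retrieval — rank all documents by cosine
    # similarity between the query vector and document vectors.
    scored = [(cosine(query_vec, vec), doc_id, text)
              for doc_id, (vec, text) in DOCS.items()]
    return sorted(scored, reverse=True)[:k]

def rerank(query_terms, candidates):
    # Stage 2: rerank the small candidate set. Here a toy term-overlap
    # score stands in for a cross-encoder that jointly scores each
    # (query, document) pair.
    def overlap(text):
        return len(query_terms & set(text.lower().split()))
    return sorted(candidates, key=lambda c: overlap(c[2]), reverse=True)

candidates = retrieve([0.8, 0.2, 0.0], k=2)
top = rerank({"retrieval"}, candidates)
```

The design point the sketch illustrates: the embedding model must be cheap enough to score the whole corpus, while the reranker can be slower and more accurate because it only sees the top-k candidates.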

© boolan.com Boolan. All rights reserved.
